Carputer

Carputer is a 1/10th-scale self-driving car. It drives using a neural network: the camera image and speedometer reading are fed to the network, which outputs steering and throttle motor controls for the car.

Since the car uses a neural network for driving, it has to be trained. Training amounts to driving the car around a track about 20 times under radio control, recording all of the radio control inputs along with the video and speedometer data. From those recordings, the network learns to mimic our driving style, outputting steering and throttle from the video and speedometer alone.
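The paragraph above can be sketched as data structures: each recorded sample pairs the sensor inputs (camera frame, speed) with the human driver's controls (steering, throttle), and the network is trained to reproduce the controls from the inputs. The names here (`DrivingSample`, `record_sample`) are illustrative, not the repo's actual API.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class DrivingSample:
    frame: bytes        # raw camera image for one timestep
    speed: float        # speedometer reading
    steering: float     # human steering input (training label)
    throttle: float     # human throttle input (training label)

def record_sample(log: List[DrivingSample], frame: bytes,
                  speed: float, steering: float, throttle: float) -> None:
    """Append one (inputs, labels) pair to the training log."""
    log.append(DrivingSample(frame, speed, steering, throttle))

log: List[DrivingSample] = []
record_sample(log, b"<jpeg bytes>", speed=3.2, steering=0.1, throttle=0.5)
```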

The process is to record, train, and then run autonomously, as seen in the steps below.

This is an experimental setup, so it's not super-clean code or hardware.

Recording pipeline

  1. Turn on the ESC and RC controller. Plug in the battery and USB. Set the start switch to off.
  2. Run InsomniaX and disable lid sleep and idle sleep.
  3. Activate the virtualenv: source /path/to/venv/bin/activate
  4. Run the drive-and-record script: python main_car.py record -- this gives you manual control of the car and saves out recordings when you flip the switch.
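A minimal sketch (not the repo's actual code) of the record-mode behavior in step 4: the driver always has manual control, and samples are written out only while the record switch is flipped on.

```python
def record_step(switch_on: bool, sample, saved: list) -> None:
    """Save the current sample only when the record switch is on."""
    if switch_on:
        saved.append(sample)

saved = []
# Simulated drive: the switch is on only for the middle two timesteps.
for switch, sample in [(False, "s0"), (True, "s1"), (True, "s2"), (False, "s3")]:
    record_step(switch, sample, saved)
# saved now holds only the samples captured while the switch was on: s1, s2
```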

Run autonomously

  1. Turn on the ESC and RC controller. Plug in the battery and USB. Set the start switch to off.
  2. Run InsomniaX and disable lid sleep and idle sleep.
  3. Activate the virtualenv: source /path/to/venv/bin/activate
  4. Run the script that lets TensorFlow drive: python main_car.py tf -- when you flip the switch, you lose manual control and the car attempts to drive on its own.
  5. Autonomous kill switch: pull the throttle and turn the steering wheel.
  6. To revive autonomous mode, hit the channel 3 button (near the trigger).
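A hedged sketch of the mode logic implied by steps 4-6 (not the actual main_car.py implementation): flipping the switch hands control to the network, pulling the throttle while turning the wheel kills autonomy, and the channel 3 button revives it.

```python
def next_mode(mode: str, throttle_pulled: bool,
              wheel_turned: bool, ch3_pressed: bool) -> str:
    """Return the next driving mode, 'auto' or 'manual'."""
    if mode == "auto" and throttle_pulled and wheel_turned:
        return "manual"          # kill switch: human takes over
    if mode == "manual" and ch3_pressed:
        return "auto"            # channel 3 revives autonomous mode
    return mode

mode = "auto"                    # after flipping the start switch
mode = next_mode(mode, throttle_pulled=True, wheel_turned=True, ch3_pressed=False)
assert mode == "manual"          # kill switch engaged
mode = next_mode(mode, throttle_pulled=False, wheel_turned=False, ch3_pressed=True)
assert mode == "auto"            # revived via channel 3
```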

Training pipeline

  1. Convert TRAINING images to NumPy arrays: python NeuralNet/filemash.py /path/to/data (can be multiple paths)
  2. Convert TEST images to NumPy arrays: python NeuralNet/filemash.py /path/to/data --gen_test (can be multiple paths)
  3. Train a model: python NeuralNet/convnet02.py. Train for a minimum of 1,500 iterations, ideally around 5,000.
  4. Use this model to drive the car (see above).
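Conceptually, the conversion in steps 1-2 reads each recorded image, decodes it to a pixel array, and stacks the frames into one dataset the trainer can load quickly. The sketch below illustrates that idea only; `decode_image` and the frame layout are hypothetical stand-ins, not what filemash.py actually does.

```python
from typing import List

def decode_image(raw: bytes, width: int, height: int) -> List[int]:
    """Stand-in decoder: treat each byte as one grayscale pixel."""
    pixels = list(raw)
    assert len(pixels) == width * height
    return pixels

def stack_frames(raws: List[bytes], width: int, height: int) -> List[List[int]]:
    """Decode every recorded frame and stack them into one dataset."""
    return [decode_image(r, width, height) for r in raws]

# Three identical 2x2 dummy frames -> a 3 x 4 dataset (frames x pixels).
frames = stack_frames([bytes([0, 255, 128, 64])] * 3, width=2, height=2)
```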

Analysis

  • For training info, see debug.html -- reminder: < 7 is right, > 7 is left
  • Run analysis/make_video.py to generate debug videos
  • Use analysis/plot_vs_time.py to view telemetry data
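A small helper illustrating the debug.html convention noted above: steering values below 7 mean "right" and values above 7 mean "left". Treating exactly 7 as neutral is an assumption, not something the README states.

```python
def steering_direction(value: float) -> str:
    """Decode a logged steering value per the debug.html convention."""
    if value < 7:
        return "right"
    if value > 7:
        return "left"
    return "neutral"   # assumption for the midpoint

assert steering_direction(5) == "right"
assert steering_direction(9) == "left"
```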

Hardware TODOs

  • Fix the radio control dropped signal error
  • Get the TX1 working
  • Get the IMU recording data

Software TODOs

  • Fix keepalive on Arduino
  • Look into remote SSH type software so we don't have to keep popping open the car.

Hardware setup

Update: we no longer use the IMUs, and we are no longer trying to run the NVIDIA TX1 computer; a MacBook works better. See the wiring diagram (CarDiagram.jpg).

Simulator

Work in progress, of course. The simulator runs in Unity. The lighting files aren't checked in because they are big, but if you build lighting, it should look like the screenshot in warehouse_sim.jpg.
