
UDACITY - System Integration Project


Programming a Real Self-Driving Car for the UDACITY Nanodegree

The Happy Crashers Team:

| Name | Email |
| --- | --- |
| Camilo Gordillo | camigord@gmail.com |
| Stefano Salati | stef.salati@gmail.com |
| Stefan Rademacher | rademacher@outlook.com |
| Thomas Grelier | masto.grelier@gmail.com |

Project description

The objective of this project was to write ROS nodes implementing the core functionality of an autonomous vehicle system: traffic light detection, waypoint planning, and control.

Control strategy

The car's behaviour is regulated by a finite state machine (FSM) with two states: normal driving and braking.

The car drives normally when no traffic light is detected within a range of 100 m or when the detected traffic light is green. This is achieved by setting the speed of all waypoints ahead to their default values.

As soon as a red traffic light is detected, the system computes the minimum braking distance to check whether braking is still possible. If so, a braking deceleration, which depends on the current speed and the distance to the traffic light, is applied, and the speeds of all waypoints between the car and the traffic light are set so that they gently decrease to zero at the light. The speeds of all waypoints beyond the traffic light are set to zero.

a_{brake} = \frac{v_{car}^2}{2 \, d_{light}}

v_{i} = \begin{cases} \sqrt{2 \, a_{brake} \, d_{i}} & \text{for waypoints before the traffic light} \\ 0 & \text{for waypoints beyond the traffic light} \end{cases}

where d_{i} is the distance from waypoint i to the traffic light.
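Expressed as code, a minimal sketch of this velocity profile (the function and variable names are illustrative, not the actual node code):

```python
import math

def braking_velocities(v_car, d_light, distances_to_light):
    """Target speeds for the waypoints ahead when braking for a red light.

    v_car: current car speed (m/s)
    d_light: distance from the car to the stop line (m)
    distances_to_light: distance from each waypoint to the stop line (m);
                        values <= 0 mean the waypoint lies beyond the light
    """
    a_brake = v_car ** 2 / (2.0 * d_light)  # constant deceleration, as in the formula above
    velocities = []
    for d_i in distances_to_light:
        if d_i > 0.0:
            velocities.append(math.sqrt(2.0 * a_brake * d_i))  # decreases to 0 at the light
        else:
            velocities.append(0.0)  # waypoints beyond the traffic light
    return velocities
```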

One scenario is worth noting: if a red light is detected and the car starts braking gently, but the light turns green while the car is still braking, the car switches back to normal driving and accelerates to its normal speed. This replicates the usual human behaviour of braking as soon as a red light is seen, rather than waiting until the last moment, and releasing the brake as soon as the light turns green.
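The state logic itself is small; a hypothetical sketch of the two-state FSM (state names and input predicates are assumptions, not the actual node code):

```python
DRIVING, BRAKING = 'driving', 'braking'

class BehaviourFSM:
    def __init__(self):
        self.state = DRIVING

    def update(self, red_light_ahead, braking_feasible):
        if self.state == DRIVING and red_light_ahead and braking_feasible:
            self.state = BRAKING   # red light within range and stoppable: start braking
        elif self.state == BRAKING and not red_light_ahead:
            self.state = DRIVING   # light turned green (or left range): resume driving
        return self.state
```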

Throttle and brake pedal are then controlled by a PID controller, tuned with the following parameters:

| Parameter | Value |
| --- | --- |
| VEL_PID_P | 0.8 |
| VEL_PID_I | 0.0001 |
| VEL_PID_D | 0.01 |
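A minimal sketch of such a velocity PID using these gains (illustrative; the project's controller presumably also clamps its output and handles resets):

```python
class PID:
    def __init__(self, kp=0.8, ki=0.0001, kd=0.01):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.last_error = None

    def step(self, error, dt):
        """error: target speed minus current speed; dt: time step in seconds."""
        self.integral += error * dt
        derivative = 0.0 if self.last_error is None else (error - self.last_error) / dt
        self.last_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# A positive output maps to throttle, a negative one to brake.
```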

Traffic Light Classification

We assume that at most one traffic light is in Carla's field of view at any given time, so the detection with the highest score is selected to identify the traffic light color. If no detection scores higher than 50%, the classifier returns "UNKNOWN". The detection itself is done using a convolutional neural network. We have used the TensorFlow Object Detection API together with detection models that are pre-trained on the COCO dataset; specifically, we have used two different models, "ssd_inception_v2_coco" and "faster_rcnn_inception_v2_coco".
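A sketch of the selection logic (the array names follow the Object Detection API's output conventions; the label-map IDs are assumptions):

```python
import numpy as np

SCORE_THRESHOLD = 0.5
LABEL_MAP = {1: 'Green', 2: 'Red', 3: 'Yellow', 4: 'Unknown'}  # assumed class IDs

def pick_light_state(scores, classes):
    """Return the color of the single best detection, or UNKNOWN.

    scores, classes: the detection_scores / detection_classes arrays
    produced by the detection network for one camera image.
    """
    best = int(np.argmax(scores))
    if scores[best] < SCORE_THRESHOLD:
        return 'UNKNOWN'
    return LABEL_MAP[int(classes[best])]
```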

Dataset

The dataset used for training the classifier consists of:

  • 280 images from the Udacity Simulator
  • 710 images from the training bag that was recorded on the Udacity self-driving car

All images of the dataset were labeled manually. The dataset then had to be converted into the TFRecord format.
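For reference, a sketch of such a conversion (the feature keys follow the TensorFlow Object Detection API's TFRecord conventions; the helper signature is an assumption):

```python
import tensorflow as tf  # TF 1.x era API

def create_tf_example(encoded_jpeg, width, height, boxes, class_ids, class_texts):
    """One labeled image -> tf.train.Example. boxes holds normalized
    (xmin, ymin, xmax, ymax) tuples, one per traffic light in the image."""
    def bytes_feature(values):
        return tf.train.Feature(bytes_list=tf.train.BytesList(value=values))
    def floats_feature(values):
        return tf.train.Feature(float_list=tf.train.FloatList(value=values))
    def int64_feature(values):
        return tf.train.Feature(int64_list=tf.train.Int64List(value=values))

    feature = {
        'image/encoded': bytes_feature([encoded_jpeg]),
        'image/format': bytes_feature([b'jpeg']),
        'image/width': int64_feature([width]),
        'image/height': int64_feature([height]),
        'image/object/bbox/xmin': floats_feature([b[0] for b in boxes]),
        'image/object/bbox/ymin': floats_feature([b[1] for b in boxes]),
        'image/object/bbox/xmax': floats_feature([b[2] for b in boxes]),
        'image/object/bbox/ymax': floats_feature([b[3] for b in boxes]),
        'image/object/class/label': int64_feature(class_ids),
        'image/object/class/text': bytes_feature(class_texts),
    }
    return tf.train.Example(features=tf.train.Features(feature=feature))

# with tf.python_io.TFRecordWriter('train.record') as writer:
#     writer.write(create_tf_example(...).SerializeToString())  # for each image
```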

Training

We have trained two classifiers: one for the simulator and one for the real world. Training was done using the AWS Deep Learning AMI and the following configuration parameters:

  • num_classes: 4 ('Green','Red','Yellow','Unknown')
  • num_steps: 10000
  • max_detections_per_class: 10

All other parameters were taken from the sample configuration files provided by the TensorFlow team (https://github.com/tensorflow/models/tree/master/research/object_detection/samples/configs).
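As an illustration, these values live in the pipeline config and could be inspected or overridden with the API's config utilities (a sketch, assuming an SSD pipeline file; the file path is a placeholder):

```python
from object_detection.utils import config_util

configs = config_util.get_configs_from_pipeline_file('ssd_inception_v2_coco.config')
configs['model'].ssd.num_classes = 4        # Green, Red, Yellow, Unknown
configs['train_config'].num_steps = 10000   # training steps
nms = configs['model'].ssd.post_processing.batch_non_max_suppression
nms.max_detections_per_class = 10
```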

Validation

We then validated the trained classifiers on 20 previously unseen images. Both classifiers (for the simulator and the real world) correctly detected and classified all traffic lights in the test images.

Installation

Native Installation

Docker Installation

[Install Docker](https://docs.docker.com/engine/installation/)

Build the docker container

docker build . -t capstone

Run the docker file

docker run -p 4567:4567 -v $PWD:/capstone -v /tmp/log:/root/.ros/ --rm -it capstone

Port Forwarding

To set up port forwarding, please refer to the [instructions from term 2](https://classroom.udacity.com/nanodegrees/nd013/parts/40f38239-66b6-46ec-ae68-03afd8a601c8/modules/0949fca6-b379-42af-a919-ee50aa304e6a/lessons/f758c44c-5e40-4e01-93b5-1a82aa4e044f/concepts/16cf4a78-4fc7-49e1-8621-3450ca938b77)

Usage

1. Clone the project repository

git clone https://github.com/camigord/System-Integration-Project

2. Install python dependencies

cd System-Integration-Project
pip install -r requirements.txt

3. Build project

cd ros
catkin_make
source devel/setup.sh

Running

Simulator

Two simulator models are available: an SSD and a Faster R-CNN. The simulation works well with both of them.

  • To launch the project with the SSD classifier, type the following command. This model is also used if no model parameter is specified or if the original launch file (which does not specify this parameter) is used.
roslaunch launch/styx.launch model:='frozen_inference_graph_simulation_ssd.pb'
  • To launch the project with the Faster R-CNN classifier, type the following command:
roslaunch launch/styx.launch model:='frozen_inference_graph_simulation_rcnn.pb'

Then launch the simulator.

Real world testing

1. Download the [training bag](https://s3-us-west-1.amazonaws.com/udacity-selfdrivingcar/traffic_light_bag_file.zip) that was recorded on the Udacity self-driving car

2. Unzip the file

unzip traffic_light_bag_file.zip

3. Play the bag file

rosbag play -l traffic_light_bag_file/traffic_light_training.bag

4. Launch project in site mode

cd System-Integration-Project/ros
roslaunch launch/site.launch model:='frozen_inference_graph_real.pb'
