Motion control with a PID controller and the Twiddle algorithm to find
the PID parameters (CS-373 Unit 5) on a real robot.

In this example we illustrate the application of a PID controller to
automatically drive the robot as close as possible to an ideal
straight-line trajectory.

As a robotics platform, this example uses the hardware and software we
are developing for our Veter project: http://veterobot.com .
The corresponding sources are available here:
https://github.com/andreynech/udacity-cs373

In the original CS-373 Unit 5 examples, the goal was to drive the
simulated robot as close as possible to the X axis. The vertical
deviation from the X axis (the Y coordinate) was treated as the error
and used as input for the PID controller. The Twiddle algorithm was
introduced as a way to find the P, I and D parameters.

In the real-robot case there are three main problems:

1. What is the desired (ideal) path?

2. How to measure the deviation from it?

3. How to implement the Twiddle algorithm on a real robot in such a
way that a reasonable amount of time (for example, less than 30
minutes) is required to discover good PID parameters. It should be
possible to conduct the experiment within a limited space (for
example, a typical office room) without hitting walls and without
manual intervention during the run.

To address these problems, we decided to rely on the available
on-board camera. The idea is to place easily distinguishable markers
(black circles on a white background) on the opposite walls of the
room. We use the Python interface to the OpenCV library to detect the
markers; in particular, the cv.HoughCircles() function to detect
circles. To improve detection performance, before applying
HoughCircles the original image is converted to gray-scale, then the
Canny edge detection algorithm is executed, followed by Gaussian
smoothing (a sketch of this pipeline is shown after the list below).
The result of these steps is the location of the center and the radius
of the circle in the camera's field of view. This information is used
to:

1. Define the "ideal" path. The goal for the robot is to drive
directly towards the circle center with as little deviation as
possible. This is what we consider the ideal straight-line trajectory.

2. Measure the deviation as the horizontal distance between the
detected circle center and the middle (320 pixels) of the camera frame
(a 640x480 RGBA bitmap). The goal is to control the robot in such a
way that the detected circle center is permanently kept in the middle
of the frame.

3. Run the Twiddle algorithm, implemented (copy/pasted) exactly as in
the original Unit 5 assignments. To prevent the robot from driving
away completely when trying bad PID parameters, we use the following
strategy (see the control-loop sketch after this list). First, we try
to drive towards the detected marker with the current PID parameters.
For each camera frame we measure the error as the deviation of the
circle center from the middle of the frame. If the circle goes out of
the camera's field of view, then no circle is detected and we set the
detected radius to zero. At each iteration we react to this condition
by stopping the PID controller and starting an on-place rotation until
the marker comes back into the field of view. Then we switch back to
PID-controlled movement towards the marker. As we approach the marker,
the radius of the circle grows. We define a limit of 100 pixels as an
indication that we are close to the wall where the marker is placed.
As soon as we reach the 100-pixel radius limit, we again start the
on-place rotation sequence until we detect a marker with a visible
radius of less than 100 pixels. This marker will be the one on the
opposite wall. After that, we continue PID-controlled movement towards
the newly detected marker.
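
For illustration, the detection pipeline described above could look
roughly as follows. This sketch uses the newer cv2 bindings for
brevity (the project itself works through the older cv interface), and
the thresholds and Hough parameters are placeholders rather than the
values used in driver-console.py:

  import cv2

  def detect_marker(frame_rgba):
      """Return (x, y, radius) of the detected circle, or (0, 0, 0)."""
      gray = cv2.cvtColor(frame_rgba, cv2.COLOR_RGBA2GRAY)
      edges = cv2.Canny(gray, 50, 150)               # Canny edge detection
      smoothed = cv2.GaussianBlur(edges, (9, 9), 2)  # Gaussian smoothing
      circles = cv2.HoughCircles(smoothed, cv2.HOUGH_GRADIENT,
                                 dp=2, minDist=100,
                                 param1=100, param2=60,
                                 minRadius=5, maxRadius=240)
      if circles is None:
          return 0, 0, 0          # marker is out of the field of view
      x, y, r = circles[0][0]
      return int(x), int(y), int(r)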

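The per-frame control logic from item 3 can then be sketched as shown
below. The robot object with drive() and rotate_in_place() methods is
a hypothetical stand-in for the actual actuator interface defined in
actuators.ice, the pid object simply carries the gains and the error
history, and pid_step()/detect_marker() are the sketches above:

  FRAME_MIDDLE = 320      # half of the 640 pixel frame width
  NEAR_WALL_RADIUS = 100  # a radius this large means we are near the wall

  def on_frame(frame, pid, robot):
      x, y, radius = detect_marker(frame)

      if radius == 0 or radius >= NEAR_WALL_RADIUS:
          # Either the marker is lost or we are close to the wall: rotate
          # in place until a marker with a radius below 100 pixels is
          # seen (when close to the wall this will be the marker on the
          # opposite wall).
          robot.rotate_in_place()
          return

      # PID-controlled drive: the error is the horizontal deviation of
      # the circle center from the middle of the frame.
      error = x - FRAME_MIDDLE
      steering, pid.prev_error, pid.int_error = pid_step(
          error, pid.prev_error, pid.int_error, pid.Kp, pid.Ki, pid.Kd)
      robot.drive(steering)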

We have uploaded two videos related to this example to YouTube:

* The first one is http://youtu.be/Qb91radWx6s . This video
  illustrates how the robot behaves at the early stages of the Twiddle
  algorithm. The small picture-in-picture shows the view from the
  "driver cockpit" together with the image processing results. The
  robot movements are not smooth; sometimes the robot heads in a
  completely wrong direction, which takes the marker out of the
  camera's field of view. This in turn results in the robot rotating
  to find the next marker. Such behavior is normal for the Twiddle
  algorithm because it tries different parameters, some of which are
  not feasible.

* The second one is http://youtu.be/bSEoQOExiMw . This video
  illustrates the final run after the Twiddle algorithm has found good
  PID parameters. It can be seen that the robot moves straight towards
  the marker, does not lose it, and properly reacts to small
  deviations.