
#Elcano Technical Design

#Preface

#Introduction

##Scope

The Elcano Project is developing low-cost hardware and software kits to convert any vehicle to self-drive. We are concentrating on recumbent tricycles, since that produces a real people mover at a total cost of under $5000. Our kits could also be used in full-sized cars or toy RC cars.

The kits are based on Arduino microcontrollers, and allow you to connect them robustly into a compact, low power package. The open source software is written in C++ and can run on many other platforms. The basic package uses several processors:

  • C2 – Dual control: low level vehicle control, either from the driver or the AI.
  • C3 – Pilot: Detects obstacles and feeds settings for the next path segment to C2.
  • C4 – Path Planner: Computes the best route from current location to destination.
  • C5 – Obstacle detection from sonars.
  • C6 – Navigator: Reads GPS, INU, Odometer, Compass etc. to get best position estimate.
  • C7 – Vision: Locates certain features of interest (Raspberry Pi).

Processing is supported by a set of custom circuit boards:

  • MegaShieldDB – Plugs into C2 and brings out signals on robust cables, making the wheels spin and steer.
  • MegaShieldTrio – Houses C3, C6, INU and GPS.
  • Sonar – Houses eight sonar range finders.
  • RC – Interface to a standard model airplane/car radio controller; passes signals to C2.

The low level processor is C2, which is concerned with the specific vehicle. Other processors form the high-level Artificial Intelligence.

The MegaShieldTrio board originally held three processors. However, the C4 mapping software has outgrown the original Arduino Micro and is now hosted on an Arduino Mega. The standard configuration uses Arduino Megas for C6 and C4, and Arduino Micros for C3 and C5. The Mega ADK is a USB host and will interface to an Android smartphone. The main communications link between processors is serial.

The C7 vision processor may be any suitable platform (such as a Raspberry Pi), and communicates with C6 over USB at low bandwidth. The link is restricted to text; images stay on the C7 processor. The C7 processor could be quite extensive, and could incorporate V2V, I2V, radar and lidar. The system will work without any C7 processor.

###Goals

  1. Use open source code and standardized hardware design (https://github.com/elcano/elcano) to routinely build vehicles capable of running Robo-Magellan (http://www.robothon.org/robothon/robo-magellan.php) at speeds of 20-30 mph (about 30-50 km/h).
  2. Market and support a hardware and software solution for low-cost automated vehicles.
  3. When several vehicles have been built, demonstrate communication and control systems capable of operating the vehicles as a personal rapid transit (PRT).
  4. Start a commercial company to build high quality urban transportation systems based on the hobbyist proof-of-concept.
  5. Enable a post-automotive urban transportation system based on vehicles that weigh less than the riders and are powered by renewable energy.

#System Overview

Over 90% of traffic accidents are caused by driver error; the safety potential of self-drive is well understood. When traffic accidents become rare, a motorcycle is almost as safe as an SUV. Vehicle weights could fall to the point that pod-cars weighing less than the riders are the preferred choice in the city. Since 65% of U.S. vehicle miles traveled (VMT) are urban, the ramifications are enormous. An aerodynamic ultra-light vehicle that avoids stop-and-go needs only one-tenth the energy of an automobile; a 25 pound battery would suffice. Light batteries can be easily swapped, eliminating range anxiety. A bank of batteries can be recharged when the wind blows and the sun shines. Fossil fuel demand, pollution and greenhouse gas production could fall dramatically.

For most people, transportation automation is rocket science.  The Elcano Project aims to make self-drive real for students and hobbyists, and build a popular demand to go ahead with traffic automation. The technology is here; laws and policies to take advantage of it are not.

An isolated autonomous car can improve safety, but the other benefits require choreographing road users; when done right, highway capacity triples and congestion mostly disappears. If manual and automated traffic were allowed to mix, the manually driven cars would snarl up the automated lane; thus there need to be separate lanes. A lane set apart for automated vehicles looks a lot like Personal Rapid Transit (PRT), a technology that has been around for more than 40 years. Today PRT systems are in operation; other automated road systems are only at the testing phase.

When an automated vehicle is in a reserved lane, the sensors get simpler and less expensive — no need for lidar, radar or extensive machine vision. The Elcano Project provides a blueprint for building your own experimental automated vehicle using electronics and sensors costing under $2000. A tricycle with an electric helper motor under 750 watts and a top speed under 20 mph is legally a bicycle, and thus street-legal without license, registration or insurance.

##System Characteristics

The Elcano Project is developing low-cost hardware and software kits to convert any vehicle to self-drive. We are concentrating on recumbent tricycles, since that produces a real people mover at a total cost of under $5000. Our kits could also be used in full-sized cars or toy RC cars.

The kits are based on Arduino microcontrollers, and allow you to connect them robustly into a compact, low power package. The basic package uses several processors:

  • C2 – Dual control: low level vehicle control, either from the driver or the AI.
  • C3 – Pilot: Finds path around obstacles and feeds next path settings to C2.
  • C4 – Path Planner: Computes the best route from current location to destination.
  • C5 – Obstacle detector.
  • C6 – Navigator: Reads GPS, INU, Odometer, etc. to get best position estimate.
  • C7 – Vision: Locates certain features of interest.
  • C8 – Simulator interface: Replaces sensor inputs with outputs of PC simulator.

Processing is supported by a set of custom circuit boards:

  • MegaShieldDB – Plugs into C2 and brings out signals on robust cables, making the wheels spin and steer.
  • MegaShieldTrio – Houses C3, C4, C6, INU and GPS. On version 2, there will be only two processors.
  • Sonar – Houses eight sonar range finders, covering 180 degrees, plus rear.
  • E-stop – Remote E-stop and joystick for operator control.

The heart of the system is the flexible MegaShieldTrio board. Version 1 used an Arduino Mega ADK for C6, and Arduino Nanos for C4 and C3. In version 2, C6 and C4 are Arduino Megas and C3 is an Arduino Mini. The Mega ADK is a USB host and will interface to an Android smartphone. Shields can be plugged into C6 and C4. The MegaShieldTrio has plugs for odometers and two sonar boards. The main communications link between processors is serial, and is supported by the Elcano_Serial library.

The C7 vision processor may be any suitable platform, and communicates with C6 over USB. The C7 processor could be quite extensive, and incorporate V2V, I2V, radar and Lidar. The system will work without any C7 processor.

A PC based simulator is a future option. It could use USB to communicate with C8; C8 then generates all the sensor signals for the AI. These include a simulated video camera to C7. C8 also supplies the signals that would come from GPS, INU, Odometry and Sonars.

##System Architecture

###Actuators

A1: Traction (Drive) Motor. Controls motor on rear wheel.
Input: Signal from motor controller, which is a standard electric-bike controller. This controller had been referred to as C1, but it has no available code.

A2: Brake Motor. Controls disk brakes on the left and right front wheels. The brakes are mechanically linked to operate in tandem. For fail-safe operation it is desirable that the brakes be normally ON, with an active signal required to release them. Because of the mechanical construction of bicycle brakes, actual brake behavior is normally OFF. Input: PWM signal from the low-level controller (C2)

A3: Steering Motor. Turns the front wheels left or right. If there is no signal, the wheels should be locked in the straight-ahead position.
Input: PWM Signal from low-level controller (C2)
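As a rough illustration of these actuator interfaces, the sketch below drives hypothetical throttle, brake and steering outputs with the standard Arduino PWM calls. The pin numbers and duty cycles are assumptions for illustration only, not the actual MegaShieldDB wiring.

```cpp
// Illustrative C2 outputs to A1-A3 on an Arduino Mega.
// Pin numbers and duty cycles are assumptions, not the MegaShieldDB wiring.
const int THROTTLE_PIN = 5;  // A1: PWM, smoothed to an analog throttle voltage
const int BRAKE_PIN    = 6;  // A2: PWM to the brake motor
const int STEER_PIN    = 7;  // A3: PWM to the steering motor

void setup() {
  pinMode(THROTTLE_PIN, OUTPUT);
  pinMode(BRAKE_PIN, OUTPUT);
  pinMode(STEER_PIN, OUTPUT);
}

void loop() {
  analogWrite(THROTTLE_PIN, 60);  // modest throttle (0-255 duty cycle)
  analogWrite(BRAKE_PIN, 0);      // brakes not applied (behavior is normally OFF)
  analogWrite(STEER_PIN, 128);    // mid-range duty cycle, roughly straight ahead
  delay(50);
}
```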

###Sensors

Most sensors go to the Navigator module (C6). Sensor output may be mediated by a dedicated controller before being sent to the Navigator.

S1: Wheel Odometry. A magnetic pickup sends a signal for each revolution of the wheel. If finer resolution is required, there may be several magnets on the rim. Goes to C2 and C6. In version 1, processing was done by C6. In version 2, processing is done by C2 and sent over serial to C6.
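A minimal sketch of the wheel-click idea follows, assuming one magnet per wheel, a hypothetical interrupt pin and an illustrative wheel circumference; the real C2/C6 code differs in detail.

```cpp
// Illustrative wheel-click counter for S1; the interrupt pin, magnet count
// and wheel circumference are assumptions.
const byte ODOMETER_PIN = 2;                 // interrupt-capable pin
const int  MAGNETS_PER_WHEEL = 1;
const float WHEEL_CIRCUMFERENCE_CM = 159.6;  // e.g. a 20 inch wheel

volatile unsigned long clickCount = 0;

void odometerClick() {
  clickCount++;   // one click each time a magnet passes the pickup
}

void setup() {
  Serial.begin(115200);
  pinMode(ODOMETER_PIN, INPUT_PULLUP);
  attachInterrupt(digitalPinToInterrupt(ODOMETER_PIN), odometerClick, FALLING);
}

void loop() {
  // Once per second, convert the accumulated clicks to speed in cm/s.
  noInterrupts();
  unsigned long clicks = clickCount;
  clickCount = 0;
  interrupts();
  float speed_cm_s = (clicks * WHEEL_CIRCUMFERENCE_CM) / MAGNETS_PER_WHEEL;
  Serial.println(speed_cm_s);   // in version 2 this value would go to C6 over serial
  delay(1000);
}
```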

S2: Angle of front wheels. Goes to C2.

S3: Digital Compass.

S4: Smart camera. The C7 Visual Data Management processor and its camera reduce the video stream to a few useful items, including lane following, cone recognition and obstacle avoidance.

S5: Commanded and planned speeds and steering angles from Pilot (C3).

S7: GPS. GPS is not accurate enough to be the primary position indicator. GPS will sometimes be unavailable. The navigation system must be designed to function for extended periods of GPS unavailability.

S8: Landmark recognition. Certain landmarks may be provided to aid navigation. This function may be provided by the Smart Camera (S4) or may be a separate camera.

S9: IMU. Inertial Measurement Unit, also referred to in this document as the Inertial Navigation Unit (INU).

S10: Brake feedback. Not implemented.

S11: Proximity sensors. C5 has an array of sonar range finders.

###Input Files

The 2007 DARPA Urban Challenge used RNDF and MDF files. Elcano files are similar.

RNDF (Route Network Definition File). This is a digital map of all roads in the area where the vehicle will be operating. The camera will locate road edges or lane markers and the vehicle will follow them. Thus location can be determined primarily from odometry.

MDF (Mission Definition File). These are latitudes and longitudes that the vehicle is required to visit. MDF waypoints may include approximate cone positions and final destination.

Initial position. Not yet implemented. Specifies the starting location and orientation. Velocity is zero. If this is a file, it is read by C4 (Path Planner) and passed to C6 (Navigator). If it is user input, it is read by C6 (Navigator).
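Since the RNDF and MDF are plain text files on an SD card, reading one can be as simple as the sketch below, which uses the standard Arduino SD library. The chip-select pin and the file name MDF.TXT are assumptions, not the project's actual values.

```cpp
// Sketch of reading a mission file from the SD card with the standard
// Arduino SD library; chip-select pin and file name are assumptions.
#include <SPI.h>
#include <SD.h>

const int SD_CHIP_SELECT = 53;   // assumed chip-select pin

void setup() {
  Serial.begin(115200);
  if (!SD.begin(SD_CHIP_SELECT)) {
    Serial.println("SD card initialization failed");
    return;
  }
  File mdf = SD.open("MDF.TXT");
  if (mdf) {
    while (mdf.available()) {
      String line = mdf.readStringUntil('\n');  // one waypoint per line
      Serial.println(line);                     // here C4 would parse the waypoint
    }
    mdf.close();
  }
}

void loop() {}
```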

###Controller Modules

Each module conceptually runs on its own microcontroller. An implementation may combine two or more modules on the same microcontroller. There is no operating system, file system or disks. Modules may communicate with a host computer for software download.

C1: Electric Bike Controller. Input: analog throttle voltage from C2. Output: power to the hub motor.

C2: Low Level Controls. Inputs: from C3, a serial command giving desired wheel speed and turn angle; from the Radio Control (RC) unit, pulse-width signals giving speed, turn, e-stop or other information; from the joystick, driver Drive, Brake and Steer, with switches telling whether or not to use cruise control. Outputs: to A1 (analog to the drive motor), A2 (PWM to the brake motor), and A3 (PWM to the steering motor).
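One hedged sketch of how C2 could read a single RC channel with pulseIn() and map it to a PWM throttle value; the pin number and the 1000-2000 microsecond pulse range are typical hobby-RC assumptions rather than measured Elcano values.

```cpp
// Hypothetical reading of one RC channel on C2. The pin number and the
// 1000-2000 microsecond pulse range are typical hobby-RC assumptions.
const int RC_SPEED_PIN = 48;

void setup() {
  Serial.begin(115200);
  pinMode(RC_SPEED_PIN, INPUT);
}

void loop() {
  // Measure the width of one HIGH pulse; give up after 25 ms.
  unsigned long pulse_us = pulseIn(RC_SPEED_PIN, HIGH, 25000UL);
  if (pulse_us > 0) {
    // Map the pulse width to a 0-255 throttle value for the drive output.
    int throttle = constrain(map(pulse_us, 1000, 2000, 0, 255), 0, 255);
    Serial.println(throttle);  // C2 would arbitrate this against C3 and joystick inputs
  }
}
```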

C3: Pilot. Output: to C2: Serial command giving desired wheel speed and turn angle. Output to C5: Request for obstacle information. Input from C6 via C4: Current position, orientation, speed and acceleration. Input from C4: Desired route as Bezier or Hermite cubic curve segments. Input from C4: Desired speed profile. Input from C5: Range to obstacles in various sectors.

C4: Path Planner. Output to C3: Desired route and speed curves, and whether the route exits a road or changes lanes. This module reads files from an SD card. Input: RNDF, MDF and initial position files. Input from C6: Position, orientation, velocity and acceleration.

C5: Obstacle Detection. Input from S11: Ranges to obstacles. Input from C3: Request for obstacle information. Output to C3: Range to obstacles in various sectors.

C6: Navigator. Fuses all position estimates with dead reckoning. Output to C3, C4, C7: Location, orientation and velocity. Output to C7: Expected cone position. Inputs from S1, S2, S3, C5 (visual odometry), S5, S6, S7 and S8.
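As a toy example of the bookkeeping C6 performs between absolute fixes, the sketch below dead-reckons east and north offsets from speed and heading. The variable names, units and heading convention (0 = north, positive clockwise) are assumptions; real fusion also weights the GPS, compass and IMU readings.

```cpp
// Minimal dead-reckoning sketch, not the actual C6 navigator code.
float east_cm = 0.0;      // position east of the origin
float north_cm = 0.0;     // position north of the origin
float heading_deg = 0.0;  // assumed convention: 0 = north, positive clockwise

// Advance the position estimate by one time step of dt_s seconds.
void deadReckon(float speed_cm_s, float dt_s) {
  east_cm  += speed_cm_s * dt_s * sin(radians(heading_deg));
  north_cm += speed_cm_s * dt_s * cos(radians(heading_deg));
}

void setup() {
  Serial.begin(115200);
  heading_deg = 90.0;       // pointing east
  deadReckon(250.0, 0.1);   // 250 cm/s for 0.1 s moves about 25 cm east
  Serial.print(east_cm);
  Serial.print(", ");
  Serial.println(north_cm);
}

void loop() {}
```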

C7: Visual Data Management. Passes visual cone detection and lane following information from S4. Output: text information extracted from camera (S4). Output to C6: Deviation from lane following. Input from C6: Expected cone position. Output to C6: Location of cone.

#System Context

##Communication Requirements

At initialization, all processors receive: the latitude and longitude of the origin (all other positions are in meters or centimeters), the positions of the destinations, and the starting position and orientation of the trike.

Processors:

  • C2: Low level control
  • C3: Pilot
  • C4: Planner
  • C5: Obstacle detection
  • C6: Localization
  • C7: Vision

###Main Communications Paths

Serial: C7 ↔ C6 → C4 → C3 → C2 → C6

GPS ↔ C6

IMU ↔ C6 via serial or I2C

GPS → IMU

###Other Paths

Interrupt: Speedometer (wheel click) → C2, C6

Analog Input: Steer angle → C2

Obstacles via SPI: C5 front → C3; C5 rear → C3

Camera → C7

Map on SD card ↔ C4

C2 → Speed, brake, and turn actuators.

Data originates (or is formatted from another form) as:

  • C7: Position of cone in image; probability that cone is visible in image.
  • C6 → C7: Expected position and size of cone in image.
  • C2: ActualSpeed, ActualTurnAngle
  • C6: Best estimate of current position and orientation
  • C6: Best estimate of destination position
  • C5: Distances to obstacles and angular orientation and width of the sensors
  • C4: Sequence of curve segments for next part of desired path. Each segment has starting and ending positions, tangents, recommended speeds, and path widths.
  • C3: CommandedSpeed, CommandedTurnAngle.

###Serial Commands

Documentation/CommunicationsRequirements.html

Documentation/SerialCmd.html

There are two command formats, distinguished by the first character of the keyword. Commands starting with '$' consist of comma-separated values followed by a checksum. Commands starting with any other character have their arguments grouped by braces into keywords and values. The former format may be used with instruments; the latter is the format used for interprocessor communication.
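As a minimal illustration of the brace format, the hypothetical helper below pulls one {keyword value} pair out of a command string. It is far simpler than the real Elcano_Serial parser, and uses a DRIVE command (defined in the next subsection) as its example.

```cpp
// Minimal extraction of one {keyword value} pair from a brace-format command;
// valueFor() is a hypothetical helper, far simpler than the Elcano_Serial parser.

// Return the integer following 'key' inside braces, or 'fallback' if absent.
long valueFor(const String &msg, const String &key, long fallback) {
  String pattern = String("{") + key + " ";
  int start = msg.indexOf(pattern);
  if (start < 0) return fallback;
  int end = msg.indexOf('}', start);
  if (end < 0) return fallback;
  return msg.substring(start + pattern.length(), end).toInt();
}

void setup() {
  Serial.begin(115200);
  String cmd = "DRIVE {Speed 259} {Ang -3}";   // a DRIVE command (see below)
  Serial.println(valueFor(cmd, "Speed", 0));   // prints 259
  Serial.println(valueFor(cmd, "Ang", 0));     // prints -3
}

void loop() {}
```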

###DRIVE

The GameBots serial command format allows the vehicle to be replaced by the USARSim simulator. Since USARSim is no longer available, the GameBots format has been modified:

  1. DRIVE {Speed CommandedSpeed} {Ang CommandedSteerAngle}

Where: {Speed CommandedSpeed} CommandedSpeed is an integer giving the speed of the rear wheel, in centimeters per second. {Ang CommandedSteerAngle} CommandedSteerAngle is an integer giving the absolute steer angle of Elcano’s front wheels, in degrees.

Example: DRIVE {Speed 259} {Ang -3}
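Building and sending the DRIVE message might look like the sketch below; the serial port, baud rate, send rate and command values are assumptions.

```cpp
// Illustrative construction of a DRIVE message, e.g. on C3. The serial port,
// baud rate, send rate and command values are assumptions.
#include <stdio.h>

int commandedSpeed = 259;       // cm/s
int commandedSteerAngle = -3;   // degrees
char driveMsg[48];

void setup() {
  Serial.begin(115200);         // assumed link to C2 (a hardware port on a Mega)
}

void loop() {
  snprintf(driveMsg, sizeof(driveMsg), "DRIVE {Speed %d} {Ang %d}",
           commandedSpeed, commandedSteerAngle);
  Serial.println(driveMsg);     // C2 parses this and sets the actuators
  delay(100);                   // roughly 10 Hz
}
```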

###SENSOR

A similar format will be used for sensed information from C6.

2. SENSOR {Speed ActualSpeed} ActualSpeed gives the speed of the rear wheel, in centimeters per second. Other sensor messages are:

3. SENSOR {Ang Deg} Steering angle of the front wheels, in degrees. 0 degrees is straight ahead; small positive numbers are degrees to the right, and negative numbers (mod 360) are degrees to the left.

4. SENSOR {Pos EPosMeters,NPosMeters} {Br Deg} Best estimate of vehicle position, fused from all sensors. The East and North positions are in meters relative to the origin. The bearing tells which way the vehicle is pointing.

###OTHER

5. GOAL {Num n} {Pos EPosMeters,NPosMeters} {Br Deg}

5a. GOAL {Num n} {Pos EPosMeters,NPosMeters} {Br Deg} {Prob n}

6. SEG {Num n} {Pos EPosMeters,NPosMeters} {Br Deg} {Speed SegSpeed}

GOAL gives the positions of the cones, which may be updated from visual information. The next goal is always {Num 1}. If a bearing is present, it is the desired direction of approach to the goal.

The localization processor passes messages 4 (SENSOR) and 5 (GOAL) to the vision processor. The vision processor then computes the expected position of the cone in the image. After processing the image, it computes an updated cone position, including the probability that a cone is present in the image, and passes message 5a back to the localization processor.

Segment 1 runs from the present position to the next desired position. It is followed by segments 2, 3, 4 and 5; more distant segments are not transmitted. The speed on a segment is the recommended speed, taking account of conditions and turning radius.

###GPS

NMEA output sentences have the form $GPxxx; these include $GPRMC, $GPVTG, $GPGGA, $GPGSA and $GPGSV. See http://www.adafruit.com/datasheets/GlobalTop-FGPMMOPA6H-Datasheet-V0A.pdf

Communication to the GPS receiver uses NMEA PMTK packets such as $PMTK220, $PMTK314 and $PMTK605. See http://www.adafruit.com/datasheets/PMTK_A08.pdf
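Both the $GP and $PMTK sentences carry the standard NMEA checksum: the XOR of every character between '$' and '*'. The sketch below computes it for reference; in practice the GPS library handles this.

```cpp
// Standard NMEA checksum: XOR of all characters between '$' and '*'.
byte nmeaChecksum(const char *sentence) {
  byte checksum = 0;
  if (*sentence == '$') sentence++;          // skip the leading '$'
  while (*sentence && *sentence != '*') {    // stop at '*' or end of string
    checksum ^= (byte)*sentence++;
  }
  return checksum;
}

void setup() {
  Serial.begin(115200);
  // $PMTK220,1000*1F sets a 1 Hz update rate; 0x1F is its checksum.
  Serial.println(nmeaChecksum("$PMTK220,1000*1F"), HEX);
}

void loop() {}
```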

###Maps

The format is similar to NMEA output. '*xx' means the character '*' followed by a checksum.

$MAPLS,N*xx: Requests processor N to list the names of all maps.

$MAPTX,N,count,num,name*xx: Processor N transmits the name of map num of count. Count is the number of maps on the processor.

$MAPSL,N,name*xx: Requests processor N to send the contents of map 'name'.

$MAPLN,N,count,num,data*xx: Processor N sends line num of count for the requested map.

'Data' consists of: node,latitude,longitude,PositionEast,PositionNorth,Link1,Link2,Link3,Link4,Dist1,Dist2,Dist3,Dist4

where node is the name or number of the point; latitude and longitude are in decimal degrees; positions are in meters; the Links are other nodes that are directly linked to this node (if there are more than four, use two different names for the present node, separated by zero distance; if there are fewer than four, the field is left blank); and the Dist values are distances in meters along the road to the corresponding linked node.

Example data:

8,47.760342,-122.189784,71.903,154.917,2,7,15,,268.441,142.961,269.358,

15,47.757929,-122.189467,95.599,-113.397,8,16,18,,269.358,26.369,182.216,

13,47.621921,-122.348628,170.313,69.008,14,,,,18.578,,,,
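A hedged sketch of turning one such record into a struct follows. The struct layout is an assumption, and blank fields are written as 0 here because strtok() collapses empty fields; a real parser must handle blanks explicitly.

```cpp
// Hedged parse of one map node record; struct layout is an assumption.
#include <stdlib.h>
#include <string.h>

struct MapNode {
  long  id;                   // node name or number
  float latitude, longitude;  // decimal degrees
  float east_m, north_m;      // meters from the origin
  long  links[4];             // directly linked nodes (0 = unused)
  float dist_m[4];            // road distance to each link, meters
};

MapNode parseNode(char *line) {
  MapNode n;
  n.id        = atol(strtok(line, ","));
  n.latitude  = atof(strtok(NULL, ","));
  n.longitude = atof(strtok(NULL, ","));
  n.east_m    = atof(strtok(NULL, ","));
  n.north_m   = atof(strtok(NULL, ","));
  for (int i = 0; i < 4; i++) n.links[i]  = atol(strtok(NULL, ","));
  for (int i = 0; i < 4; i++) n.dist_m[i] = atof(strtok(NULL, ","));
  return n;
}

void setup() {
  Serial.begin(115200);
  char record[] = "8,47.760342,-122.189784,71.903,154.917,2,7,15,0,268.441,142.961,269.358,0";
  MapNode node = parseNode(record);
  Serial.println(node.id);       // 8
  Serial.println(node.east_m);   // 71.90
}

void loop() {}
```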

The maps themselves are internal to C4 and its SD card. They do not have to be transmitted. C4 needs to have additional information about the route. From this information, it will produce the next portion of the drive path, which will be transmitted to C3.

The map for the SD card must contain information about the links between nodes. Each link will have the following information:

$LNKLST,startNode,endNode,numberOfSegments,segmentIndex1,...,segmentIndexN

$LNKSEG,index,nextIndex,PositionEast,PositionNorth,bearing,width,speed
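One possible in-memory form of the $LNKSEG record is sketched below; the field types, units and the -1 end-of-list convention are assumptions.

```cpp
// Possible in-memory form of one $LNKSEG record; field names follow the
// sentence above, while types, units and the -1 sentinel are assumptions.
struct LinkSegment {
  int   index;        // this segment's index within the link
  int   nextIndex;    // index of the following segment, or -1 for the last one
  float east_m;       // PositionEast, meters from the origin
  float north_m;      // PositionNorth, meters from the origin
  float bearing_deg;  // direction of travel along the segment
  float width_m;      // usable path width
  float speed_cm_s;   // recommended speed (units assumed to match DRIVE)
};

void setup() {
  Serial.begin(115200);
  LinkSegment seg = {1, -1, 71.9, 154.9, 90.0, 3.0, 250.0};  // illustrative values
  Serial.println(seg.east_m);
}

void loop() {}
```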

#System Design

##Naming Conventions

Magellan did not sail around the world, since he died in the Philippines. The first people to circumnavigate the globe were 18 members of Magellan's crew, under the leadership of Juan Sebastian Elcano (http://en.wikipedia.org/wiki/Juan_Sebastián_Elcano).

##Software Development Tools

Arduino IDE (version 1.6.9): Used to write the C/C++ code that compiles and runs on the Arduino boards to control the trike.

GitHub: Used for source control and managing project files.

Slack: Used for inter-team communication.

Kerika: Used for managing the scrum boards and various tasks.

#Component Description

##Component Identifier

###Software

Code is provided for each of the microcontrollers in the vehicle. These are .ino files for the Arduino environment, but they are really just text files of C code. Thus they should run on almost any other machine, as long as the Arduino library routines are provided. The microcontrollers are:

  • C1: Traction, throttle control

This unit is the purchased e-bike motor control unit used to send battery power to the hub motor

  • C2: Dual mode control

Decides whether to use C3, RC, or joystick commands to control the vehicle.

  • C3: Pilot

Presents C2 with a segment of the path

  • C4: Path Planner

Reads a map and produces a route to the destination

  • C5: Obstacle Detection

Reports if there is anything in the way

  • C6: Navigator

Reads all the sensors and figures out vehicle location

  • C7: Visual Data Management

Filters smart camera output for lane following, cone detection and obstacle detection

#Cost Estimate

##Bike Bill of Materials (vehicle #1 as of 2011)

Documentation/Elcano_BOM.html

Documentation/Elcano_BOM.xls

This comes to about $3,700 in materials.

##Sonar Bill of Materials (2015)

Documentation/Sonars_BOM.xls
