This is the control program for Kartobot, a tracked robot. It uses active sonar and a camera to navigate its surroundings, and listens for high-level remote control from a PC or mobile phone. The goal is for the robot to survive handing the controls to a five-year-old: it should actively avoid hitting walls even while receiving commands to do so.
Build log (registration required to view the pictures there; pictures only here).
Communication
- internal
- with Zircon 4 (the robot mainboard, with an STM32F437ZGT6) using a serial port over USB.
- external
- short range: USB Wi-Fi for normal control over TCP/UDP.
- long range: XBee connected to Zircon 4.
- ncurses GUI over SSH for settings and system status
Program environment
- language: C++
- libraries: ROS
- make utility: CMake
- version control: Git (@GitHub)
- Runs on a Parallella
- Vision code will probably run on a pair of Raspberry Pi 2 with cameras
- Code style is similar to standard ROS
Objectives
- Read data from serial port connected to Zircon 4
- Read data forwarded from Zircon 4's XBee
- sensors: sonar, motor status, battery level
- Read data from network (wifi)
- commands from computer/phone
- send status and video
- Create map from sonar data
- avoid hitting walls
- Detect things with camera
- library: OpenCV?
- objects to detect: doors, colored items
- utilize Epiphany coprocessor
Structure
- Zircon 4 communication: get sensor values, send motor commands
- Serial port reader (kbot_bridge)
- Network: Maintain list of connected clients
- Parser: understand commands; keep track of where each request came from so the response goes back to the right source
- Connection sources: wifi tcp, xbee
- Map
- Mapping: Take sonar pings from Zircon 4 (kbot_bridge) and place on map as an arc (kbot_mapper)
- Navigation: plan routes around obstacles
- Drift correction: try to correct for sensor drift over time
- Vision
- Capture: take images from the cameras
- Detection: Detect objects/lines/patterns
- Memory: Remember objects from last frames?
Libraries