Nathaniel Renegar edited this page Sep 19, 2019 · 34 revisions

Robotics @ Maryland Wiki

Welcome to the Robotics @ Maryland Wiki! This wiki should serve as an introduction to most of the things you’ll need to know if you want to learn and contribute to our software system.

If you’re interested in the electrical team, you can find their repository here.

If you’re interested in the mechanical team, you should check out our website, ram.umd.edu, and come to one of our meetings with your laptop.

Overview

Our robot is powered by a Jetson TX1, running Ubuntu 16.04. We use ROS and program primarily in Python and C++. For our embedded system, we currently use a TI Tiva C running FreeRTOS.

If you didn’t understand most of that, don’t worry. Most of our members didn’t understand it before they joined, either. We’ll teach you the things you need to know.

Getting Started

If you’re interested in working on our software system, you’ll first need an install of Ubuntu 16.04. There are a couple of ways you can do this. We usually recommend you either:

  • Install virtual machine software, like VMware (which you can get from TERPware) or VirtualBox. Virtual machine software lets you run another operating system just like any other program. Most of our members have found that VMware runs a bit better, but a few have found that VMware causes their computer to blue screen on install, or that the version on TERPware doesn’t support the most recent Linux kernels as a host; in either case, you’ll want to use VirtualBox.
  • Run Ubuntu natively, replacing or installing it alongside your current operating system. Running it natively will give you a much smoother experience, but you might run into trouble if you screw something up or we make a decision that requires a lot of backend changes.

For new members, we recommend installing and running everything in a virtual machine. If things go wrong, they’re much easier to fix, and they won’t break anything else on your computer. Whichever you choose, you’ll need an Ubuntu 16.04 image, which you can get here. If you’re new to Linux, make sure to get the ‘desktop’ image instead of ‘server.’

Setting Up Your Virtual Machine

If you’ve chosen to use a virtual machine, you’ll need to install and configure your new Ubuntu install. We’ve found that both VMware and VirtualBox provide a moderately easy-to-use install interface, but if you’re stuck, you can find the instructions for VMware here and VirtualBox here. If you’ve chosen VirtualBox, you may also want to install the Guest Additions. This is a little complicated but isn’t strictly necessary; it mainly adds more resolution options to your virtual machine. You can find the guide to install the Guest Additions here.

Important:

  • Make sure you give your virtual machine more than 2GB of RAM! If you don’t, some things won’t compile correctly.
  • Give your virtual machine a significant amount of your CPU. You’ll be compiling things on it, so you’ll want as much power as you can spare.

Setting Up Our Environment

Once you’ve installed Ubuntu and set it up, you’ll need to configure our repository. If you’re new to git, we strongly recommend you go through a tutorial, like this one, or this one. After you’re comfortable with git, you should open a terminal and run:

$ sudo apt install git
$ git clone https://github.com/robotics-at-maryland/qubo.git

(To open a terminal in Ubuntu, you can hit the Windows/Command key and begin typing “terminal” until you see it)

The Vimba camera drivers we use for our main camera are no longer online. Before you run our dependencies script, you’ll need to download the drivers here. Once you have `vimba.tgz`, run these commands:

$ mkdir -p ~/src/vimba
$ mv <where you downloaded>/vimba.tgz ~/src/vimba

After that, you’ll need to run the install script install_dependencies.bash, located in the scripts directory. This script will download and install all the other software and libraries you’ll need to compile our code. This may take a while (check the Troubleshooting section below if you run into problems). To run the script, run the following in a terminal:

$ cd qubo
$ bash ./scripts/install_dependencies.bash

Due to a bug in the current build script, you also need to run:

$ source ~/.bashrc
$ cd ~/catkin_ws
$ catkin_make

Once you’ve run the install script, you should execute catkin_make in the main folder of the repository to compile and build our software. If there are no errors, everything should be installed correctly. From the same terminal, run:

$ catkin_make

Or from a fresh terminal:

$ cd qubo && catkin_make

Once you’ve gotten our software to compile, you can start looking through our code and begin to familiarize yourself with some of the libraries we use.

If you’re interested in working on the high level software of the robot, we recommend everyone create a basic ROS publisher and subscriber, and then submit the code to us as a pull request. It’s a great way to learn the basics of the high level system.
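As a starting point, here is a hedged sketch of a minimal rospy publisher (the node name "talker", the topic "chatter", and the message text are placeholders, not part of our codebase); the official ROS tutorials cover the matching subscriber:

```python
# Minimal ROS publisher sketch. Assumes ROS Kinetic with rospy installed;
# all node/topic names here are illustrative placeholders.

def make_message(count):
    """Build the string payload for one publish cycle."""
    return "hello from qubo %d" % count

def main():
    # rospy is imported here so the helper above can be used without ROS.
    import rospy
    from std_msgs.msg import String

    rospy.init_node("talker", anonymous=True)
    pub = rospy.Publisher("chatter", String, queue_size=10)
    rate = rospy.Rate(10)  # publish at 10 Hz
    count = 0
    while not rospy.is_shutdown():
        pub.publish(String(data=make_message(count)))
        count += 1
        rate.sleep()

if __name__ == "__main__":
    main()
```

You can run a node like this with rosrun once roscore is up, and watch the messages with rostopic echo.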

Build System

We use ROS catkin to build our software. If you have everything configured/sourced correctly, you should only need to run:

$ catkin_make

in the top level of the repository to build it. If something goes wrong, you may have forgotten to source a couple of files needed to get everything to compile correctly. Make sure your .bashrc has the variables and sources set up by the install script; it should contain the following lines (not necessarily in the same order):

source /usr/share/gazebo-7/setup.sh
source /opt/ros/kinetic/setup.bash
source /home/${YOUR USERNAME}/catkin_ws/devel/setup.sh
export GAZEBO_PREFIX=/home/${YOUR USERNAME}/catkin_ws/install
export GAZEBO_RESOURCE_PATH=/share/gazebo-7.0:
export GAZEBO_MODEL_PATH=/share/gazebo-7.0/models:
export GAZEBO_PLUGIN_PATH=/lib:/lib/x86_64-linux-gnu:

Repository Structure

Now that you’ve downloaded and compiled the repository, we’ll discuss how everything is organized.

qubo
├── build
├── devel
├── embedded
├── qubobus
├── scripts
└── src

qubo is the root/top folder of the repository. All of our code is contained in this folder.

build is a directory generated by catkin when it compiles our code. You may not have this when you clone the repository for the first time. If you’re getting weird issues when you compile, you can try deleting this folder to clean up any old build artifacts.

devel contains helpful scripts generated by catkin. You can run $ source devel/setup.bash from the root folder to get tab completion for our specific ROS programs (when you’re in a terminal, hitting the tab key will attempt to auto-complete the line).

embedded contains our embedded code. Catkin doesn’t touch anything here when it compiles the repository, since we use a different compiler for our low-level code.

qubobus contains code that defines a message protocol shared between the embedded system and the high level system. The code in this folder is compiled into some of our code base by catkin. It’s also linked in the embedded folder, where it’s compiled into the embedded system.

scripts contains useful scripts that download and install various things.

src contains all of our high-level code.

ROS

Our high level software uses the ROS framework to pass messages and information between different components of the system. The system is currently split into three parts: planning, vehicle layer, and controls.

The planning subsystem uses Python and rospy to interface with the vision/controls subsystem to determine where our robot is, and what it wants to do next.

The vehicle layer is our ROS interface between drivers and the rest of the system. It receives values from various hardware and interprets/publishes them to ROS topics for the other subsystems.

The controls subsystem takes the information from the vehicle layer, and uses it to figure out our heading/velocity/acceleration. It also takes the desired movement/position from the planning subsystem, converts it to specific motor/thruster commands, and relays that information to the vehicle layer.

All of our code exists as independent ROS nodes. Each node usually has one task, like interacting with one sensor or identifying one kind of object. When you create a new ROS node, you’ll need to specify its dependencies and how to build it in the CMakeLists.txt file in the directory above your source file. ROS uses the CMake build system to build nodes, and CMake uses the CMakeLists.txt file to figure out what it should build. If you’re having issues with things not getting built or missing dependencies, CMakeLists.txt is where you should look first.
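As a sketch of what those CMakeLists.txt entries typically look like (the target name example_node and file src/example_node.cpp are hypothetical, not files in our repository):

```cmake
# Hypothetical node: build example_node from src/example_node.cpp
add_executable(example_node src/example_node.cpp)

# Link against the catkin/ROS libraries declared in find_package(catkin ...)
target_link_libraries(example_node ${catkin_LIBRARIES})

# Make sure generated messages/services are built before this node
add_dependencies(example_node ${catkin_EXPORTED_TARGETS})
```

If catkin_make can’t find your node, check that these three entries exist and that the package’s find_package and catkin_package calls list the packages your node depends on.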

Some good ROS concepts to know are:

  • Publishers and Subscribers (Python, C++)
  • Services and Clients (Python, C++)
  • ROS Message Definitions (msg)
  • ROS Service Definitions (srv)

How Do I Run It?

To start testing things you’ve written with ROS, or just to make sure everything you just built runs, you’ll need to know how to correctly start our system.

If you want to just manually run a few specific nodes, you should first call roscore to start an instance of the main ROS server on your machine. After that, you can either open a new terminal or background the roscore process and run rosrun $package $node, where $package is your package (like vl_qubo, vision, etc) and $node is an executable you’ve defined in your CMakeLists.txt and built (like qubo_ahrs_node, vision_node, etc).

The other, and probably easier, option is to use roslaunch. roslaunch takes a .launch XML file and parses it to run multiple nodes with set parameters. It will also start and kill roscore in the background, so you don’t have to worry about running it first. All of our current launch files live in src/launch and can be used to launch various parts of the system, or the entire thing.
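As a hedged sketch of the format (the node name "ahrs" and the parameter are placeholders; the package/executable pair is borrowed from the rosrun example above, so check src/launch for the real files), a minimal .launch file looks like:

```xml
<launch>
  <!-- Start one node: pkg is the package, type is the built executable -->
  <node pkg="vl_qubo" type="qubo_ahrs_node" name="ahrs" output="screen"/>

  <!-- Parameters can be set on the parameter server for nodes to read -->
  <param name="rate" value="10"/>
</launch>
```

Running roslaunch with a file like this starts roscore (if needed) and every listed node in one step.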

Troubleshooting

Here are some common things to try to resolve problems.

  • Close your terminal and reopen it (or source your ~/.bashrc file again)
  • Give your virtual machine more RAM
  • pip install catkin_pkg rospkg empy
  • Make sure the simulator builds in ~/catkin_ws (run catkin_make in it)

Embedded

The embedded system handles the safety, monitoring, and thruster control of the robot.

Tiva TM4C123G

The main computer and the embedded system communicate with each other over a UART interface using a protocol called Qubobus. The endpoints that implement this communication are called the QSCU (Qubo System Control Unit). Both endpoints use the same code to assemble and disassemble messages from a generic bytestream.
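The real wire format lives in the qubobus folder; purely as an illustration of the shared-bytestream idea (the header layout and checksum below are invented for this sketch, not the actual Qubobus protocol), framing code on either endpoint might look like:

```python
import struct

# Invented header for illustration: 1-byte message type, 2-byte
# little-endian payload length. The real Qubobus layout differs.
HEADER = struct.Struct("<BH")

def encode(msg_type, payload):
    """Frame a payload with a header and a one-byte additive checksum."""
    frame = HEADER.pack(msg_type, len(payload)) + payload
    checksum = sum(frame) % 256
    return frame + bytes([checksum])

def decode(frame):
    """Validate the checksum and length, then return (msg_type, payload)."""
    body, checksum = frame[:-1], frame[-1]
    if sum(body) % 256 != checksum:
        raise ValueError("checksum mismatch")
    msg_type, length = HEADER.unpack(body[:HEADER.size])
    payload = body[HEADER.size:]
    if len(payload) != length:
        raise ValueError("length mismatch")
    return msg_type, payload
```

Because both sides share the same encode/decode code, a message assembled on the Tiva can be disassembled byte-for-byte on the Jetson, and vice versa.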

We’re currently using a TM4C123GXL. The system runs FreeRTOS, an open-source real-time operating system. The toolchain used to compile the code is the GNU Arm Embedded Toolchain. To flash the microcontroller, you can use lm4tools.

To get started, run the scripts/embedded_install_deps.bash file to install all the dependencies required. Ubuntu 16.04 is recommended. We have a book about getting started with FreeRTOS, email software@ram.umd.edu if you want it. Read the embedded/README.md and embedded/guide.md for more information on the Tiva embedded software.