This repository shows a basic implementation of the ROS ar_track_alvar library on Ubuntu 18.04 LTS using ROS Melodic and a Logitech C525 USB camera.
- ROS Melodic Installation
- Creating and Configuring a ROS Environment
- Installing AR Tag Tracking library
- Running Launch Files
Add the ROS repository to your Ubuntu/Debian sources.list:
sudo sh -c 'echo "deb http://packages.ros.org/ros/ubuntu $(lsb_release -sc) main" > /etc/apt/sources.list.d/ros-latest.list'
Add the keys for accessing it:
sudo apt-key adv --keyserver 'hkp://keyserver.ubuntu.com:80' --recv-key C1CF6E31E6BADE8868B172B4F42ED6FBAB17C654
At this point you should have an output like this:
Executing: /tmp/apt-key-gpghome.WudTyznLyJ/gpg.1.sh --keyserver hkp://keyserver.ubuntu.com:80 --recv-key C1CF6E31E6BADE8868B172B4F42ED6FBAB17C654
gpg: key F42ED6FBAB17C654: public key "Open Robotics <info@osrfoundation.org>" imported
gpg: Total number processed: 1
gpg: imported: 1
Update the packages list:
sudo apt update
At this time, the output should be like the following:
Hit:1 http://eu-west-1.ec2.archive.ubuntu.com/ubuntu bionic InRelease
Hit:2 http://eu-west-1.ec2.archive.ubuntu.com/ubuntu bionic-updates InRelease
Hit:3 http://eu-west-1.ec2.archive.ubuntu.com/ubuntu bionic-backports InRelease
Get:4 http://security.ubuntu.com/ubuntu bionic-security InRelease [88.7 kB]
Get:5 http://packages.ros.org/ros/ubuntu bionic InRelease [4669 B]
Get:6 http://packages.ros.org/ros/ubuntu bionic/main amd64 Packages [569 kB]
Fetched 662 kB in 2s (406 kB/s)
Reading package lists... Done
Building dependency tree
Reading state information... Done
3 packages can be upgraded. Run 'apt list --upgradable' to see them.
There are several options when it comes to ROS installation: ROS Base, ROS Desktop, and ROS Desktop Full. In this case we are going to choose the ROS Desktop Full option.
ROS Desktop Full installs everything in the ROS Desktop option plus 2D/3D simulators and 2D/3D perception packages.
Install ROS Desktop Full:
sudo apt install ros-melodic-desktop-full
After running the command, you should have an output like this:
0 upgraded, 485 newly installed, 0 to remove and 3 not upgraded.
Need to get 234 MB of archives.
After this operation, 1132 MB of additional disk space will be used.
Do you want to continue? [Y/n]
Setting up ros-melodic-nodelet-topic-tools (1.9.16-0bionic.20190601.015001) ...
Setting up ros-melodic-nodelet-core (1.9.16-0bionic.20190601.015433) ...
Setting up ros-melodic-roswtf (1.14.3-0bionic.20190601.014658) ...
Setting up ros-melodic-ros-comm (1.14.3-0bionic.20190601.015500) ...
Setting up ros-melodic-ros-core (1.4.1-0bionic.20190601.015718) ...
Setting up ros-melodic-ros-base (1.4.1-0bionic.20190808.193524) ...
Processing triggers for libgdk-pixbuf2.0-0:amd64 (2.36.11-2) ...
Processing triggers for libc-bin (2.27-3ubuntu1) ...
user@computer:~$
Important! - Troubleshooting
Note that due to differences among the Linux distributions that support ROS Melodic, you could run into problems in the next section, such as the following error:
user@computer:~$ rosdep: command not found
If that's the case, don't worry; just run these commands:
sudo apt-get install python-pip
sudo pip install -U rosdep
or use the recommended method on Ubuntu:
sudo apt-get install python-rosdep
ROS has a client that manages commands and dependencies, called rosdep.
Initialize rosdep:
sudo rosdep init
The output would be the following:
Wrote /etc/ros/rosdep/sources.list.d/20-default.list
Recommended: please run

	rosdep update

Update rosdep:
rosdep update
The output for running the previous command:
reading in sources list data from /etc/ros/rosdep/sources.list.d
Hit https://raw.githubusercontent.com/ros/rosdistro/master/rosdep/osx-homebrew.yaml
Hit https://raw.githubusercontent.com/ros/rosdistro/master/rosdep/base.yaml
Hit https://raw.githubusercontent.com/ros/rosdistro/master/rosdep/python.yaml
Hit https://raw.githubusercontent.com/ros/rosdistro/master/rosdep/ruby.yaml
Hit https://raw.githubusercontent.com/ros/rosdistro/master/releases/fuerte.yaml
Query rosdistro index https://raw.githubusercontent.com/ros/rosdistro/master/index-v4.yaml
Skip end-of-life distro "ardent"
Skip end-of-life distro "bouncy"
Add distro "crystal"
Add distro "dashing"
Add distro "eloquent"
Skip end-of-life distro "groovy"
Skip end-of-life distro "hydro"
Skip end-of-life distro "indigo"
Skip end-of-life distro "jade"
Add distro "kinetic"
Skip end-of-life distro "lunar"
Add distro "melodic"
Add distro "noetic"
updated cache in /home/ubuntu/.ros/rosdep/sources.cache
We now have ROS and the dependency manager installed, so let's configure our environment. This is an important step: once it's done, working with ROS will be smooth.
ROS is installed at /opt/ros/<distro> (in this case /opt/ros/melodic). In order to have the ROS commands available, you need to source the setup file inside the installation folder:
source /opt/ros/melodic/setup.bash
But since we want the commands available in every terminal we open, there is a shortcut: add this command to the file /home/<user>/.bashrc. The .bashrc file is executed every time a new terminal is opened, so we won't need to source the ROS setup manually. To add the command to the file, edit it with an editor of your preference or just execute the command below:
echo "source /opt/ros/melodic/setup.bash" >> ~/.bashrc
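A quick gotcha with the command above: re-running the echo line appends a duplicate "source" entry to .bashrc each time. A minimal sketch of a safer append (the helper name and demo path are hypothetical; it works on any rc file):

```python
import os
import tempfile

def append_once(path, line):
    """Append `line` to `path` only if it is not already present."""
    try:
        with open(path) as f:
            if line in (l.rstrip("\n") for l in f):
                return False  # already there, leave the file alone
    except IOError:  # file does not exist yet
        pass
    with open(path, "a") as f:
        f.write(line + "\n")
    return True

# Demo on a scratch file instead of the real ~/.bashrc
rc = os.path.join(tempfile.mkdtemp(), "bashrc_demo")
line = "source /opt/ros/melodic/setup.bash"
print(append_once(rc, line))  # True: line was added
print(append_once(rc, line))  # False: nothing appended on the second run
```

The same guard can be done in one shell line with grep -qxF, but the idea is identical: check before appending.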
Now create a ROS Workspace:
mkdir -p ~/catkin_ws/src
cd ~/catkin_ws/
catkin_make
Additionally, if you look in your current directory you should now have 'build' and 'devel' folders. Inside the 'devel' folder you can see that there are now several setup.*sh files. Sourcing any of these files overlays this workspace on top of your environment. To learn more, see the general catkin documentation. Before continuing, source your new setup.*sh file:
source devel/setup.bash
To make sure your workspace is properly overlaid by the setup script, check that the ROS_PACKAGE_PATH environment variable includes the directory you're in:
echo $ROS_PACKAGE_PATH
The output would be the following:
/home/youruser/catkin_ws/src:/opt/ros/melodic/share
Install the library:
sudo apt-get install ros-melodic-ar-track-alvar
Install the Kinect driver:
sudo apt-get install ros-melodic-freenect-launch
Now that we have the package, let's make our own AR tag! To do that, either make a folder to keep all of your tags, or navigate to an existing one where you want the images written. Once there, run:
rosrun ar_track_alvar createMarker 0
You should see a new file named MarkerData_0.png in your current directory. You can open the image from your terminal by running:
eog MarkerData_0.png
You should see an image like the following pop up on your screen. Note that AR tags have different patterns based on the number you requested: a zero will always look like a zero, a one will always look like a one, and so on. This is really useful if you tag an object for a robot to interact with using a specific number, because you can reliably and repeatably select that object again.
Another note on printing: you can create AR tags at different sizes, so whether you want one really big tag or several small ones, you can have that! Just note what size you printed, because that will be important later. The default size for these tags is 9 units x 9 units. For example, if you wanted to output a number 3 tag that was 5 cm x 5 cm, the syntax would be:
rosrun ar_track_alvar createMarker -s 5 3
Important!
Due to differences in how your printer formats an image, or if you put several tags on a single page, the actual printed size of the tag can change. Measure the printed tag to be absolutely sure of its size, because that size is used to estimate the tag's position with respect to the camera: if the software doesn't know a tiny tag is tiny, it will think the tag is really far away.
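The effect above can be sketched with a pinhole camera model: the estimated distance scales linearly with the marker size the software is configured with. All the numbers here are hypothetical, just to show the scaling:

```python
# Pinhole estimate: distance = focal_length * real_size / size_in_image.
def estimated_distance_cm(configured_size_cm, focal_px, tag_width_px):
    return focal_px * configured_size_cm / tag_width_px

focal, width = 600.0, 60.0  # example focal length and observed tag width, in pixels
print(estimated_distance_cm(5.0, focal, width))  # 50.0 cm with the true 5 cm size
print(estimated_distance_cm(9.0, focal, width))  # 90.0 cm if a 9 cm tag is assumed
```

So a tag configured at nearly twice its real size is reported nearly twice as far away, which is why measuring the print matters.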
Clone the repository into the catkin_ws workspace:
git clone https://github.com/alainware/ar-track-kinect-ros
- In order to use your USB camera with ROS, you need to install the usb_cam package:
sudo apt install ros-melodic-usb-cam
- For listing media devices, install the following package called v4l-utils:
sudo apt-get install v4l-utils
- Test the package by typing the following command in the terminal:
v4l2-ctl --list-devices
- For managing media devices and file formats, install the following package called ffmpeg:
sudo apt install ffmpeg
- Test the package by playing your camera device (use the device path reported by v4l2-ctl; /dev/video2 here is an example):
ffplay /dev/video2
- Additionally, you will need to install the following packages:
sudo apt-get install ros-melodic-video-stream-opencv
sudo apt-get install ros-melodic-image-pipeline
- Finally, update Ubuntu packages:
sudo apt-get update
- Open a terminal in catkin_ws/src and create the package:
catkin_create_pkg ar_tag_rover std_msgs rospy
- In your custom package ar_tag_rover, create a new folder called launch. Inside, create a file called camera.launch and copy the code below into it. It is a modified version of the camera.launch file from video_stream_opencv. Note that video_stream_provider may have to be changed to 1 or 2 if you are using an external camera. If you are using a virtual machine, you will need to enable the webcam under Devices > Webcam in the VirtualBox menu.
<launch>
<arg name="camera_name" default="camera" />
<!-- video_stream_provider can be a number as a video device or a url of a video stream -->
<arg name="video_stream_provider" default="2" />
<!-- frames per second to query the camera for -->
<arg name="fps" default="30" />
<!-- frame_id for the camera -->
<arg name="frame_id" default="camera_link" />
<!-- By default, calibrations are stored to file://${ROS_HOME}/camera_info/${NAME}.yaml
To use your own fill this arg with the corresponding url, e.g.:
"file:///$(find your_camera_package)/config/your_camera.yaml" -->
<arg name="camera_info_url" default="" />
<!-- flip the image horizontally (mirror it) -->
<arg name="flip_horizontal" default="false" />
<!-- flip the image vertically -->
<arg name="flip_vertical" default="false" />
<!-- force width and height, 0 means no forcing -->
<arg name="width" default="0"/>
<arg name="height" default="0"/>
<!-- whether to show an image_view window subscribed to the generated stream -->
<arg name="visualize" default="true"/>
<!-- images will be published at /camera_name/image with the image transports plugins (e.g.: compressed) installed -->
<group ns="$(arg camera_name)">
<node pkg="video_stream_opencv" type="video_stream" name="$(arg camera_name)_stream" output="screen">
<remap from="camera" to="image_raw" />
<param name="camera_name" type="string" value="$(arg camera_name)" />
<param name="video_stream_provider" type="string" value="$(arg video_stream_provider)" />
<param name="fps" type="int" value="$(arg fps)" />
<param name="frame_id" type="string" value="$(arg frame_id)" />
<param name="camera_info_url" type="string" value="$(arg camera_info_url)" />
<param name="flip_horizontal" type="bool" value="$(arg flip_horizontal)" />
<param name="flip_vertical" type="bool" value="$(arg flip_vertical)" />
<param name="width" type="int" value="$(arg width)" />
<param name="height" type="int" value="$(arg height)" />
</node>
<node if="$(arg visualize)" name="$(arg camera_name)_image_view" pkg="image_view" type="image_view">
<remap from="image" to="image_raw" />
</node>
</group>
</launch>
- Next, create the launch file that does the tracking. Again, this is a modified launch file from the ar_track_alvar package. Create a file called track.launch in your launch folder and copy the following code into it. Note that you will need to set the marker size: this is the length in centimeters of one side of the black part of an AR tag.
<launch>
<arg name="marker_size" default="6.9" />
<arg name="max_new_marker_error" default="0.08" />
<arg name="max_track_error" default="0.2" />
<arg name="cam_image_topic" default="/camera/image_raw" />
<arg name="cam_info_topic" default="/camera/camera_info" />
<arg name="output_frame" default="/camera_link" />
<node name="ar_track_alvar" pkg="ar_track_alvar" type="individualMarkersNoKinect" respawn="false" output="screen">
<param name="marker_size" type="double" value="$(arg marker_size)" />
<param name="max_new_marker_error" type="double" value="$(arg max_new_marker_error)" />
<param name="max_track_error" type="double" value="$(arg max_track_error)" />
<param name="output_frame" type="string" value="$(arg output_frame)" />
<remap from="camera_image" to="$(arg cam_image_topic)" />
<remap from="camera_info" to="$(arg cam_info_topic)" />
</node>
</launch>
- You might prefer to launch only one file. This launch file simply calls the other two. Name this file main.launch.
<launch>
<include file="$(find ar_tag_rover)/launch/camera.launch" />
<include file="$(find ar_tag_rover)/launch/track.launch" />
</launch>
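Once everything is running, ar_track_alvar publishes detected marker poses on the /ar_pose_marker topic as ar_track_alvar_msgs/AlvarMarkers messages. A minimal sketch of a node that prints them (run it inside a sourced ROS environment; the ROS imports sit inside main() so the formatting helper can be read and tested without ROS installed):

```python
def format_marker(marker_id, x, y, z):
    """Render one marker pose as a log line (positions in meters)."""
    return "tag %d at (%.2f, %.2f, %.2f)" % (marker_id, x, y, z)

def callback(msg):
    # Each AlvarMarker carries an id and a geometry_msgs/PoseStamped pose
    for m in msg.markers:
        p = m.pose.pose.position
        print(format_marker(m.id, p.x, p.y, p.z))

def main():
    import rospy
    from ar_track_alvar_msgs.msg import AlvarMarkers
    rospy.init_node("ar_tag_listener")
    rospy.Subscriber("/ar_pose_marker", AlvarMarkers, callback)
    rospy.spin()
```

Call main() from a script in your package to watch tag poses stream in while the launch files below are running.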
From the root of your workspace (~/catkin_ws), run the following command to build the package that was created in the previous section:
catkin_make
Navigate to the folder where you created your custom package. You should see a folder named launch. Navigate into the launch folder and run the following commands in two separate terminals.
- Terminal 1:
Set the appropriate output frame so that the default /map frame maps to the camera_link camera output frame:
rosrun tf2_ros static_transform_publisher 0 0 0 0 0 0 1 /map /camera_link
- Terminal 2:
roslaunch main.launch
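As an aside, the seven numbers passed to static_transform_publisher in Terminal 1 are the translation x y z followed by a quaternion qx qy qz qw, so "0 0 0 0 0 0 1" means zero translation with an identity rotation. If you ever need a rotated transform, the quaternion for a pure yaw (rotation about the z axis) can be computed with plain math (sketch, function name is my own):

```python
import math

def yaw_to_quaternion(yaw_rad):
    """Return (qx, qy, qz, qw) for a rotation of yaw_rad radians about z."""
    return (0.0, 0.0, math.sin(yaw_rad / 2.0), math.cos(yaw_rad / 2.0))

print(yaw_to_quaternion(0.0))  # (0.0, 0.0, 0.0, 1.0): the identity used above
```

For arbitrary roll/pitch/yaw rotations you would normally use tf.transformations.quaternion_from_euler instead of hand-rolling the math.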
Open rviz with the following command:
rviz
Add TF to the data visualized on the left.
You're done! :)