ATOM is a set of calibration tools for multi-sensor, multi-modal robotic systems, based on the optimization of atomic transformations, as provided by a ROS-based robot description. Moreover, ATOM provides several scripts that facilitate all the steps of a calibration procedure.
If this work is helpful to you, please cite our paper:
Miguel Oliveira, Afonso Castro, Tiago Madeira, Eurico Pedrosa, Paulo Dias, Vítor Santos, A ROS framework for the extrinsic calibration of intelligent vehicles: A multi-sensor, multi-modal approach, Robotics and Autonomous Systems, Volume 131, 2020, 103558, ISSN 0921-8890, https://doi.org/10.1016/j.robot.2020.103558.
Bibtex:
```
@article{OLIVEIRA2020a,
title = "A ROS framework for the extrinsic calibration of intelligent vehicles: A multi-sensor, multi-modal approach",
author = "Miguel Oliveira and Afonso Castro and Tiago Madeira and Eurico Pedrosa and Paulo Dias and Vítor Santos",
journal = "Robotics and Autonomous Systems",
volume = "131",
pages = "103558",
year = "2020",
issn = "0921-8890",
doi = "https://doi.org/10.1016/j.robot.2020.103558",
url = "http://www.sciencedirect.com/science/article/pii/S0921889020303985"}
```
- How to Use - Quick Start
- System calibration - Detailed Description
- Examples
- Contributors
- Maintainers
Table of contents generated with markdown-toc
Unlike most other calibration approaches, ATOM offers tools to address the complete calibration pipeline:
- Create a calibration package for your robotic system

```
rosrun atom_calibration create_calibration_pkg --name <your_robot_calibration>
```

- Configure your calibration package - edit the file <your_robot_calibration>/calibration/config.yml with your system information.

```
rosrun <your_robot_calibration> configure
```

- Set initial estimate - deploy interactive tools based on rviz that allow the user to set the pose of the sensors to be calibrated, while receiving visual feedback;

```
roslaunch <your_robot_calibration> set_initial_estimate.launch
```

- Collect data - extract snapshots of data (a.k.a. collections)

```
roslaunch <your_robot_calibration> collect_data.launch output_folder:=~/datasets/<my_dataset>
```

- Calibrate sensors - finally, run an optimization that will calibrate your sensors:

```
roslaunch <your_robot_calibration> calibrate.launch dataset_file:=~/datasets/<my_dataset>/data_collected.json
```

To calibrate your robot you must define your robotic system (e.g. <your_robot>). You should also have a system description in the form of urdf or xacro file(s). This is normally stored in a ROS package named <your_robot>_description.
Finally, ATOM requires a bagfile with a recording of the data from the sensors you wish to calibrate. Transformations in the bagfile (i.e. topics /tf and /tf_static) will be ignored, so that they do not collide with the ones being published by the robot_state_publisher. Thus, if your robotic system contains moving parts, the bagfile should also record the sensor_msgs/JointState message.
It is also possible to record compressed images, since ATOM can decompress them while playing back the bagfile. Here is a launch file example which records compressed images.
We often use two environment variables to allow easy cross-machine access to bagfiles and datasets. If you want to use these (it is optional), you can add these lines to your .bashrc:

```
export ROS_BAGS="$HOME/bagfiles"
export ATOM_DATASETS="$HOME/datasets"
```

Then you can refer to these environment variables when providing paths to ATOM scripts, e.g.:

```
roslaunch <your_robot_calibration> calibrate.launch dataset_file:=$ATOM_DATASETS/<my_dataset>/data_collected.json
```

and you can also refer to them inside the calibration configuration file.
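On the command line, the shell expands these variables before roslaunch runs. When such `$`-style references appear inside a file, a script has to resolve them itself. As a rough sketch (not necessarily the mechanism ATOM uses), Python's `os.path.expandvars` does exactly this kind of substitution; the dataset name below is hypothetical:

```python
import os

# Simulate what the .bashrc lines above would set up.
os.environ["ATOM_DATASETS"] = os.path.expanduser("~/datasets")

# A path as it might appear inside a configuration file.
path = "$ATOM_DATASETS/my_dataset/data_collected.json"

# expandvars substitutes the environment variable reference.
resolved = os.path.expandvars(path)
print(resolved)
```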
To start, you should create a calibration ROS package specific to your robot. ATOM provides a script for this:
```
rosrun atom_calibration create_calibration_pkg --name <your_robot_calibration>
```

This will create the ROS package <your_robot_calibration> in the current folder, but you can also specify the folder, e.g.:

```
rosrun atom_calibration create_calibration_pkg --name ~/my/path/<your_robot_calibration>
```

Once your calibration package is created you will have to configure the calibration procedure by editing the <your_robot_calibration>/calibration/config.yml file with your system information. Here is an example of a config.yml file.
After filling the config.yml file, you can run the package configuration:
```
rosrun <your_robot_calibration> configure
```

This will create a set of files for launching the system, configuring rviz, etc.
Iterative optimization methods are often sensitive to the initial parameter configuration. Here, the optimization parameters represent the poses of each sensor. ATOM provides an interactive framework based on rviz which allows the user to set the pose of the sensors while having immediate visual feedback.
To set an initial estimate run:
```
roslaunch <your_robot_calibration> set_initial_estimate.launch
```

Here are a couple of examples:
*(Animations: setting the initial estimate on Atlascar2, AgrobV2, and the UR10e eye-in-hand system.)*
To run a system calibration, one requires sensor data collected at different time instants. We refer to these as data collections. To collect data, the user should launch:
```
roslaunch <your_robot_calibration> collect_data.launch output_folder:=<your_dataset_folder>
```

Depending on the size and number of topics in the bag file, it may be necessary (it often is) to reduce the playback rate of the bag file:

```
roslaunch <your_robot_calibration> collect_data.launch output_folder:=<your_dataset_folder> bag_rate:=<playback_rate>
```

Here are some examples of the system collecting data:
*(Animations: collecting data on Atlascar2, AgrobV2, and the UR10e eye-to-base system.)*
A dataset is a folder which contains a set of collections. There, a data_collected.json file stores all the information required for the calibration. The folder also contains the images and point clouds associated with each collection.
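Since the dataset is a plain JSON file, it is easy to inspect programmatically. As a minimal sketch, assuming only that data_collected.json holds a top-level "collections" dictionary keyed by collection name (the full schema is defined by ATOM and contains much more):

```python
import json

def list_collections(path):
    """Return the sorted collection names found in a data_collected.json file.

    Assumes a top-level "collections" dictionary keyed by collection name;
    the actual ATOM schema stores additional data per collection.
    """
    with open(path) as f:
        dataset = json.load(f)
    return sorted(dataset.get("collections", {}).keys())
```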
Finally, a system calibration is called through:
```
roslaunch <your_robot_calibration> calibrate.launch dataset_file:=~/datasets/<my_dataset>/data_collected.json
```

You can use a couple of launch file arguments to configure the calibration procedure, namely:

- single_pattern [false] -
- use_incomplete_collections [false] - Remove any collection which does not have a detection for all sensors.
- ssf [false] - Sensor Selection Function: a string to be evaluated into a lambda function that receives a sensor name as input and returns True or False to indicate if the sensor should be loaded (and used in the optimization). An example:

```
roslaunch <your_robot_calibration> calibrate.launch dataset_file:=$ATOM_DATASETS/<my_dataset>/data_collected.json ssf:='lambda name: name in ["camera1", "lidar2"]'
```

- csf [false] - Collection Selection Function: a string to be evaluated into a lambda function that receives a collection name as input and returns True or False to indicate if that collection should be loaded (and used in the optimization). An example:

```
roslaunch <your_robot_calibration> calibrate.launch dataset_file:=$ATOM_DATASETS/<my_dataset>/data_collected.json csf:='lambda name: int(name) < 7'
```
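To see how such selection-function strings behave, here is a small sketch in plain Python, outside ROS. It evaluates the string into a lambda and applies it as a filter, mirroring the description above; ATOM's internal implementation may differ in details:

```python
def filter_names(names, selection_function_string):
    """Evaluate a selection-function string and keep only matching names.

    Sketch of the mechanism described above (ATOM's actual code may differ):
    the string must have the form 'lambda name: f(name)'.
    """
    f = eval(selection_function_string)
    return [name for name in names if f(name)]

# Collection selection: keep collections with a numeric name below 7.
collections = ["0", "1", "5", "7", "9"]
print(filter_names(collections, 'lambda name: int(name) < 7'))  # ['0', '1', '5']

# Sensor selection: keep only the two listed sensors.
sensors = ["camera1", "lidar2", "camera3"]
print(filter_names(sensors, 'lambda name: name in ["camera1", "lidar2"]'))
```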
Alternatively, when debugging the calibrate script, it is better not to execute it together with all the other scripts started by the launch file. You can run the launch file without the calibrate script:

```
roslaunch <your_robot_calibration> calibrate.launch dataset_file:=~/datasets/<my_dataset>/data_collected.json run_calibration:=false
```

and then run the script in standalone mode:

```
rosrun atom_calibration calibrate -json ~/datasets/<my_dataset>/data_collected.json
```

There are several additional command line arguments to use with the calibrate script; here's an extensive list:
```
usage: calibrate [-h] [-sv SKIP_VERTICES] [-z Z_INCONSISTENCY_THRESHOLD]
                 [-vpv] [-vo] -json JSON_FILE [-v] [-rv] [-si] [-oi] [-pof]
                 [-sr SAMPLE_RESIDUALS] [-ss SAMPLE_SEED] [-od] [-fec] [-uic]
                 [-rpd] [-ssf SENSOR_SELECTION_FUNCTION]
                 [-csf COLLECTION_SELECTION_FUNCTION]

optional arguments:
  -h, --help            show this help message and exit
  -json JSON_FILE, --json_file JSON_FILE
                        Json file containing input dataset.
  -vo, --view_optimization
                        Displays generic total error and residuals graphs.
  -v, --verbose         Be verbose
  -rv, --ros_visualization
                        Publish ros visualization markers.
  -si, --show_images    Shows images for each camera
  -oi, --optimize_intrinsics
                        Adds camera intrinsics and distortion to the
                        optimization
  -pof, --profile_objective_function
                        Runs and prints a profile of the objective function,
                        then exits.
  -sr SAMPLE_RESIDUALS, --sample_residuals SAMPLE_RESIDUALS
                        Samples residuals
  -ss SAMPLE_SEED, --sample_seed SAMPLE_SEED
                        Sampling seed
  -uic, --use_incomplete_collections
                        Remove any collection which does not have a detection
                        for all sensors.
  -rpd, --remove_partial_detections
                        Remove detected labels which are only partial. Used
                        for the Charuco.
  -ssf SENSOR_SELECTION_FUNCTION, --sensor_selection_function SENSOR_SELECTION_FUNCTION
                        A string to be evaluated into a lambda function that
                        receives a sensor name as input and returns True or
                        False to indicate if the sensor should be loaded (and
                        used in the optimization). The syntax is lambda name:
                        f(x), where f(x) is the function in python language.
                        Example: lambda name: name in ["left_laser",
                        "frontal_camera"], to load only sensors left_laser
                        and frontal_camera
  -csf COLLECTION_SELECTION_FUNCTION, --collection_selection_function COLLECTION_SELECTION_FUNCTION
                        A string to be evaluated into a lambda function that
                        receives a collection name as input and returns True
                        or False to indicate if the collection should be
                        loaded (and used in the optimization). The syntax is
                        lambda name: f(x), where f(x) is the function in
                        python language. Example: lambda name: int(name) > 5,
                        to load only collections 6, 7, and onward.
```

It is also possible to set some of these through the launch file. Check the launch file to see how.
So far, we have used ATOM to successfully calibrate several robotic platforms:
Atlascar2 is an intelligent vehicle containing several cameras and 2D Lidars.
The UR10e manipulator setups include several variants of the hand-eye calibration problem.
Agrob is a mobile robot with a stereo camera and a 3D Lidar designed for agriculture robotics.
- Miguel Riem Oliveira - University of Aveiro
- Afonso Castro - University of Aveiro
- Eurico Pedrosa - University of Aveiro
- Tiago Madeira - University of Aveiro
- André Aguiar - INESC TEC
- Miguel Riem Oliveira - University of Aveiro
- Eurico Pedrosa - University of Aveiro