Home

Steven Waslander edited this page May 11, 2016 · 35 revisions

Multi-Camera Parallel Tracking and Mapping (MCPTAM)

MCPTAM is a set of ROS nodes for running Real-time 3D Visual Simultaneous Localization and Mapping (SLAM) using Multi-Camera Clusters. It includes tools for calibrating both the intrinsic and extrinsic parameters of the individual cameras within the rigid camera rig.

MCPTAM is released under the GNU General Public License, version 3.


Download

Latest Stable Release

  • 0.1.3 (2015-03-12)

Source can be exported from the GitHub repository using:

$ git clone https://github.com/wavelab/mcptam.git
$ cd mcptam
$ git checkout REL-0.1.3

Latest Development Source Code

The latest source can be checked out of the GitHub repository at https://github.com/wavelab/mcptam.git. This is the master branch, and so should be fully functional at all times. Please report any issues you have when using it.

Related Publications

[1] A. Harmat, M. Trentini, and I. Sharf. Multi-camera tracking and mapping for unmanned aerial vehicles in unstructured environments. In Journal of Intelligent and Robotic Systems, August 2014. (pdf) (bib)

[2] Michael J. Tribou, Adam Harmat, David W.L. Wang, Inna Sharf, and Steven L. Waslander. Multi-camera parallel tracking and mapping with non-overlapping fields of view. In The International Journal of Robotics Research, 34(12):1480-1500, October 2015. doi:10.1177/0278364915571429. (pdf) (bib)


Videos

Videos of MCPTAM running on video streams from a Quadrotor UAV


Installation

To get started and test out MCPTAM on recorded (bagged) data, visit the Quick-Start Guide.

To install manually, see [Detailed Installation Guide](Detailed Installation Guide).

To use the intrinsic and extrinsic Taylor model camera calibration tools, see [Camera Calibration](Camera Calibration).

It is possible to run MCPTAM in a client-server configuration, where the client tracks the real-time pose of the cluster with respect to the map generated on the server. See the Client/Server Guide.


Overview

[Overview video thumbnail]

Click the image above to watch the video!

MCPTAM is an extension of the Parallel Tracking and Mapping [PTAM] software from Isis Innovation Limited. It is a collaborative project within the NSERC Canadian Field Robotics Network [NCFRN].

The MCPTAM system is a modified version of the PTAM software for use with multiple heterogeneous wide-angle field-of-view (FOV) central cameras rigidly connected together into a cluster. The software solves the 3D Simultaneous Localization and Mapping (SLAM) problem in real-time, using only the images from the set of component cameras as the cluster moves relative to the environment. The algorithm has been completely integrated into the Robot Operating System [ROS] development environment.

The camera model of PTAM is replaced by a Taylor omnidirectional camera model capable of representing devices with a FOV greater than 180 degrees. To accommodate the large radial distortions in these types of cameras, MCPTAM modifies the point-feature patch-warping and matching process to search along image-space epipolar arcs.
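The key property of the Taylor model is that a pixel's viewing direction is recovered from a polynomial in the radial image distance, and the polynomial may go negative, which places the ray behind the image plane and thus represents a FOV beyond 180 degrees. A minimal back-projection sketch follows; the coefficients and image centre are illustrative placeholders, not values from any real MCPTAM calibration:

```python
import math

# Illustrative Taylor-polynomial coefficients: z = a0 + a1*rho + a2*rho^2 + ...
# (a real calibration would supply these; the values here are made up)
A = [300.0, 0.0, -0.001, 0.0, -1e-7]

def back_project(u, v, cx=640.0, cy=480.0):
    """Map a pixel (u, v) to an (unnormalized) 3D viewing ray (x, y, z)."""
    x, y = u - cx, v - cy              # shift to the distortion centre
    rho = math.hypot(x, y)             # radial distance in the image plane
    z = sum(a * rho**i for i, a in enumerate(A))
    return (x, y, z)                   # z < 0: ray beyond 90 deg off-axis (FOV > 180 deg)

# A pixel far from the centre can map to a ray with negative z,
# something a pinhole model cannot represent.
ray = back_project(900.0, 480.0)
```

The angle off the optical axis is `atan2(rho, z)`, so wherever the polynomial crosses zero the model passes through 90 degrees, which is exactly what lets it describe fisheye lenses with hemispheric-plus coverage.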

The PTAM Bundle Adjustment (BA) back-end has been completely replaced by an optimization algorithm using the flexible g2o framework. The rigid cluster configuration for the component cameras is represented as a collection of Keyframes (KF) with known position and orientation with respect to a base Multi-Keyframe (MKF), which itself has a position and orientation in the world frame which must be estimated. In this parameterization, the KFs correspond to the individual camera coordinate frames and the MKFs correspond to the poses of the camera cluster through its motion trajectory within the unknown environment.
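The MKF/KF parameterization above amounts to optimizing only the cluster base pose while each camera keyframe pose is derived by composing it with a fixed, calibrated extrinsic transform. A small sketch of that composition, using plain 4x4 homogeneous matrices (all poses here are illustrative pure translations, not real calibration values):

```python
# Sketch of the MKF/KF parameterization: a keyframe (KF) pose is the
# estimated multi-keyframe (MKF) pose composed with a fixed extrinsic.

def matmul4(A, B):
    """Multiply two 4x4 homogeneous transforms given as nested lists."""
    return [[sum(A[i][k] * B[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def translation(x, y, z):
    """Pure-translation homogeneous transform (identity rotation)."""
    return [[1, 0, 0, x], [0, 1, 0, y], [0, 0, 1, z], [0, 0, 0, 1]]

# Estimated pose of the cluster base (MKF) in the world frame -- the
# quantity the bundle adjustment actually optimizes.
T_world_mkf = translation(2.0, 0.0, 1.0)
# Fixed pose of one camera (KF) relative to the cluster base, known
# from extrinsic calibration and never optimized per-frame.
T_mkf_kf = translation(0.1, -0.05, 0.0)
# The camera's world pose is derived by composition.
T_world_kf = matmul4(T_world_mkf, T_mkf_kf)
```

Because every KF is rigidly tied to its MKF, each cluster pose contributes only one six-degree-of-freedom variable to the optimization regardless of how many cameras the rig carries.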

Unlike the PTAM algorithm, the MCPTAM system is capable of recovering the correct global scale of the motion and structure by taking advantage of FOV overlap if it exists between the component cameras. When a point feature is observed in two cameras at the same point in time, the depth of that feature can be accurately triangulated using the known calibration between the cameras. Along these lines, the image-based map initialization phase of PTAM is replaced by constructing an initial map of point features observed and triangulated within the intersection of the overlapping FOV of the cluster cameras at the first time step.
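The scale-recovery idea can be illustrated with a simple midpoint triangulation: because the baseline between two cluster cameras is known in metres from the extrinsic calibration, intersecting the two viewing rays yields the feature's true metric position. The sketch below is a generic midpoint method, not MCPTAM's exact implementation, and the camera placement and feature location are made-up test values:

```python
# Midpoint triangulation of a feature seen by two cameras whose
# relative pose (here, a 1 m baseline) is known from calibration.

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def sub(a, b):
    return tuple(x - y for x, y in zip(a, b))

def triangulate_midpoint(c1, d1, c2, d2):
    """Midpoint of closest approach between rays c1 + s*d1 and c2 + t*d2."""
    w0 = sub(c1, c2)
    a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    d, e = dot(d1, w0), dot(d2, w0)
    denom = a * c - b * b              # near zero => rays (almost) parallel
    s = (b * e - c * d) / denom
    t = (a * e - b * d) / denom
    p1 = tuple(ci + s * di for ci, di in zip(c1, d1))
    p2 = tuple(ci + t * di for ci, di in zip(c2, d2))
    return tuple((x + y) / 2.0 for x, y in zip(p1, p2))

# Cameras at the origin and 1 m along x, both observing a feature
# at (0.5, 0, 2); the recovered point is at its true metric depth.
point = triangulate_midpoint((0.0, 0.0, 0.0), (0.5, 0.0, 2.0),
                             (1.0, 0.0, 0.0), (-0.5, 0.0, 2.0))
```

If the baseline were unknown, as in monocular PTAM, the same two rays would fix only the direction of the point, and the whole map could be rescaled arbitrarily; the calibrated baseline is what pins down the global scale.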

If there is not sufficient FOV overlap to fully initialize the system in this way, MCPTAM is still able to resolve the full SLAM solution. The known extrinsic calibration between the cameras, along with the relative motion of the camera cluster and target environment, allow for the full motion and structure recovery despite no overlap in the camera FOVs. This allows for the cameras to be arranged to cover as wide a collective FOV as possible and provide well-constrained localization estimates. It also maximizes the chances of observing stable trackable feature points in the environment to prevent the system from becoming lost.


MCPTAM Roadmap

A detailed system architecture description for the next major release of MCPTAM, along with a roadmap for future development, can be found here.

Authors

Adam Harmat from the Aerospace Mechatronics Lab [AML] at McGill University
LinkedIn Profile

Michael Tribou from the Waterloo Autonomous Vehicle Laboratory [WAVELab] at the University of Waterloo
LinkedIn Profile
WAVELab Profile
Multicamera Cluster SLAM Research
YouTube Channel


License

Multi-Camera Parallel Tracking and Mapping (MCPTAM) is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version.

This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.

You should have received a copy of the GNU General Public License along with this program. If not, see http://www.gnu.org/licenses/.