Visual-Inertial-fusion-based Monocular dEnse mAppiNg
This repository corresponds to our paper at ICRA 2017:
Real-time Monocular Dense Mapping on Aerial Robots using Visual-Inertial Fusion
How to view the 3D model
The following rosbags may help you take a close look at the dense 3D model produced by our system.
- Topic of the meshed dense map: /Chisel/full_mesh
- Topic of the estimated path: /self_calibration_estimator/path
Note: Unfortunately, the model is too dense to be recorded in real time on our onboard computers. With limited I/O bandwidth, ROS randomly drops messages, so the message frequency in the bags is very low.
We emphasize that the map received by the controller is updated at 10 Hz. The frequency of map generation matters greatly for autonomous systems.
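To inspect the bags visually, a minimal rviz display configuration along the following lines should work. This is an illustrative sketch, not a config shipped with the repo: it assumes the mesh is published as a visualization_msgs/Marker and that the fixed frame is named world; adapt both to your setup and rviz version.

```yaml
# Illustrative rviz "Displays" fragment (field names vary slightly across rviz versions).
Visualization Manager:
  Displays:
    - Class: rviz/Marker            # meshed dense map
      Marker Topic: /Chisel/full_mesh
      Enabled: true
    - Class: rviz/Path              # estimated trajectory
      Topic: /self_calibration_estimator/path
      Enabled: true
  Global Options:
    Fixed Frame: world              # assumption: adjust to the frame used in the bag
```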
How to compile
- Choose correct
- Install the following dependencies:
- Ceres (http://ceres-solver.org/)
- Modified OpenChisel (already included in this repo), which requires PCL compiled with C++11. Please follow OpenChisel's instructions.
- A subset of camodocal (already included in this repo under the name camera_model) to support wide-angle lenses.
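Since OpenChisel requires PCL built with C++11, packages that consume it should compile with the same standard. A typical CMakeLists.txt snippet is sketched below; this is an assumption about your build setup, not a line taken from this repo.

```cmake
# Enable C++11 for targets that link against PCL/OpenChisel (illustrative).
# Older catkin projects often use add_definitions(-std=c++11) instead.
set(CMAKE_CXX_STANDARD 11)
set(CMAKE_CXX_STANDARD_REQUIRED ON)
```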
How to run it with our data
- Download sample.bag from http://uav.ust.hk/storage/sample.bag
- Launch the system:
    roslaunch stereo_mapper sample_all.launch
- Play the bag:
    rosbag play path_to_the_bag/sample.bag
- Visualize in rviz with your config file
How to run it with your data
- Calibrate your camera using camera_model to obtain a calibration file
- Modify the launch file to fit your topic names and the path to your calibration file
- Follow the above steps
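For reference, camodocal writes calibration files in OpenCV YAML format. The sketch below shows the rough shape for the MEI (omnidirectional) model with placeholder values; the actual fields depend on which camera model you calibrate with, so treat every number here as hypothetical.

```yaml
%YAML:1.0
model_type: MEI            # camera model chosen during calibration
camera_name: camera
image_width: 752           # placeholder resolution
image_height: 480
mirror_parameters:
   xi: 2.0                 # placeholder values below; use your own calibration output
distortion_parameters:
   k1: -0.1
   k2: 0.01
   p1: 0.0
   p2: 0.0
projection_parameters:
   gamma1: 600.0
   gamma2: 600.0
   u0: 376.0
   v0: 240.0
```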
VI-MEAN is licensed under the GNU General Public License Version 3 (GPLv3), see http://www.gnu.org/licenses/gpl.html.
For commercial purposes, a more efficient version under different licensing terms is under development.