CamVox: A Low-cost and Accurate Lidar-assisted Visual SLAM System

Point cloud density of the Livox Horizon lidar as a function of integration time; the scan pattern does not repeat itself.


We propose CamVox by adapting Livox lidars into visual SLAM (ORB-SLAM2) and exploiting the lidars' unique features. Based on the non-repeating scanning nature of Livox lidars, we propose an automatic lidar-camera calibration method that works in uncontrolled scenes.


The long depth-detection range also enables more efficient mapping. CamVox is compared with visual SLAM (VINS-Mono) and lidar SLAM (livox_horizon_loam) on the same dataset to demonstrate its performance.

Developers: Yuewen Zhu, Chunran Zheng, Chongjian Yuan, Xu Huang.

Videos: our related videos are available on [YouTube Video] [bilibili Video] [OwnCloud].

Paper: our paper has been posted on arXiv, and the final ICRA 2021 accepted version is available as CamVox.pdf.

Follow-up work on automatic lidar-camera calibration: "Pixel-level Extrinsic Self Calibration of High Resolution LiDAR and Camera in Targetless Environments".

1. Prerequisites

1.1 Ubuntu and ROS

Ubuntu 64-bit 16.04 or 18.04. ROS Kinetic or Melodic. Follow ROS Installation.
(Recommended: Ubuntu 16.04 LTS with kernel 4.15.0-140-generic)
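For reference, a minimal ROS installation follows the standard steps from the ROS wiki; the sketch below assumes ROS Melodic on Ubuntu 18.04 (substitute kinetic on 16.04):

```shell
# Add the ROS package source and signing key (standard ROS wiki steps)
sudo sh -c 'echo "deb http://packages.ros.org/ros/ubuntu $(lsb_release -sc) main" > /etc/apt/sources.list.d/ros-latest.list'
sudo apt-key adv --keyserver 'hkp://keyserver.ubuntu.com:80' --recv-key C1CF6E31E6BADE8868B172B4F42ED6FBAB17C654
sudo apt-get update
# The full desktop install includes rviz and the common perception stacks
sudo apt-get install ros-melodic-desktop-full
# Make ROS available in every new shell
echo "source /opt/ros/melodic/setup.bash" >> ~/.bashrc
```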

1.2 OpenCV

We use OpenCV to manipulate images and features. Follow OpenCV Installation. Required at least OpenCV 2.4.3. Tested with OpenCV 2.4.11 and OpenCV 3.4.1.

   (1)  *** Install dependencies: ***
	sudo apt-get install build-essential libgtk2.0-dev libavcodec-dev libavformat-dev libswscale-dev libjasper-dev
   (2)  *** Install opencv-3.4.1: ***
   	cd opencv-3.4.1
	mkdir build && cd build
	cmake -D CMAKE_BUILD_TYPE=Release -D CMAKE_INSTALL_PREFIX=/usr/local ..
	make -j
	sudo make install
   (3)  *** Add opencv libraries to path: ***
	sudo gedit /etc/
	*** add at the file's end, then run: ***
	sudo ldconfig
   (4)  *** bash configuration: ***
	sudo gedit /etc/bash.bashrc
	*** add at the end, then source the configuration: ***
	source /etc/bash.bashrc
	sudo updatedb

(Recommended version opencv-3.4.1.tar.gz)

1.3 PCL-1.7

We use PCL to process point cloud features and to calibrate the lidar-camera extrinsic parameters. The recommended PCL installation is as follows.

   (1)  *** update your host: ***
   	sudo apt-get update
   (2)  *** install VTK ***
        cd VTK-8.2.0
        mkdir build && cd build
        cmake ..
        make -j
        sudo make install
   (3)  *** install pcl: ***
        sudo apt-get install libpcl-dev pcl-tools
        sudo apt-get install freeglut3-dev

(Recommended VTK version VTK-8.2.0.tar.gz)

1.4 Pangolin

We use Pangolin for visualization and user interface. Follow Pangolin Installation.

   (1)  *** install dependencies: ***
   	sudo apt-get install libglew-dev libpython2.7-dev libboost-dev libboost-thread-dev libboost-filesystem-dev -y
   (2)  *** install Pangolin: ***
        cd Pangolin
        mkdir build && cd build
        cmake ..
        make -j
        sudo make install


1.5 Ceres Solver

Follow Ceres Installation.

   (1)  *** install dependencies: ***
        sudo apt-get install liblapack-dev 
        sudo apt-get install libsuitesparse-dev 
        sudo apt-get install libcxsparse3.1.4 
        sudo apt-get install libgflags-dev 
        sudo apt-get install libgoogle-glog-dev libgtest-dev
   (2)  *** install ceres: ***
        cd ceres-solver-1.14.0
        mkdir build && cd build
        cmake ..
        make -j 
        sudo make install

(Recommended version Ceres-solver-1.14.0.tar.gz)

1.6 Eigen

Follow Eigen Installation. Required at least 3.1.0.

   (1)  *** install eigen: ***
        cd eigen
        mkdir build && cd build
        cmake ..
        make -j 
        sudo make install

(Recommended version Eigen-3.2.10.tar.gz)

1.7 Livox-SDK

Follow Livox-SDK Installation.
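The Livox-SDK build is a standard CMake flow; a minimal sketch, assuming the repository from Livox's GitHub:

```shell
# Fetch and build the Livox SDK (repository per Livox-SDK on GitHub)
git clone https://github.com/Livox-SDK/Livox-SDK.git
cd Livox-SDK/build
cmake .. && make
sudo make install
```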

1.8 MVS camera SDK and Ros driver

Install the HIKROBOT camera SDK as follows.

    tar zxvf MVS-2.0.0_x86_64_20191126.tar.gz
    cd ./MVS-2.0.0_x86_64_20191126
    chmod +x
    sudo ./

In addition, we supply a software-trigger ROS driver compatible with Hikvision cameras; you can run it directly with the camera connected over USB.

    roslaunch mvs_camera mvs_camera.launch

(Recommended version MVS-2.0.0.tar.gz)
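Once the driver is running, you can confirm that images are being published; the topic name below is an assumption, so check `rostopic list` for the actual one:

```shell
rostopic list | grep -i camera     # locate the image topic published by the driver
rostopic hz /mvs_camera/image      # assumed topic name; expect ~10 Hz when hard-triggered
```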

2. Build CamVox

Clone the repository and catkin_make:

    cd ~/catkin_ws/src
    git clone
    cd CamVox/isee-camvox && chmod a+x && chmod a+x
    cd ~/catkin_ws && catkin_make
    source ~/catkin_ws/devel/setup.bash
    cd ~/catkin_ws/src/CamVox/isee-camvox/Vocabulary
    tar zxvf ORBvoc.txt.tar.gz

3. Run with Hardware

3.1 Hardware

Platform components:

    Livox Horizon lidar
    MV-CE060-10UC camera
    Inertial Sense uINS RTK
    Manifold2C onboard computer
    Scout-mini robot chassis

3.2 Hard Synchronization

Hard synchronization of all sensors is performed with a 10 Hz trigger signal. The camera outputs one frame at each trigger (10 Hz). The lidar keeps a clock (synced with GPS-RTK) and continuously outputs the scanned points with accurate timestamps. Meanwhile, the IMU outputs at 200 Hz, synced with the trigger. The hardware synchronization diagram is as follows.
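A quick way to sanity-check the synchronization described above is to measure the topic rates; the topic names below are assumptions and depend on your driver configuration:

```shell
rostopic hz /mvs_camera/image             # camera: expect ~10 Hz, one frame per trigger
rostopic hz /livox/imu                    # IMU: expect ~200 Hz, synced with the trigger
rostopic echo -n 1 /livox/lidar --noarr   # inspect point timestamps against GPS-RTK time
```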

3.3 Running

Connect your PC to the Livox Horizon lidar by following the Livox-ros-driver installation.

    cd ~/catkin_ws/src/CamVox/
    chmod +x
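After the scripts are made executable, a typical way to bring up the lidar is via the Livox ROS driver; the launch-file name follows livox_ros_driver conventions and may differ in your installation:

```shell
# Start the lidar driver so it publishes point clouds and IMU data to ROS
roslaunch livox_ros_driver livox_lidar_msg.launch
```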

4. Run with Rosbag Example

4.1 SUSTech Expert Flat Outdoor large scale scenes (With Loop Closure)


We open-sourced our dataset, recorded on the SUSTech campus with loop closure. You can download the bag file from Google Drive (CamVox.bag) or Zenodo (CamVox.bag). (Updated)

For comparison, data in the formats of two other mainstream frameworks are also provided: VINS-mono.bag | livox_loam_horizon.bag | Groundtruth.bag (Updated)

The trajectories from CamVox, the two mainstream SLAM frameworks, and the ground truth are compared on our SUSTech dataset, as shown in the figure above.

4.2 Rosbag Example with static scenes (Automatic Calibration trigger)


We provide a rosbag file with static scenes to test the automatic calibration thread: calibration.bag. (Updated) When the car detects more than 10 consecutive frames of still images (about 1 second), the automatic calibration thread starts working. The thread is interrupted and SLAM mode resumes if the car starts to move before calibration finishes. The effect of automatic calibration is shown as follows.


An example of RGB camera and point cloud overlay after calibration: (a) not calibrated, (b) automatically calibrated, (c) the best manual calibration. The automatic calibration algorithm is verified in various scenes: (d) an outdoor road with natural trees and grass, (e) outdoor artificial structures, (f) indoor underexposed structures. (g-i) show the cost-value evolution during optimization for the corresponding scenes on the left.


4.3 Running Rosbag Examples

4.3.1 Running without saving trajectory and colored pcd files

    cd CamVox/isee-camvox
    rosrun online camvox Vocabulary/ORBvoc.bin camvox/online/Livox.yaml 
    rosbag play CamVox.bag (or calibration.bag)

4.3.2 Saving trajectory and colored pcd files after a specified number of frames (e.g. 1000)

    cd CamVox/isee-camvox
    rosrun online camvox Vocabulary/ORBvoc.bin camvox/online/Livox.yaml 1000
    rosbag play CamVox.bag (or calibration.bag)
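After the run finishes, the saved colored point cloud can be inspected with pcl_viewer from pcl-tools (installed in section 1.3); the filename below is an assumption, so check the working directory for the actual output file:

```shell
pcl_viewer output.pcd   # assumed filename of the saved colored map
```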

5. Acknowledgements

The authors are grateful for the pioneering work from ORB_SLAM2, ORB-SLAM2: an Open-Source SLAM System for Monocular, Stereo and RGB-D Cameras. The authors also sincerely thank colleagues at Livox Technology for help in data acquisition and discussion. This work is from the ISEE Research Group at SUSTech.

6. License

The source code is released under GPLv2.0 license.

If you use CamVox in an academic work, please cite:

(**Updated**)

  @misc{zhu2020camvox,
    title={CamVox: A Low-cost and Accurate Lidar-assisted Visual SLAM System},
    author={Yuewen Zhu and Chunran Zheng and Chongjian Yuan and Xu Huang and Xiaoping Hong},
    year={2020},
  }


[ICRA2021] A low-cost SLAM system based on camera and Livox lidar.






