Toolkit for VSLAM and VISLAM evaluation.
For more information, please refer to our project website.
This project is released under the Apache 2.0 license.
Usage:
./accuracy <groundtruth> <input> <fix scale>
Arguments:
<groundtruth> Path to sequence folder, e.g. ~/VISLAM-Dataset/A0.
<input> SLAM camera trajectory file in TUM format (timestamp[s] px py pz qx qy qz qw).
<fix scale> Set to 1 for VISLAM, set to 0 for VSLAM.
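The sketch below (not part of this toolkit) shows how a camera trajectory in the TUM format expected by <input> can be written out; the file name trajectory.tum and the helper write_tum_trajectory are hypothetical names used only for illustration.

# Write one pose per line: timestamp[s] px py pz qx qy qz qw (quaternion in x y z w order).
def write_tum_trajectory(path, poses):
    # poses: iterable of (t, px, py, pz, qx, qy, qz, qw) tuples
    with open(path, "w") as f:
        for t, px, py, pz, qx, qy, qz, qw in poses:
            f.write(f"{t:.9f} {px} {py} {pz} {qx} {qy} {qz} {qw}\n")

# Example: a single identity pose at t = 0.
write_tum_trajectory("trajectory.tum", [(0.0, 0, 0, 0, 0, 0, 0, 1)])

The same trajectory file format is used by the initialization, robustness, and relocalization tools below.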
Usage:
./initialization <groundtruth> <input> <has inertial>
Arguments:
<groundtruth> Path to sequence folder, e.g. ~/VISLAM-Dataset/A0.
<input> SLAM camera trajectory file in TUM format (timestamp[s] px py pz qx qy qz qw).
<has inertial> Set to 1 for VISLAM, set to 0 for VSLAM.
Usage:
./robustness <groundtruth> <input> <fix scale>
Arguments:
<groundtruth> Path to sequence folder, e.g. ~/VISLAM-Dataset/A0.
<input> SLAM camera trajectory file in TUM format (timestamp[s] px py pz qx qy qz qw).
<fix scale> Set to 1 for VISLAM, set to 0 for VSLAM.
Usage:
./relocalization <groundtruth> <input> <has inertial>
Arguments:
<groundtruth> Path to sequence folder, e.g. ~/VISLAM-Dataset/A0.
<input> SLAM camera trajectory file in TUM format (timestamp[s] px py pz qx qy qz qw).
<has inertial> Set to 1 for VISLAM, set to 0 for VSLAM.
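As background on the <fix scale> / <has inertial> flags: a monocular VSLAM trajectory is recovered only up to an unknown scale, so it is typically aligned to ground truth with a similarity transform in which the scale is estimated, whereas a VISLAM trajectory is metric and can be aligned with the scale fixed to 1. The sketch below is a standard Umeyama-style alignment with an optional fixed scale; it is an illustration under these assumptions, not this toolkit's exact implementation.

import numpy as np

def align_trajectory(est, gt, fix_scale):
    # est, gt: (N, 3) arrays of corresponding camera positions.
    # Returns (s, R, t) minimizing sum_i || gt_i - (s * R @ est_i + t) ||^2.
    mu_e, mu_g = est.mean(axis=0), gt.mean(axis=0)
    E, G = est - mu_e, gt - mu_g
    C = G.T @ E / est.shape[0]                    # cross-covariance matrix
    U, D, Vt = np.linalg.svd(C)
    S = np.eye(3)
    if np.linalg.det(U) * np.linalg.det(Vt) < 0:  # avoid a reflection
        S[2, 2] = -1.0
    R = U @ S @ Vt
    var_e = (E ** 2).sum() / est.shape[0]
    s = 1.0 if fix_scale else np.trace(np.diag(D) @ S) / var_e
    t = mu_g - s * R @ mu_e
    return s, R, t

With the scale fixed (VISLAM), the metric scale reported by the system is evaluated as-is; with the scale estimated (VSLAM), the unobservable global scale does not penalize the result.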
If you use our codebase or dataset in your research, please cite the following publication:
@article{li2019survey,
  title   = {Survey and Evaluation of Monocular Visual-Inertial SLAM Algorithms for Augmented Reality},
  author  = {Jinyu Li and Bangbang Yang and Danpeng Chen and Nan Wang and Guofeng Zhang and Hujun Bao},
  journal = {Journal of Virtual Reality \& Intelligent Hardware},
  year    = {2019},
  url     = {http://www.vr-ih.com/vrih/html/EN/10.3724/SP.J.2096-5796.2018.0011/article.html},
  doi     = {10.3724/SP.J.2096-5796.2018.0011}
}