Visual-odometry-tutorial

You can watch the video lecture on YouTube.

The original material is available on Bitbucket.

I only fixed some directory paths in the code and provided the dataset and requirements.txt!
I tested this repository on Ubuntu 18.04.

Contents

How to set up the environment

You should have Anaconda installed.

1. Make a virtual environment using Anaconda

First, just clone this repository.

git clone https://github.com/Taeyoung96/Visual-odometry-tutorial.git  
cd Visual-odometry-tutorial

Make a virtual environment with Python 3.8.

conda create -n vo_tutorial python=3.8
conda activate vo_tutorial

If you follow the command lines above, you will be inside the virtual environment.
Then install the required Python packages.

pip install -r requirements.txt
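
To quickly check that the installation worked, you can try importing the main packages from inside the environment. This is just a sanity check and assumes opencv-python and numpy are among the packages listed in requirements.txt.

python -c "import cv2, numpy; print(cv2.__version__, numpy.__version__)"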

2. Prepare the dataset

This tutorial uses two well-known datasets: the KITTI odometry dataset and the TUM RGB-D dataset.

These datasets are very large,
so I have prepared a minimal dataset for this tutorial on Google Drive.

These datasets have their own licenses, so don't use them for commercial purposes!

If you want to match the default directory path, unzip them in the following locations.
In the data/ folder there are two folders named kitti-odom/ and tum/.
A .txt file exists in each folder, and you can unzip each dataset into the same path as its .txt file.

When you are finished, the dataset paths are as follows (a rough layout of the data/ folder is sketched after this list).

  • KITTI dataset - 09 sequence : ~[YOUR_DIR]/Visual-odometry-tutorial/data/kitti-odom.
  • TUM dataset - freiburg2_desk sequence : ~[YOUR_DIR]/Visual-odometry-tutorial/data/tum.
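
A rough sketch of the expected layout is shown below; the exact file names inside each folder depend on the prepared archives.

data/
├── kitti-odom/   (minimal KITTI odometry sequence 09 data)
└── tum/          (minimal TUM freiburg2_desk sequence data)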

How to run this repository

There are three Python files that you can run.

In the cv_basics/ folder there are epipolar.py and feature_matching.py.

  • feature_matching.py : Visualizes the result of feature matching between two consecutive images.
    Keypoints are extracted with ORB and matched with a brute-force matching algorithm (a minimal sketch follows this list).

  • epipolar.py : Visualizes the epipolar lines computed from two consecutive images.
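
The sketch below shows the ORB + brute-force matching pipeline that feature_matching.py visualizes. The file names are placeholders, and this is not the exact code in the repository, only a minimal example of the same technique.

import cv2

# Two consecutive grayscale frames (placeholder paths; use any two consecutive
# images from the prepared dataset).
img1 = cv2.imread("frame_0.png", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("frame_1.png", cv2.IMREAD_GRAYSCALE)

# Detect ORB keypoints and compute binary descriptors.
orb = cv2.ORB_create(nfeatures=1000)
kp1, des1 = orb.detectAndCompute(img1, None)
kp2, des2 = orb.detectAndCompute(img2, None)

# Brute-force matching with Hamming distance (suitable for ORB's binary descriptors).
bf = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(bf.match(des1, des2), key=lambda m: m.distance)

# Draw the 50 best matches between the two frames.
vis = cv2.drawMatches(img1, kp1, img2, kp2, matches[:50], None)
cv2.imshow("ORB + brute-force matches", vis)
cv2.waitKey(0)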

When you want to execute feature_matching.py from the default directory (~/Visual-odometry-tutorial),
follow the commands below.

cd cv_basics  
python feature_matching.py

feature_matching.py Result

When you want to execute epipolar.py from the default directory (~/Visual-odometry-tutorial),
follow the commands below.

cd cv_basics  
python epipolar.py

epipolar.py Result
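
As a rough idea of what epipolar.py illustrates, the sketch below estimates the fundamental matrix from ORB matches and draws the corresponding epipolar lines. Again, the file names are placeholders and this is not the repository's exact code.

import cv2
import numpy as np

# Two consecutive grayscale frames (placeholder paths).
img1 = cv2.imread("frame_0.png", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("frame_1.png", cv2.IMREAD_GRAYSCALE)

# ORB keypoints and brute-force matches, as in the matching sketch above.
orb = cv2.ORB_create(nfeatures=1000)
kp1, des1 = orb.detectAndCompute(img1, None)
kp2, des2 = orb.detectAndCompute(img2, None)
matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(des1, des2)
pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])

# Fundamental matrix with RANSAC, then epipolar lines in image 1 for the points in image 2.
F, mask = cv2.findFundamentalMat(pts1, pts2, cv2.FM_RANSAC)
lines1 = cv2.computeCorrespondEpilines(pts2.reshape(-1, 1, 2), 2, F).reshape(-1, 3)

# Each line (a, b, c) satisfies a*x + b*y + c = 0; draw it across the image width.
vis = cv2.cvtColor(img1, cv2.COLOR_GRAY2BGR)
h, w = img1.shape
for a, b, c in lines1[:30]:
    if abs(b) < 1e-9:
        continue
    cv2.line(vis, (0, int(-c / b)), (w, int(-(c + a * w) / b)), (0, 255, 0), 1)
cv2.imshow("epipolar lines", vis)
cv2.waitKey(0)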

In the visual-odometry/ folder there is vo.py.

When you want to execute vo.py from the default directory (~/Visual-odometry-tutorial),
follow the commands below.

When the run finishes, you will get Trajectory.png and Trajectory.txt!

cd visual-odometry 

There are three arguments you can pass when running the code below (an example combining all of them follows this list).

  • '--data_dir_root' : Sets the dataset path.
    (If you followed 2. Prepare the dataset, you can use the default path. default='../data/')
  • '--dataset_type' : Decides which dataset to use. (default='TUM')
  • '--len_trajMap' : Specifies the size of the trajectory visualization window. (default=700)
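
For example, passing all three arguments explicitly (the values shown simply repeat the defaults, apart from the dataset type):

python vo.py --data_dir_root='../data/' --dataset_type='KITTI' --len_trajMap=700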

If you want to run the code with the KITTI dataset,

python vo.py --dataset_type='KITTI'

vo.py Result (KITTI)

Or if you want to run the code with the TUM dataset,

python vo.py --dataset_type='TUM'

vo.py Result (TUM)

The results are not accurate. This is because no optimization (such as bundle adjustment or loop closure) is performed; the trajectory is only estimated frame by frame, so drift accumulates.
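
For reference, a typical frame-to-frame monocular VO step looks like the sketch below. This is not the exact implementation in vo.py, and relative_pose is a hypothetical helper name: the essential matrix is estimated from matched pixel coordinates, and the relative rotation R and unit-scale translation t are recovered with OpenCV.

import cv2

def relative_pose(pts1, pts2, K):
    """Estimate relative camera motion between two consecutive frames.

    pts1, pts2: Nx2 matched pixel coordinates; K: 3x3 camera intrinsic matrix.
    """
    # Essential matrix with RANSAC to reject outlier correspondences.
    E, mask = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC,
                                   prob=0.999, threshold=1.0)
    # Decompose E into a rotation R and a unit-length translation t.
    _, R, t, mask = cv2.recoverPose(E, pts1, pts2, K, mask=mask)
    return R, t

# One common way to accumulate these relative poses over the sequence
# (the monocular scale factor is arbitrary):
#   t_total = t_total + scale * (R_total @ t)
#   R_total = R_total @ R
# Chaining the poses like this, with no optimization, is what produces the
# drifting trajectory written to Trajectory.png.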

Contact

If you have any questions, feel free to send me an email.

About

Tutorial code for "AirLab Summer School Session 2.1"
