A visual odometry pipeline for the VAMR course at ETHZ/UZH.

Monocular Visual Odometry Project

The aim of this mini-project is to implement a monocular visual odometry (VO) pipeline that can estimate the trajectory of a moving vehicle. Over the course of the project, several fundamental computer vision algorithms are implemented, in particular those enabling:

  • initialization of 3D landmarks,
  • keypoint tracking between two frames,
  • pose estimation from established 2D-3D correspondences, and
  • triangulation of new landmarks.

These essential components of VO were presented in the lectures of the ETHZ/UZH course "Vision Algorithms for Mobile Robotics", and some of them were implemented as building blocks during the exercise sessions. The pipeline builds upon those building blocks, and the result is a fully functional monocular VO algorithm.
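The repository itself is written in MATLAB; to illustrate the triangulation step described above, here is a minimal Python/NumPy sketch of linear (DLT) two-view triangulation, the standard approach for initializing new landmarks from a pair of matched keypoints. The function name and interface are illustrative assumptions, not the repo's actual code:

```python
import numpy as np

def triangulate_point(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one landmark from two views.

    P1, P2 : 3x4 camera projection matrices K[R|t].
    x1, x2 : (u, v) pixel coordinates of the same keypoint in each view.
    Returns the 3D point in world coordinates.

    Note: a sketch only -- the actual MATLAB pipeline differs in detail.
    """
    # Each observation contributes two linear constraints on the
    # homogeneous point X: u * (P[2] @ X) = P[0] @ X, and likewise for v.
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    # The solution is the right singular vector of A with the
    # smallest singular value (last row of Vt).
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]  # dehomogenize
```

In practice this is applied only to matches that survive outlier rejection, and triangulated points behind either camera are discarded.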

Requirements

The pipeline is implemented in MATLAB (see the demo instructions below).

Note: The screencasts used to test the pipeline on the different datasets were produced on the following machine:

  • Model: MacBook Pro (15-inch, late 2018)
  • Processor: 2.2 GHz Intel Core i7
  • Memory: 16 GB 2400 MHz DDR4
  • Graphics: Radeon Pro 555X 4096 MB / Intel UHD Graphics 630 1536 MB

Features

  • Visualization of matched features for triangulation keyframes (used as landmarks)
  • Visualization of matched features for localization frames
  • History of the number of tracked landmarks over the past 20 frames
  • Plot of the global trajectory estimate
  • Plot of the global ground truth trajectory (or GPS data in the case of the custom datasets)
  • Plot of the trajectory estimate of the past 20 frames (local trajectory)
  • Scatter plot of the tracked landmarks

Sample Output

Demo Instructions

Step 1: Clone the repo (requires SSH keys configured for GitHub):

  $ git clone git@github.com:aroumie1997/visual-odometry-project.git

Or (https):

  $ git clone https://github.com/aroumie1997/visual-odometry-project.git

Step 2: Navigate into the repo:

  $ cd visual-odometry-project

Step 3: Move the downloaded dataset folders into the repo.

Step 4: Open MATLAB and navigate to the Code folder:

  $ cd Code

Step 5: Run main.m in MATLAB.

Note: The code prompts you to enter a number corresponding to the dataset you want to test.

Step 6 (Optional): Run main_truthScale.m in MATLAB.

This runs a version of the pipeline in which the translation scale between keyframes is synchronized with the ground truth data.

Note: The code prompts you to enter a number corresponding to the dataset you want to test. The Malaga dataset is not available here, as it does not include ground truth data.
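Monocular VO recovers translation only up to an unknown scale, so syncing with ground truth amounts to rescaling each estimated inter-keyframe translation so that its norm matches the ground-truth displacement. A minimal NumPy sketch of that idea (a hypothetical helper, not the actual contents of main_truthScale.m):

```python
import numpy as np

def truth_scale(t_est, gt_prev, gt_curr):
    """Rescale an estimated inter-keyframe translation using ground truth.

    t_est   : estimated translation between two keyframes (scale ambiguous).
    gt_prev : ground-truth position at the previous keyframe.
    gt_curr : ground-truth position at the current keyframe.

    Illustrative sketch; the MATLAB implementation may differ.
    """
    gt_norm = np.linalg.norm(gt_curr - gt_prev)   # true displacement length
    est_norm = np.linalg.norm(t_est)              # estimated length (arbitrary scale)
    return t_est * (gt_norm / est_norm)           # direction kept, norm corrected
```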

References

[1] Vision Algorithms for Mobile Robotics course webpage, ETHZ/UZH.

[2] R. Szeliski, Computer Vision: Algorithms and Applications, Springer, 2010.

[3] Y. Ma, S. Soatto, J. Kosecka, and S. S. Sastry, An Invitation to 3-D Vision.

[4] P. Corke, Robotics, Vision and Control: Fundamental Algorithms, 2nd ed., 2017.

[5] R. Hartley and A. Zisserman, Multiple View Geometry in Computer Vision.

[6] R. Siegwart, I. R. Nourbakhsh, and D. Scaramuzza, Autonomous Mobile Robots, Chapter 4.
