
SuperPrimitive: Scene Reconstruction at a Primitive Level

Kirill Mazur · Gwangbin Bae · Andrew J. Davison

CVPR 2024


Getting Started

Installation

git clone https://github.com/makezur/super_primitive.git --recursive
cd super_primitive

Set up the environment by running our installation script:

source install.sh

Note that the provided software was tested on Ubuntu 20.04 with a single NVIDIA RTX 4090 GPU.

Downloading Checkpoints and Data

To download the required checkpoints and datasets, please run our download script:

bash ./download.sh

The script will download the pre-trained checkpoints for both the SAM and surface normal estimation networks. A Replica scene and the TUM fr1 sequences will also be downloaded and unpacked automatically.

N.B. In case of a system CUDA version mismatch, you might have to change the pytorch-cuda version pinned in the installation script.
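As a quick sanity check after installation, you can verify that the installed PyTorch build matches your driver; a driver/toolkit mismatch usually shows up here. This is a generic PyTorch check, not part of the repository's scripts:

import torch
print(torch.__version__)          # installed PyTorch version
print(torch.version.cuda)         # CUDA toolkit this build was compiled against
print(torch.cuda.is_available())  # False usually indicates a driver/toolkit mismatch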

Running the Demo (Structure-from-Motion)

Run the following script for a minimal example of our SuperPrimitive-based joint pose and geometry estimation. Here, we estimate the relative pose between two frames and the depth of the source frame.

python sfm_gui_runner.py --config config/replica_sfm_example.yaml
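For intuition, the sketch below shows the kind of photometric objective such joint estimation builds on, in a deliberately stripped-down form: a translation-only camera motion is recovered by minimising an intensity residual against an analytically sampled texture, with depth held fixed and known. It is a toy illustration under these simplifying assumptions, not the SuperPrimitive pipeline or its API:

import torch

# Toy direct image alignment: recover a translation-only camera motion by
# minimising a photometric residual. Depth is constant and known here; the
# actual demo jointly estimates the full pose and the source-frame geometry.
H, W = 48, 64
fx = fy = 50.0
cx, cy = W / 2.0, H / 2.0
depth = 2.0  # known, constant source depth

def texture(u, v):
    # Smooth analytic "image" that can be sampled at continuous coordinates,
    # keeping the example differentiable without interpolation machinery.
    return torch.sin(0.3 * u) + torch.cos(0.2 * v)

def warp(u, v, t):
    # Backproject source pixels at the known depth, translate by t, reproject.
    x = (u - cx) / fx * depth + t[0]
    y = (v - cy) / fy * depth + t[1]
    z = depth + t[2]
    return fx * x / z + cx, fy * y / z + cy

v, u = torch.meshgrid(torch.arange(H, dtype=torch.float32),
                      torch.arange(W, dtype=torch.float32), indexing="ij")

t_true = torch.tensor([0.10, -0.05, 0.20])
u_w, v_w = warp(u, v, t_true)
I_src = texture(u_w, v_w)  # source frame rendered from the target texture

t_est = torch.zeros(3, requires_grad=True)
opt = torch.optim.Adam([t_est], lr=1e-2)
for step in range(500):
    opt.zero_grad()
    u_w, v_w = warp(u, v, t_est)
    loss = ((texture(u_w, v_w) - I_src) ** 2).mean()
    loss.backward()
    opt.step()

print("estimated:", t_est.detach().numpy(), "true:", t_true.numpy())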

Monocular Visual Odometry (TUM)

Run our MonoVO on a TUM sequence by executing the following command:

python sfm_gui_runner.py --config config/tum/odom_desk.yaml --odom

Evaluation

We provide a tool to convert estimated trajectories into the TUM trajectory format (one pose per line: timestamp tx ty tz qx qy qz qw).

Conversion and subsequent evaluation of the $Sim(3)$-aligned absolute trajectory error (ATE) can be done with the following commands (replace TIMESTAMP with the timestamp suffix of your run's results directory):

python convert_traj_to_tum.py --root results/desk_undistort_fin_TIMESTAMP
cd results/desk_undistort_fin_TIMESTAMP
evo_ape tum converted_gt_tum_traj.txt converted_tum_traj.txt -as --plot --plot_mode xy --save_results ./res.zip
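For reference, the -as flags above ask evo to align the estimated trajectory to the ground truth with a least-squares similarity transform (alignment plus scale correction) before computing the translational RMSE. The NumPy sketch below shows that computation conceptually, using the standard Umeyama alignment; it is an illustration of the metric, not evo's actual implementation:

import numpy as np

def umeyama_sim3(est, gt):
    # Least-squares similarity transform (s, R, t) mapping est onto gt,
    # following Umeyama (1991). Both inputs are (N, 3) position arrays,
    # e.g. associated positions parsed from the TUM-format files above.
    mu_e, mu_g = est.mean(axis=0), gt.mean(axis=0)
    e_c, g_c = est - mu_e, gt - mu_g
    cov = g_c.T @ e_c / len(est)          # 3x3 cross-covariance
    U, D, Vt = np.linalg.svd(cov)
    S = np.eye(3)
    if np.linalg.det(U) * np.linalg.det(Vt) < 0:
        S[2, 2] = -1.0                    # keep R a proper rotation
    R = U @ S @ Vt
    var_e = (e_c ** 2).sum() / len(est)
    s = (D * np.diag(S)).sum() / var_e
    t = mu_g - s * R @ mu_e
    return s, R, t

def ate_rmse(est, gt):
    # Sim(3)-aligned absolute trajectory error (translational RMSE).
    s, R, t = umeyama_sim3(est, gt)
    aligned = (s * (R @ est.T)).T + t
    return np.sqrt(((aligned - gt) ** 2).sum(axis=1).mean())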

Depth Completion (VOID)

To download VOID, please follow the official instructions.

To reproduce the quantitative evaluation of SuperPrimitive-based depth completion reported in the paper, please run:

python evaluate_void.py --dataset PATH_TO_VOID_DATASET

Acknowledgments

Our code draws heavily on the DepthCov codebase, and we want to give special thanks to its author.

We additionally thank the authors of the other codebases that made our project possible.

Citation

If you found our code/work useful, please consider citing our publication:

@inproceedings{Mazur:etal:CVPR2024,
title={{SuperPrimitive}: Scene Reconstruction at a Primitive Level},
author={Kirill Mazur and Gwangbin Bae and Andrew Davison},
booktitle={IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
year={2024},
}
