# DeepMapping: Unsupervised Map Estimation From Multiple Point Clouds

This repository contains the PyTorch implementation associated with the paper:

"DeepMapping: Unsupervised Map Estimation From Multiple Point Clouds", Li Ding and Chen Feng, CVPR 2019 (Oral).

## Citation

If you find DeepMapping useful in your research, please cite:

```bibtex
@InProceedings{Ding_2019_CVPR,
  author = {Ding, Li and Feng, Chen},
  title = {DeepMapping: Unsupervised Map Estimation From Multiple Point Clouds},
  booktitle = {The IEEE Conference on Computer Vision and Pattern Recognition (CVPR)},
  month = {June},
  year = {2019}
}
```

## Dependencies

Requires Python 3.x, PyTorch, Open3D, and other common packages listed in `requirements.txt`:

```
pip3 install -r requirements.txt
```

Running on a GPU is highly recommended. The code has been tested with Python 3.6.5, PyTorch 0.4.0, and Open3D 0.4.0.

## Getting Started

### Dataset

A set of 2D simulated point clouds is provided as `./data/2D/v1_pose0.tar`. Extract the tar file:

```
tar -xvf ./data/2D/v1_pose0.tar -C ./data/2D/
```

A new sub-directory `./data/2D/v1_pose0/` will be created. This folder contains 256 local point clouds saved in PCD format. The corresponding ground-truth sensor poses are saved in the file `gt_pose.mat` as a 256-by-3 matrix: the i-th row represents the sensor pose `[x, y, theta]` for the i-th point cloud.
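As a sketch of how such a pose relates local and global coordinates (assuming the `[x, y, theta]` convention above with `theta` in radians and counter-clockwise rotation applied before translation, which is not confirmed by this README), a local point cloud can be mapped into the global frame like this:

```python
import numpy as np

def transform_to_global(points, pose):
    """Apply a 2D sensor pose [x, y, theta] to local points of shape (N, 2)."""
    x, y, theta = pose
    c, s = np.cos(theta), np.sin(theta)
    R = np.array([[c, -s], [s, c]])       # rotation by theta (counter-clockwise)
    return points @ R.T + np.array([x, y])  # rotate, then translate

# A point 1 unit ahead of a sensor at (2, 3) facing 90 degrees
local_pts = np.array([[1.0, 0.0]])
global_pts = transform_to_global(local_pts, [2.0, 3.0, np.pi / 2])  # -> [[2., 4.]]
```

Whether the repository uses radians or degrees, and this exact rotation convention, should be checked against the dataset loader.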

### Solving Registration As Unsupervised Training

To run DeepMapping, execute the script:

```
./script/run_train_2D.sh
```

By default, the results will be saved to `./results/2D/`.

### Warm Start

DeepMapping allows for seamless integration of a "warm start" to reduce convergence time and improve performance. Instead of starting from scratch, you can first perform a coarse registration of all point clouds using incremental ICP:

```
./script/run_icp.sh
```

The coarse registration can then be further improved by DeepMapping. To do so, simply set `INIT_POSE=/PATH/TO/ICP/RESULTS/pose_est.npy` in `./script/run_train_2D.sh`. Please see the comments in the script for detailed instructions.
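The repository's ICP script handles this coarse registration; purely to illustrate the core idea (not the repository's implementation), one point-to-point ICP iteration in 2D can be sketched with NumPy, using nearest-neighbor matching and an SVD-based rigid fit:

```python
import numpy as np

def icp_step(src, dst):
    """One point-to-point ICP iteration: match each src point to its
    nearest dst point, then fit the best rigid transform (Kabsch)."""
    # Nearest-neighbor correspondences (brute force, for clarity only)
    d2 = ((src[:, None, :] - dst[None, :, :]) ** 2).sum(-1)
    matched = dst[d2.argmin(axis=1)]
    # Best-fit rotation/translation between the centered point sets
    mu_s, mu_d = src.mean(0), matched.mean(0)
    H = (src - mu_s).T @ (matched - mu_d)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:   # guard against reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = mu_d - R @ mu_s
    return src @ R.T + t, R, t

# Synthetic check: undo a small rigid offset between two copies of a cloud
rng = np.random.default_rng(0)
dst = rng.uniform(-1.0, 1.0, size=(50, 2))
angle = 0.05
R0 = np.array([[np.cos(angle), -np.sin(angle)], [np.sin(angle), np.cos(angle)]])
src = dst @ R0.T + np.array([0.02, -0.02])
for _ in range(20):
    src, _, _ = icp_step(src, dst)
```

Incremental ICP chains such pairwise alignments across consecutive scans to obtain initial poses for all point clouds.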

### Evaluation

The estimated sensor poses are saved as a NumPy array, `pose_est.npy`. To evaluate the registration, execute the script:

```
./script/run_eval.sh
```

The absolute trajectory error will be computed as the error metric.
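The evaluation script implements this metric; as a rough sketch of the idea (assuming `(N, 3)` arrays of `[x, y, theta]` poses already expressed in a common frame, and omitting the trajectory alignment step that standard ATE definitions typically perform first), the translational absolute trajectory error is an RMSE over pose positions:

```python
import numpy as np

def absolute_trajectory_error(pose_est, pose_gt):
    """RMSE of translational error between an estimated and a ground-truth
    trajectory, each an (N, 3) array of [x, y, theta] poses."""
    diff = pose_est[:, :2] - pose_gt[:, :2]          # translational residuals
    return np.sqrt((diff ** 2).sum(axis=1).mean())   # root-mean-square norm

# Example: a constant 3-4-5 offset gives an ATE of 5
gt = np.zeros((4, 3))
est = gt + np.array([3.0, 4.0, 0.0])
ate = absolute_trajectory_error(est, gt)  # -> 5.0
```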
