Extreme Relative Pose Estimation for RGB-D Scans via Scene Completion

PyTorch implementation of the paper "Extreme Relative Pose Estimation for RGB-D Scans via Scene Completion".

[Overview figure: overview.png]

Prerequisites:

Folder Organization

Please make sure to have the following folder structure:

RelativePose/
    data/
        dataList/
        pretrained_model/
    experiments/
    tmp/
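
If these folders do not exist yet, they can be created from the repository root. The command below is just a convenience sketch mirroring the layout above.

# run from the RelativePose/ repository root
mkdir -p data/dataList data/pretrained_model experiments tmp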

Dataset Download

Images: suncg, matterport, scannet
Data list: suncg, matterport, scannet
Pretrained model: suncg, matterport, scannet

Images should be uncompressed under the data/ folder. The data list contains the splits used in our experiments and should be placed under the data/dataList/ folder. The pretrained models should be placed under the data/pretrained_model/ folder.
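
For example, assuming the downloaded archives are named as below (the actual archive names and formats may differ), they could be unpacked like this:

# hypothetical archive names; substitute the files you actually downloaded
unzip suncg_images.zip -d data/
unzip suncg_dataList.zip -d data/dataList/
unzip suncg_pretrained_model.zip -d data/pretrained_model/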

Usage

Training the feature network

# suncg 
python mainFeatureLearning.py --exp featSuncg --g --batch_size=2 --featurelearning=1 --maskMethod=second --resume --dataList=suncg --outputType=rgbdnsf --snumclass=15
# matterport 
python mainFeatureLearning.py --exp featMatterport --g --batch_size=2 --featurelearning=1 --maskMethod=second --resume --dataList=matterport --outputType=rgbdnsf --snumclass=15
# scannet 
python mainFeatureLearning.py --exp featScannet --g --batch_size=2 --featurelearning=1 --maskMethod=kinect --resume --dataList=scannet --outputType=rgbdnsf --snumclass=21

Training the completion module

# suncg 
python mainPanoCompletion2view.py --exp compSuncg --g --batch_size=2 --featurelearning=1 --maskMethod=second --resume --dataList=suncg --outputType=rgbdnsf --snumclass=15
# matterport 
python mainPanoCompletion2view.py --exp compMatterport --g --batch_size=2 --featurelearning=1 --maskMethod=second --resume --dataList=matterport --outputType=rgbdnsf --snumclass=15
# scannet 
python mainPanoCompletion2view.py --exp compScannet --g --batch_size=2 --featurelearning=1 --maskMethod=kinect --resume --dataList=scannet --outputType=rgbdnsf --snumclass=21 --useTanh=0

Training the relative pose module

python trainRelativePoseModuleRecFD.py --exp fd_param --dataset=suncg --snumclass=15 --split=val --para_init={param for previous iter} --rlevel={recurrent level}

The trained parameters for the relative pose module are provided in data/relativePoseModule/.
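
As a concrete example, a run on SunCG might look like the command below; the --para_init file name and --rlevel value are hypothetical placeholders (use the output of the previous iteration, or one of the files provided in data/relativePoseModule/).

# hypothetical parameter file and recurrent level; adjust to your setup
python trainRelativePoseModuleRecFD.py --exp fd_param --dataset=suncg --snumclass=15 --split=val --para_init=data/relativePoseModule/suncg_param.npy --rlevel=2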

Evaluation

python evaluation.py --dataList={suncg,matterport,scannet} --method={ours,ours_nr,ours_nc,gs,cgs,super4pcs} --exp=eval --num_repeat=10 --para={param file}

Note that you need to place the Super4PCS binary under RelativePose/ in order to run its evaluation.
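
For instance, evaluating our full method on SunCG with 10 repeats could look like this; the --para file name is again a hypothetical placeholder for your trained parameter file.

# hypothetical parameter file; point --para at your trained relative pose parameters
python evaluation.py --dataList=suncg --method=ours --exp=eval --num_repeat=10 --para=data/relativePoseModule/suncg_param.npy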

Author

Zhenpei Yang
