# [CVPR'17] Shape Completion using 3D-Encoder-Predictor CNNs and Shape Synthesis


This repo contains code to train a volumetric deep neural network that completes partially scanned 3D shapes. More information can be found in our paper.


Train/test data is available for download on our project website.



## Requirements

- Training uses Torch7, with the torch packages cudnn, cunn, torch-hdf5, and xlua.
- Matlab visualization of the isosurface in testing uses the matio package.
- The shape synthesis code was developed under VS2013 and uses flann (included in external).


## Training

- Train a model:

  `th train_class.lua -model epn-unet-class -save logs-epn-unet-class -train_data data/h5_shapenet_dim32_sdf/train_shape_voxel_data_list.txt -test_data data/h5_shapenet_dim32_sdf/test_shape_voxel_data_list.txt -gpu_index 0`
- For more options, see the help: `th train_class.lua -h` or `th train.lua -h`
- Trained models: (700mb)
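The training commands above point at text files listing HDF5 samples (via the torch-hdf5 package). As a rough illustration of working with such data outside Torch, here is a minimal Python sketch using `h5py`; note that the dataset key `"data"` and the single-volume 32^3 layout are illustrative assumptions, not the repo's confirmed format:

```python
import h5py
import numpy as np

DIM = 32  # resolution implied by the dim32_sdf data name

# Write a dummy sample in an assumed layout: one 32^3 SDF volume
# stored under a "data" key (the key name is a guess).
sdf = np.random.randn(DIM, DIM, DIM).astype(np.float32)
with h5py.File("sample.h5", "w") as f:
    f.create_dataset("data", data=sdf)

# Read it back, as a data loader might.
with h5py.File("sample.h5", "r") as f:
    vol = f["data"][:]

print(vol.shape)
```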


## Testing

- Run a trained model on a sample scan:

  `th test.lua --model_path [path to model] --test_file sampledata/scan.h5 --output_path [path to output] --classifier_path [path to classifier model; only specify if using the epn-class or epn-unet-class models]`
- For more options, see the help: `th test.lua -h`
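The network operates on truncated signed distance fields (the `dim32_sdf` data), and the Matlab isosurface step visualizes the surface implied by the predicted distances. A simple alternative way to inspect an output volume is to threshold the distances into a surface-occupancy grid; below is a minimal numpy sketch, where the truncation band of 1.0 voxel is an illustrative assumption:

```python
import numpy as np

def sdf_to_occupancy(sdf, truncation=1.0):
    """Mark voxels whose absolute distance falls inside the
    truncation band as surface voxels."""
    return np.abs(sdf) < truncation

# Toy example: the signed distance field of a sphere on a 32^3 grid.
dim = 32
coords = np.stack(np.meshgrid(*([np.arange(dim)] * 3), indexing="ij"))
center, radius = (dim - 1) / 2.0, dim / 4.0
dist = np.linalg.norm(coords - center, axis=0) - radius  # signed distance
occ = sdf_to_occupancy(dist, truncation=1.0)
print(occ.sum())  # number of surface voxels found
```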


## Citation

    @inproceedings{dai2017complete,
      title={Shape Completion using 3D-Encoder-Predictor CNNs and Shape Synthesis},
      author={Dai, Angela and Qi, Charles Ruizhongtai and Nie{\ss}ner, Matthias},
      booktitle={Proc. Computer Vision and Pattern Recognition (CVPR), IEEE},
      year={2017}
    }


## License

This code is released under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License (see LICENSE.txt for details).


## Contact

If you have any questions, please email Angela Dai at
