This work builds upon Thibault Groueix's AtlasNet and 3D-CODED projects (you might want to have a look at those).
This repository contains the source code for the paper AtlasNet V2 - Learning Elementary Structures.
If you find this work useful in your research, please consider citing:
```
@inproceedings{deprelle2019learning,
  title={Learning elementary structures for 3D shape generation and matching},
  author={Deprelle, Theo and Groueix, Thibault and Fisher, Matthew and Kim, Vladimir and Russell, Bryan and Aubry, Mathieu},
  booktitle={Advances in Neural Information Processing Systems},
  pages={7433--7443},
  year={2019}
}
```
The project page is available at http://imagine.enpc.fr/~deprellt/atlasnet2/
This implementation uses PyTorch.
## Download the repository
```bash
git clone https://github.com/TheoDEPRELLE/AtlasNetV2.git
cd AtlasNetV2
```
## Create python env with relevant packages
```bash
conda create --name atlasnetV2 python=3.7
source activate atlasnetV2
pip install pandas visdom
conda install pytorch torchvision -c pytorch
conda install -c conda-forge matplotlib
# you're done! Congrats :)
```
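Before moving on, it can help to check that the environment picked up a CUDA-enabled PyTorch build. This is a minimal sketch using only standard PyTorch/torchvision calls, nothing specific to this repository:

```python
import torch
import torchvision

# Quick sanity check of the conda environment created above.
print("PyTorch:", torch.__version__)
print("torchvision:", torchvision.__version__)
print("CUDA available:", torch.cuda.is_available())  # training expects a CUDA-capable GPU
```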
## Download the dataset
```bash
cd data; ./download_data.sh; cd ..
```
We used the ShapeNet dataset for 3D models. When using the provided data, make sure to respect the ShapeNet license.
- The point clouds from ShapeNet, with normals, go in `data/customShapeNet`
- The corresponding normalized meshes (for the metro distance) go in `data/ShapeNetCorev2Normalized`
- The rendered views go in `data/ShapeNetRendering`
The trained models and some corresponding results are also available online:
- The trained models go in `trained_models/`
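After downloading, a quick way to verify the layout is to list the expected folders. This is a minimal sketch that only assumes the paths named above; adjust it if your data lives elsewhere:

```python
import os

# Folders named in the data section above.
expected = ["data/customShapeNet",
            "data/ShapeNetCorev2Normalized",
            "data/ShapeNetRendering",
            "trained_models"]

for path in expected:
    if os.path.isdir(path):
        print(f"{path}: {len(os.listdir(path))} entries")
    else:
        print(f"{path}: missing")
```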
## Build the chamfer distance
The Chamfer loss is based on custom CUDA code that needs to be compiled.
```bash
source activate atlasnetV2
cd ./extension
python setup.py install
```
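To sanity-check the compiled extension, you can compare its output against a naive Chamfer distance written in plain PyTorch. The sketch below is only a reference formula (the extension's own Python interface is not shown here), not the repository's API:

```python
import torch

def chamfer_reference(a, b):
    """Naive Chamfer distance between point clouds a (B, N, 3) and b (B, M, 3)."""
    d = torch.cdist(a, b)  # (B, N, M) pairwise Euclidean distances
    return (d.min(dim=2).values ** 2).mean() + (d.min(dim=1).values ** 2).mean()

# Example: two random batches of point clouds.
a, b = torch.rand(2, 1024, 3), torch.rand(2, 1024, 3)
print(chamfer_reference(a, b))
```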
## Start training
- First launch a visdom server:
```bash
python -m visdom.server -p 8888
```
- Check out all the options:
```bash
git pull; python training/train.py --help
```
- Run the baseline:
```bash
git pull; python training/train.py --model AtlasNet --adjust mlp
git pull; python training/train.py --model AtlasNet --adjust linear
```
- Run the Patch Deformation module with the different adjustment modules:
```bash
git pull; python training/train.py --model PatchDeformation --adjust mlp
git pull; python training/train.py --model PatchDeformation --adjust linear
```
- Run the Point Translation module with the different adjustment modules:
```bash
git pull; python training/train.py --model PointTranslation --adjust mlp
git pull; python training/train.py --model PointTranslation --adjust linear
```
- Monitor your training on http://localhost:8888/
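If the visdom page stays empty, a quick way to check that the server is reachable is to push a dummy curve to it. This is a sketch using the standard visdom client API; the window name is arbitrary and assumes the server from the step above is running on port 8888:

```python
import numpy as np
from visdom import Visdom

# Connect to the visdom server started above (port 8888 per this README).
viz = Visdom(port=8888)

# Push a dummy curve; it should appear at http://localhost:8888/ if everything is wired up.
viz.line(X=np.arange(10), Y=np.random.rand(10),
         win='sanity-check', opts=dict(title='visdom sanity check'))
```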
The models trained on the SURREAL dataset for the FAUST competition can be found here.
This work was partly supported by ANR project EnHerit ANR-17-CE23-0008, Labex Bezout, and gifts from Adobe to Ecole des Ponts.