This project by Baris Zöngür and Munzer Dwedari is part of the 'Advanced Deep Learning for Computer Vision' course at the chair of Prof. Dr. Niessner at the Technical University of Munich. Both team members contributed equally. You can view our final report here.
- Python 3
- Install the dependencies from `requirements.txt`:

  ```
  pip install -r requirements.txt
  ```
- Install the chamferdist package by krrish94:
  - pip:

    ```
    pip install chamferdist
    ```

  - conda: download the repository to the root of this project, then run `python setup.py install` inside the `chamferdist/` folder.

  Note: we only tested it with conda.
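The chamferdist package computes the Chamfer distance between point clouds (on the GPU, via PyTorch). As a rough illustration of what that metric measures, here is a minimal NumPy sketch of a symmetric Chamfer distance; the package's exact reduction and scaling may differ:

```python
import numpy as np

def chamfer_distance(a, b):
    """Symmetric Chamfer distance between point sets a (N, 3) and b (M, 3):
    mean squared distance from each point to its nearest neighbour in the
    other set, summed over both directions."""
    # Pairwise squared Euclidean distances, shape (N, M).
    d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
    return d2.min(axis=1).mean() + d2.min(axis=0).mean()

a = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
b = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
print(chamfer_distance(a, b))  # identical sets -> 0.0
```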
- Download the ShapeNet dataset and unzip the file under `data/`. Your directory should then look like this:

  ```
  data/
      shapenet_dim32_df/
          02691156/...
          02747177/...
          ...
      splits/
          shapenet/
              airplane_test.txt
              airplane_train.txt
              ...
  ```
To train a variational auto-decoder, use the following command:

```
python scripts/train.py --var <experiment_name> <class>
```

Available classes: `car`, `airplane`, `chair`, `sofa`, `lamp`, `cabinet`, `watercraft`, `table`

To train a non-variational auto-decoder, use the following command:

```
python scripts/train.py <experiment_name> <class>
```

Available classes: same as above.
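The variational variant treats each shape's latent code as a distribution rather than a point, sampling it with the reparameterization trick during training. A minimal NumPy sketch of that sampling step (the latent size of 128 is an assumption; the actual training code may use a different size and PyTorch):

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_latent(mu, log_var, rng):
    """Reparameterization trick: z = mu + sigma * eps, with eps ~ N(0, I),
    so gradients can flow through mu and log_var."""
    eps = rng.standard_normal(mu.shape)
    return mu + np.exp(0.5 * log_var) * eps

mu = np.zeros(128)       # per-shape latent mean
log_var = np.zeros(128)  # per-shape latent log-variance
z = sample_latent(mu, log_var, rng)
print(z.shape)  # (128,)
```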
To test a trained variational auto-decoder on the validation data, use the following command:

```
python scripts/train.py --var --test <experiment_name> <class>
```

To test a trained non-variational auto-decoder on the validation data, use the following command:

```
python scripts/train.py --test <experiment_name> <class>
```
To evaluate a variational auto-decoder on the 1-NN score, use the following command:

```
python scripts/evaluate.py --split test <experiment_name> <class> 1NN
```

If you wish to evaluate with fewer samples from the reference set, use the `--n` flag, e.g. `--n 200`.
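The 1-NN score is a two-sample test: generated and reference samples are pooled, and a leave-one-out 1-nearest-neighbour classifier tries to tell them apart. A score near 0.5 means the sets are indistinguishable; near 1.0 means they differ. A simplified sketch using Euclidean distance on flattened descriptors (the actual evaluation likely compares shapes with a distance such as Chamfer):

```python
import numpy as np

def one_nn_accuracy(gen, ref):
    """Leave-one-out 1-NN classification accuracy between a generated set
    and a reference set (each row one flattened shape descriptor)."""
    x = np.concatenate([gen, ref])                 # pooled samples
    labels = np.array([0] * len(gen) + [1] * len(ref))
    d2 = ((x[:, None, :] - x[None, :, :]) ** 2).sum(-1)
    np.fill_diagonal(d2, np.inf)                   # exclude self-matches
    nn = d2.argmin(axis=1)                         # nearest neighbour index
    return (labels[nn] == labels).mean()

rng = np.random.default_rng(0)
gen = rng.normal(0.0, 1.0, size=(50, 8))
ref = rng.normal(5.0, 1.0, size=(50, 8))           # well-separated clusters
print(one_nn_accuracy(gen, ref))  # -> 1.0
```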
To evaluate a non-variational auto-decoder on the IOU score, use the following command:

```
python scripts/evaluate.py --split test <experiment_name> <class> IOU
```

Available values for `--split`: `train`, `val`
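IOU here is intersection over union between voxel grids. Since the dataset stores 32³ distance fields, the grids are presumably thresholded to boolean occupancy first (an assumption on our part). A minimal sketch of the metric itself:

```python
import numpy as np

def iou(vox_a, vox_b):
    """Intersection over union of two boolean occupancy grids."""
    inter = np.logical_and(vox_a, vox_b).sum()
    union = np.logical_or(vox_a, vox_b).sum()
    return inter / union

a = np.zeros((32, 32, 32), dtype=bool)
b = np.zeros((32, 32, 32), dtype=bool)
a[:16] = True    # front half occupied
b[8:24] = True   # middle half occupied
print(iou(a, b))  # 8 overlapping slices / 24 in the union -> 1/3
```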
For visualizing samples from shape synthesis, as well as inter-class and intra-class interpolation, we prepared a Jupyter notebook: `visualize.ipynb`.
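Interpolation in an auto-decoder amounts to blending two latent codes and decoding each intermediate code into a shape. A sketch of the linear-interpolation step (the latent size of 128 and the `interpolate` helper are illustrative; the notebook's actual procedure may differ):

```python
import numpy as np

def interpolate(z0, z1, steps):
    """Linear interpolation between two latent codes; each returned row
    is one intermediate code to decode into a shape."""
    t = np.linspace(0.0, 1.0, steps)[:, None]
    return (1.0 - t) * z0 + t * z1

z0 = np.zeros(128)  # latent code of shape A
z1 = np.ones(128)   # latent code of shape B
codes = interpolate(z0, z1, 5)
print(codes.shape)   # (5, 128)
print(codes[:, 0])   # [0.   0.25 0.5  0.75 1.  ]
```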
You can use TensorBoard to see the losses (under `logs/`) during training and testing:

```
tensorboard --logdir logs
```