Learning Fuzzy Set Representations of Partial Shapes on Dual Embedding Spaces
Conditionally accepted to SGP 2018
This neural-network-based framework analyzes an uncurated collection of 3D models from the same category and learns two important types of semantic relations among full and partial shapes: complementarity and interchangeability. The former helps to identify which two partial shapes make a complete plausible object, and the latter indicates that interchanging two partial shapes from different objects preserves the object plausibility. These two relations are modeled as fuzzy set operations performed across the dual partial shape embedding spaces, and within each space, respectively, and jointly learned by encoding partial shapes as fuzzy sets in the dual spaces.
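To give a rough intuition for the fuzzy-set view, here is a generic illustration using the standard min/max fuzzy operators and made-up membership vectors; it is not the paper's exact learned formulation, just a sketch of the two relations:

```python
# Illustrative sketch: partial shapes as fuzzy sets (membership vectors).
# The min/max operators below are the standard fuzzy intersection/union;
# the paper's learned dual embedding spaces and scores differ in detail.

def fuzzy_union(a, b):
    return [max(x, y) for x, y in zip(a, b)]

def fuzzy_intersection(a, b):
    return [min(x, y) for x, y in zip(a, b)]

def inclusion_score(a, b):
    # Degree to which fuzzy set a is contained in b:
    # sum of elementwise min over the mass of a (a common inclusion measure).
    num = sum(min(x, y) for x, y in zip(a, b))
    den = sum(a) or 1.0
    return num / den

# Two partial shapes are complementary if their union resembles a full shape;
# two are interchangeable if each is (roughly) included in the other.
# All vectors below are hypothetical.
seat = [0.9, 0.1, 0.0]
back = [0.1, 0.8, 0.9]
full_chair = [0.9, 0.8, 0.9]

union = fuzzy_union(seat, back)
print(union)                               # [0.9, 0.8, 0.9]
print(inclusion_score(union, full_chair))  # 1.0
```

In this toy picture, `seat` and `back` are complementary because their fuzzy union is fully included in `full_chair`.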
- Python-gflags (tested with ver. 3.1.2)
- Networkx (tested with ver. 2.1)
- Numpy (tested with ver. 1.14.2)
- Pandas (tested with ver. 0.23.0)
- Scipy (tested with ver. 1.0.1)
- TensorFlow-gpu (tested with ver. 1.4.0)
Reproducing paper results
Download ComplementMe component point cloud data:
cd data
./download_complement_me_data.sh
cd ..
The partial shapes used in the paper can be generated from the components by the following script:
cd data
./batch_generate_partial_objects.sh
cd ..
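Conceptually, a partial shape is a subset of an object's components merged into one point cloud. A toy sketch of that idea follows; the actual batch_generate_partial_objects.sh script may use different sampling rules (e.g. connectivity constraints), and all names and data here are hypothetical:

```python
import random

def make_partial_shape(component_points, keep_ratio=0.5, seed=0):
    """Toy partial-shape generation: keep a random subset of components
    and merge their point clouds into a single point list."""
    rng = random.Random(seed)
    n_keep = max(1, int(len(component_points) * keep_ratio))
    kept = rng.sample(component_points, n_keep)
    return [p for comp in kept for p in comp]  # concatenated points

# Hypothetical components, each a tiny point cloud of xyz tuples.
components = [
    [(0, 0, 0), (1, 0, 0)],   # e.g., seat
    [(0, 1, 0), (1, 1, 0)],   # e.g., back
    [(0, 0, 1)],              # e.g., leg
]
partial = make_partial_shape(components, keep_ratio=0.67, seed=1)
print(len(partial))
```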
Pretrained model download
We provide pretrained models for all categories. Run:
cd fuzzy_set_dual/experiments
./download_pretrained_models.sh
cd ../../
You can also easily train the network from scratch. Specify a category of shapes to train (one of the directory names in data/components, e.g. Chair; case-sensitive):
Move to the experiment directory:
For learning complementarity, run:
./run_experiment.py --relative --train
For learning interchangeability, run:
The trained models are stored in the fuzzy_set_dual/experiments/($synset)/vanilla_100_centerize directory.
Retrieval of complementary and interchangeable partial shapes is performed by running the same run_experiment.py script without the --train flag.
To retrieve with all test partial shapes in all categories as queries, run:
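Conceptually, the retrieval step ranks the embeddings of database partial shapes by a score against the query embedding. Below is a toy sketch using cosine similarity as a generic stand-in for the paper's learned fuzzy-set scores; all names and vectors are hypothetical:

```python
import math

def cosine(a, b):
    # Generic cosine similarity; stands in for the learned retrieval score.
    num = sum(x * y for x, y in zip(a, b))
    den = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return num / den if den else 0.0

def retrieve(query, database, k=2):
    # Rank database entries by similarity to the query and return the top k.
    ranked = sorted(database.items(),
                    key=lambda kv: cosine(query, kv[1]),
                    reverse=True)
    return [name for name, _ in ranked[:k]]

# Hypothetical database of partial-shape embeddings.
database = {
    "chair_back_07": [0.1, 0.9, 0.8],
    "chair_seat_12": [0.9, 0.2, 0.1],
    "table_top_03":  [0.9, 0.0, 0.1],
}
query = [0.2, 0.8, 0.9]  # hypothetical query embedding
print(retrieve(query, database))  # ['chair_back_07', 'chair_seat_12']
```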
Regenerating paper figures/tables
We provide script files in figures for regenerating the results shown in the paper's figures and tables.
To run the script files, first download/generate all data, download the pretrained models, and run the batch retrieval scripts as described above. The outputs of each figure script are stored as mesh files; the results of the compared methods are not generated. Also, the positions of retrieved complementary partial shapes are not predicted, since placement prediction is not part of this work.
Note that the scripts depend on MeshLab. Ubuntu users can install it with apt-get:
sudo apt-get install meshlab
This code is released under the MIT License. Refer to LICENSE for details.