beta-TCVAE

This repository contains cleaned-up code for reproducing the quantitative experiments in Isolating Sources of Disentanglement in Variational Autoencoders [arxiv].

Usage

To train a model:

python vae_quant.py --dataset [shapes/faces] --beta 6 --tcvae

Specify --conv to use the convolutional VAE. We used an MLP for dSprites and the convolutional architecture for 3D faces. To see all options, use the -h flag.
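
For example, a convolutional beta-TCVAE run on the 3D faces data would look like the following (the flags are those listed above; --beta 6 is just the same illustrative value as in the command above):

python vae_quant.py --dataset faces --conv --beta 6 --tcvae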

The main computational difference between beta-VAE and beta-TCVAE comes down to a few lines in vae_quant.py.
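
For reference, a minimal PyTorch sketch of that difference, the minibatch-weighted-sampling estimate of the total correlation that beta-TCVAE up-weights, might look like this. It is an illustrative re-implementation, not the repository's exact code; the function names, the `dataset_size` argument, and the signatures are chosen here for clarity.

```python
import math
import torch

def log_gaussian(z, mu, logvar):
    """Element-wise log density of a diagonal Gaussian N(mu, exp(logvar))."""
    return -0.5 * (math.log(2 * math.pi) + logvar + (z - mu) ** 2 / logvar.exp())

def total_correlation_penalty(z, mu, logvar, dataset_size):
    """Minibatch-weighted-sampling estimate of the total correlation
    TC = KL(q(z) || prod_j q(z_j)), the term beta-TCVAE penalizes.

    z, mu, logvar: (batch, latent_dim) posterior samples and parameters.
    dataset_size: number of training examples N.
    """
    batch_size = z.size(0)
    # log q(z_i | x_j) for every pair (i, j): shape (batch, batch, latent_dim)
    log_qz_pairs = log_gaussian(z.unsqueeze(1), mu.unsqueeze(0), logvar.unsqueeze(0))
    # Each minibatch sample stands in for the whole dataset in the mixture q(z).
    log_weight = math.log(batch_size * dataset_size)
    # log q(z_i) ~= logsumexp_j sum_d log q(z_i,d | x_j) - log(N * M)
    log_qz = torch.logsumexp(log_qz_pairs.sum(dim=2), dim=1) - log_weight
    # log prod_d q(z_i,d) ~= sum_d [ logsumexp_j log q(z_i,d | x_j) - log(N * M) ]
    log_qz_product = (torch.logsumexp(log_qz_pairs, dim=1) - log_weight).sum(dim=1)
    return (log_qz - log_qz_product).mean()
```

In a beta-VAE the whole KL term is scaled by beta; in beta-TCVAE only this total-correlation term gets the extra weight, which is what the --tcvae flag switches on.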

To evaluate the MIG of a model:

python disentanglement_metrics.py --checkpt [checkpt]

To see all options, use the -h flag.
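
As a rough guide to what the metric computes, here is an illustrative NumPy sketch of MIG given a precomputed (num_latents, num_factors) mutual-information matrix. This is not the code in disentanglement_metrics.py, and `mi_matrix` / `factor_entropies` are hypothetical inputs.

```python
import numpy as np

def mutual_information_gap(mi_matrix, factor_entropies):
    """MIG: for each ground-truth factor, take the gap between the two
    latent dimensions with the highest mutual information, normalize by
    the factor's entropy, and average over factors."""
    sorted_mi = np.sort(mi_matrix, axis=0)[::-1]   # MI per factor, descending over latents
    gaps = (sorted_mi[0] - sorted_mi[1]) / factor_entropies
    return gaps.mean()
```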

Datasets

dSprites

Download the dSprites npz file and place it into data/.
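
Once it is in place, the archive can be inspected with NumPy. The filename below is illustrative (keep whatever name the downloaded file has); the keys shown are those of the official dSprites release.

```python
import numpy as np

# Illustrative path; use the actual filename of the downloaded archive.
dataset = np.load('data/dsprites.npz')
imgs = dataset['imgs']                # binary 64x64 sprite images
latents = dataset['latents_values']   # ground-truth generative factors
print(imgs.shape, latents.shape)
```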

3D faces

We cannot publicly distribute the 3D faces dataset due to its license. Please contact me for the data.

Contact

Email rtqichen@cs.toronto.edu if you have questions about the code/data.

Bibtex

@inproceedings{chen2018isolating,
  title={Isolating Sources of Disentanglement in Variational Autoencoders},
  author={Chen, Ricky T. Q. and Li, Xuechen and Grosse, Roger and Duvenaud, David},
  booktitle={Advances in Neural Information Processing Systems},
  year={2018}
}