Wasserstein Auto-Encoders
Repository files: LICENSE, README.md, configs.py, datahandler.py, improved_wae.py, models.py, ops.py, run.py, utils.py, wae.py, and an images/ directory.


Repository info

This project implements an unsupervised generative modeling technique called Wasserstein Auto-Encoders (WAE), proposed by Tolstikhin, Bousquet, Gelly, and Schoelkopf (2017).
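At a high level, a WAE minimizes a reconstruction cost plus a weighted penalty that pushes the aggregate encoded distribution Q_Z toward the prior P_Z. The sketch below is an illustrative NumPy summary of that objective, not the repo's TensorFlow implementation; the function name `wae_objective` and the value `lam=10.0` are assumptions for illustration.

```python
import numpy as np

def wae_objective(x, x_recon, qz, pz, penalty, lam=10.0):
    """Per-batch WAE loss: reconstruction cost plus lam times a
    divergence penalty D(Q_Z, P_Z).

    `penalty` is any sample-based divergence estimate between codes
    qz ~ Q_Z and prior samples pz ~ P_Z (e.g. MMD for WAE-MMD).
    `lam` is an illustrative value, not the repo's default.
    """
    # squared-error reconstruction cost, averaged over the batch
    recon = np.mean(np.sum((x - x_recon) ** 2, axis=1))
    return recon + lam * penalty(qz, pz)
```

With a perfect reconstruction and a zero penalty the loss is exactly zero, which makes the two terms easy to inspect in isolation.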

Repository structure

wae.py - everything specific to WAE, including encoder/decoder losses, various forms of distribution-matching penalties, and the training pipelines
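One such distribution-matching penalty is the maximum mean discrepancy (MMD) used by WAE-MMD. The following is a simplified NumPy sketch of a biased squared-MMD estimate with an RBF kernel; the repo's wae.py works in TensorFlow and also supports other kernels, so `rbf_mmd2` and `sigma2` here are illustrative names, not the repo's API.

```python
import numpy as np

def rbf_mmd2(qz, pz, sigma2=1.0):
    """Biased squared MMD between encoded codes qz ~ Q_Z and prior
    samples pz ~ P_Z, using an RBF kernel with bandwidth sigma2."""
    def k(a, b):
        # pairwise squared distances, then the RBF kernel matrix
        d2 = (np.sum(a ** 2, 1)[:, None]
              + np.sum(b ** 2, 1)[None, :]
              - 2.0 * a @ b.T)
        return np.exp(-d2 / (2.0 * sigma2))
    return k(qz, qz).mean() + k(pz, pz).mean() - 2.0 * k(qz, pz).mean()

rng = np.random.default_rng(0)
codes = rng.normal(size=(256, 8))  # stand-in for encoder outputs
prior = rng.normal(size=(256, 8))  # samples from P_Z = N(0, I)
mmd = rbf_mmd2(codes, prior)       # small when Q_Z matches P_Z
```

Because the penalty only needs samples from the two distributions, it can be estimated per batch and added directly to the reconstruction loss during training.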

run.py - the master script that trains a chosen model on a selected dataset with the specified hyperparameters

Example output

The following picture shows various characteristics of the WAE-MMD model trained on CelebA after 50 epochs:

WAE-MMD progress