Variational Autoencoder & Conditional Variational Autoencoder on MNIST

VAE paper: Auto-Encoding Variational Bayes

CVAE paper: Semi-supervised Learning with Deep Generative Models


To run the conditional variational autoencoder, add `--conditional` to the command. Check the other command line options in the code for hyperparameter settings (such as learning rate, batch size, and encoder/decoder layer depth and size).
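
For example, a run might look like the following. The script name (`train.py`) and every flag other than `--conditional` are assumptions about the repository's command line interface, so check the argument parser in the code for the actual names and defaults.

```
# plain VAE with default hyperparameters
python train.py

# conditional VAE; flag names besides --conditional are illustrative
python train.py --conditional --epochs 10 --batch_size 64 --learning_rate 0.001
```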


Results

All plots were obtained after 10 epochs of training. Hyperparameters follow the default settings in the code and were not tuned.

z ~ q(z|x) and q(z|x,c)

The modeled latent distribution after 10 epochs, with 100 encoded samples per digit.

[Figure: latent space plots for the VAE and the CVAE]
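
For reference, these samples come from the approximate posterior via the reparameterization trick: the encoder outputs a mean and log-variance, and z is drawn from the corresponding Gaussian. A minimal PyTorch-style sketch under that assumption (the `encoder` and variable names are illustrative, not the repository's actual code):

```python
import torch

def reparameterize(mu, log_var):
    """Draw z ~ q(z|x) = N(mu, diag(sigma^2)) via the reparameterization trick."""
    std = torch.exp(0.5 * log_var)   # sigma = exp(log_var / 2)
    eps = torch.randn_like(std)      # eps ~ N(0, I)
    return mu + eps * std            # z = mu + sigma * eps

# Illustrative usage (encoder is hypothetical here):
#   mu, log_var = encoder(x)        # VAE:  parameters of q(z|x)
#   mu, log_var = encoder(x, c)     # CVAE: parameters of q(z|x,c), c a one-hot label
#   z = reparameterize(mu, log_var)
```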

p(x|z) and p(x|z,c)

Randomly sampled z and their decoded outputs. For the CVAE, each condition c has been given as input once.

[Figure: generated samples from the VAE and the CVAE]
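
These images correspond to drawing z from the standard normal prior and decoding it; for the CVAE, the desired digit label is additionally fed in as a one-hot condition. A minimal sketch under those assumptions, using a stand-in decoder rather than the repository's model (latent size and decoder shape are illustrative):

```python
import torch
import torch.nn as nn

latent_size = 2    # assumed latent dimensionality; the repo's default may differ
num_classes = 10

# Stand-in decoder (not the repository's): maps [z, c] to a flattened 28x28 image.
decoder = nn.Sequential(nn.Linear(latent_size + num_classes, 28 * 28), nn.Sigmoid())

z = torch.randn(num_classes, latent_size)    # z ~ p(z) = N(0, I), one sample per digit
c = torch.eye(num_classes)                   # each condition c (digits 0-9) given once
x = decoder(torch.cat([z, c], dim=-1))       # decode p(x|z,c)
print(x.view(num_classes, 28, 28).shape)     # torch.Size([10, 28, 28])
```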
