staturecrane/dcgan_vae_torch

An implementation of the deep convolutional generative adversarial network, combined with a variational autoencoder


Deep Convolutional Variational Autoencoder w/ Generative Adversarial Network

A combination of the DCGAN implementation by soumith and the variational autoencoder by Kaixhin.

The model produces 64x64 images from inputs of any size via center cropping. You can modify the code relatively easily to produce outputs of a different size (by adding more convolutional layers, for instance), or to rescale images instead of cropping them. Images are also flipped horizontally at random for better coverage of the training data.
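A rough sketch of this preprocessing step, assuming the Torch image package (the function and variable names here are illustrative, not taken from the repository):

require 'image'

-- Center-crop an input image to 64x64 and randomly flip it horizontally.
local function load_example(path)
  local img = image.load(path, 3, 'float')  -- 3-channel float tensor, CxHxW
  img = image.crop(img, 'c', 64, 64)        -- 'c' takes a 64x64 center crop
  if torch.uniform() > 0.5 then             -- flip roughly half the images
    img = image.hflip(img)
  end
  -- To rescale instead of cropping, image.scale(img, 64, 64) could replace the crop.
  return img
end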

I have added white noise to the original inputs that pass through the discriminator, after reading this post on stabilizing GANs. The noise level is annealed over time to help the generator and discriminator converge.
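A minimal sketch of that idea, assuming a linear annealing schedule (the initial standard deviation, schedule length, and names are illustrative assumptions, not the repository's exact values):

-- Add white (Gaussian) noise to a batch before it reaches the discriminator,
-- shrinking the noise linearly toward zero as training progresses.
local init_std = 0.1        -- assumed starting noise level
local anneal_epochs = 100   -- assumed number of epochs to anneal over

local function add_instance_noise(batch, epoch)
  local std = init_std * math.max(0, 1 - epoch / anneal_epochs)
  if std <= 0 then return batch end
  return batch + torch.randn(batch:size()):typeAs(batch):mul(std)
end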

Results on Wikimedia Paintings Dataset

Prerequisites

  1. Torch7
  2. CUDA
  3. CUDNN
  4. DPNN
  5. Lua File System
  6. optim
  7. xlua

To run, execute the script using

th dcgan_vae.lua -i [input folder destination] -o [output folder destination] -c [destination for saving model checkpoints] -r [reconstructions folder]

where the input folder is expected to contain color images. The model resamples the training set after every epoch, so that each epoch fits on the GPU while all of the data is still (eventually) sampled. The output folder is for samples generated by the model, and the reconstructions folder saves some reconstructions from the training set, so you can see how the VAE is doing (it's not going to do particularly well, but that's okay; it's there to assist the GAN). A sketch of the resampling idea follows below.
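A rough sketch of that per-epoch resampling, assuming the image list is gathered with the Lua File System module (the names and subset size are illustrative):

require 'lfs'

-- Gather image paths once, then draw a fresh random subset each epoch
-- so a single epoch's worth of data fits in GPU memory.
local function list_images(dir)
  local files = {}
  for entry in lfs.dir(dir) do
    if entry:match('%.jpe?g$') or entry:match('%.png$') then
      table.insert(files, dir .. '/' .. entry)
    end
  end
  return files
end

local function sample_epoch(files, n)
  local perm = torch.randperm(#files)
  local subset = {}
  for i = 1, math.min(n, #files) do
    subset[i] = files[perm[i]]
  end
  return subset
end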
