Code for paper "Which Training Methods for GANs do actually Converge? (ICML 2018)"

darwinmorris/GAN_stability
ML-SGAN

This repository contains a modified version of the GAN presented in the paper "Which Training Methods for GANs do actually Converge?".

  • Configs for the experiments can be found in ./configs.
  • Important config options for ML-SGAN:
    • lp_dict is the path to a JSON dictionary mapping LP classes to labels
    • labsize is the size of the label array
    • nlabels is the number of classes after the LP transformation
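As an illustration of how these options fit together (the file contents and variable names below are hypothetical, not taken from the repo), the JSON dictionary that lp_dict points to might map LP class names to integer labels, with nlabels being the number of distinct labels after the transformation:

```python
import json

# Hypothetical example of the dictionary stored at the lp_dict path:
# each LP class name maps to the integer label it is merged into.
lp_dict_example = {"class_a": 0, "class_b": 1, "class_c": 1}

# Round-trip through JSON, as a training script would when loading the file.
lp_map = json.loads(json.dumps(lp_dict_example))

# nlabels corresponds to the number of distinct labels after merging.
nlabels = len(set(lp_map.values()))
print(nlabels)  # 2 in this example: classes b and c share a label
```

Here classes b and c collapse onto the same label, so three LP classes yield nlabels = 2.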

GAN stability

To cite this work, please use

@INPROCEEDINGS{Mescheder2018ICML,
  author = {Lars Mescheder and Sebastian Nowozin and Andreas Geiger},
  title = {Which Training Methods for GANs do actually Converge?},
  booktitle = {International Conference on Machine Learning (ICML)},
  year = {2018}
}

You can find further details on our project page.

Usage

First download your data and put it into the ./data folder.

To train a new model, first create a config script similar to the ones provided in the ./configs folder. You can then train your model using

python train.py PATH_TO_CONFIG

To compute the inception score for your model and generate samples, use

python test.py PATH_TO_CONFIG

Finally, you can create nice latent space interpolations using

python interpolate.py PATH_TO_CONFIG

or

python interpolate_class.py PATH_TO_CONFIG
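Latent space interpolation of this kind typically walks between two latent codes and decodes each intermediate point with the generator. A minimal sketch of the linear variant, using NumPy (the function and variable names here are illustrative, not taken from interpolate.py):

```python
import numpy as np

def interpolate_latents(z0, z1, steps=8):
    """Linearly interpolate between two latent vectors z0 and z1.

    Returns an array of shape (steps, latent_dim); feeding each row to
    the generator would produce one frame of the interpolation.
    """
    alphas = np.linspace(0.0, 1.0, steps)
    return np.stack([(1.0 - a) * z0 + a * z1 for a in alphas])

# Example: interpolate between two random 128-dimensional latent codes.
rng = np.random.default_rng(0)
z0 = rng.standard_normal(128)
z1 = rng.standard_normal(128)
frames = interpolate_latents(z0, z1, steps=8)
print(frames.shape)  # (8, 128)
```

Spherical interpolation (slerp) is often preferred for Gaussian latents, since linear blends can pass through low-density regions of the prior.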
