# Towards Faster and Stabilized GAN Training for High Fidelity Few-shot Image Synthesis

An implementation of FastGAN, described in https://arxiv.org/abs/2101.04775.

## Dataset

The few-shot images dataset used by the default configuration is downloaded automatically from http://silentz.ml/few-shot-images.zip. If the domain has expired, you can download the archive manually and unpack it into the data/ directory inside the project root. NOTE: few-shot-images.zip is also attached to the GitHub releases of this repository. The final data/ directory layout should look like this:

```
data/
├── few-shot-images
│   ├── anime
│   ├── art
│   ├── cat_faces
│   ├── dog_faces
│   ├── grumpy_cat
│   ├── moongate
│   ├── obama
│   ├── panda
│   ├── pokemon
│   ├── shells
│   └── skulls
└── few-shot-images.zip
```
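If you downloaded the archive manually, unpacking it into the expected layout takes only a few lines. This is a sketch, not part of the repository; `unpack_dataset` is a hypothetical helper name:

```python
import zipfile
from pathlib import Path


def unpack_dataset(archive_path: str, data_dir: str = "data") -> Path:
    """Unpack a manually downloaded few-shot-images.zip into the data/ directory."""
    target = Path(data_dir)
    target.mkdir(parents=True, exist_ok=True)
    with zipfile.ZipFile(archive_path) as archive:
        archive.extractall(target)
    # the archive contains a top-level few-shot-images/ folder
    return target / "few-shot-images"
```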

## Training losses

- `gen_loss`: generator loss
- `disc_real_loss`: discriminator loss on real images
- `disc_fake_loss`: discriminator loss on images produced by the generator
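The FastGAN paper trains with the hinge version of the adversarial loss. As a rough sketch of what these three quantities measure (a plain-Python approximation over raw discriminator scores, not the repository's actual implementation):

```python
def disc_real_loss(real_scores):
    # hinge loss on real images: penalize discriminator scores below +1
    return sum(max(0.0, 1.0 - s) for s in real_scores) / len(real_scores)


def disc_fake_loss(fake_scores):
    # hinge loss on generated images: penalize discriminator scores above -1
    return sum(max(0.0, 1.0 + s) for s in fake_scores) / len(fake_scores)


def gen_loss(fake_scores):
    # the generator tries to raise the discriminator's score on fakes
    return -sum(fake_scores) / len(fake_scores)
```

A confident discriminator (real scores above +1, fake scores below -1) drives both of its losses to zero, while the generator's loss falls as it fools the discriminator.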

## How to reproduce

1. Clone the repository:

   ```shell
   git clone https://github.com/silentz/Towards-Faster-And-Stabilized-GAN-Training-For-High-Fidelity-Few-Shot-Image-Synthesis.git
   ```

2. Change into the repository root:

   ```shell
   cd Towards-Faster-And-Stabilized-GAN-Training-For-High-Fidelity-Few-Shot-Image-Synthesis
   ```

3. Create and activate a virtualenv:

   ```shell
   virtualenv --python=python3 venv
   source venv/bin/activate
   ```

4. Install the required packages:

   ```shell
   pip install -r requirements.txt
   ```

5. Download the dataset (this should happen automatically; if not, see the section above).

6. Choose one of the dataset configs (located in the train/configs directory):

   ```
   train/configs/
   ├── anime.yaml
   ├── art.yaml
   ├── cat_faces.yaml
   ├── dog_faces.yaml
   ├── grumpy_cat.yaml
   ├── moongate.yaml
   ├── obama.yaml
   ├── panda.yaml
   ├── pokemon.yaml
   ├── shells.yaml
   └── skulls.yaml
   ```

7. Train the model:

   ```shell
   python -m train fit --config train/configs/shells.yaml
   ```

8. Export a torchscript model from a checkpoint:

   ```shell
   python -m train.export --config train/configs/shells.yaml --from_ckpt checkpoints/epoch=0-step=49999.ckpt
   ```

9. Run the inference script:

   ```shell
   python infer.py export/shells.pt  # path to exported model (see config)
   ```
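Under the hood, inference amounts to loading the exported torchscript generator and feeding it Gaussian noise. A minimal sketch, assuming a 256-dimensional latent vector (the paper's default); `generate` is a hypothetical helper, not the repository's `infer.py`:

```python
import torch


def generate(model_path: str, noise_dim: int = 256, n_samples: int = 4) -> torch.Tensor:
    # load the exported torchscript generator
    model = torch.jit.load(model_path)
    model.eval()
    with torch.no_grad():
        # sample latent noise and map it to a batch of images
        noise = torch.randn(n_samples, noise_dim)
        return model(noise)
```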

## Samples

A trained FastGAN torchscript model is attached to the second release (see the "Releases" section of the repository's GitHub page). Below are samples of generated images from a model trained on the shells dataset: