PGGAN

A TensorFlow implementation of PGGAN (Progressive Growing of GANs for Improved Quality, Stability and Variation).

Requirements

  • pytorch 0.2.0_4

  • python 2.7.12

  • numpy 1.13.1

  • scipy 0.17.0

Usage

download repo

$ git clone https://github.com/nnUyi/PGGAN.git
$ cd PGGAN

download dataset

  • Download the dataset, store it in the datasets directory, and unzip it there, e.g. ./datasets/celebA. This repo trains on celebA images only; no attribute or label annotations are used at training time.

  • The original celebA dataset is used and the target resolution is set to 128*128, since the delta data provided by NVIDIA could not be downloaded.

  • If you want the 1024*1024 dataset, refer to NVIDIA's official code; h5tool.py is the script used to create the target training datasets. A rough preprocessing sketch is shown after this list.
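As an illustration of dataset preparation (this is not the repo's own preprocessing; the folder names, the center-crop-then-resize choice, and the use of scipy.misc are all assumptions), raw celebA images could be resized to the 128*128 target ahead of time:

# resize_celeba.py -- hypothetical helper, not part of this repo.
# Assumes raw celebA jpgs live in ./datasets/celebA_raw and that
# scipy.misc (which needs PIL/Pillow) is available, matching the
# scipy version listed under Requirements.
import os
import glob
import scipy.misc

SRC = './datasets/celebA_raw'
DST = './datasets/celebA'
TARGET = 128  # matches the 128*128 target resolution mentioned above

if not os.path.exists(DST):
    os.makedirs(DST)

for path in glob.glob(os.path.join(SRC, '*.jpg')):
    img = scipy.misc.imread(path)
    h, w = img.shape[:2]
    # center-crop to a square before resizing so faces stay centered
    side = min(h, w)
    top, left = (h - side) // 2, (w - side) // 2
    img = img[top:top + side, left:left + side]
    img = scipy.misc.imresize(img, (TARGET, TARGET))
    scipy.misc.imsave(os.path.join(DST, os.path.basename(path)), img)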

training

$ python main.py --is_training=True 
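For intuition, progressive growing trains at increasing resolutions and fades each new layer in with a blending weight alpha. The sketch below is only an illustrative schedule, not the actual logic in main.py; the phase length and function name are assumptions:

# Illustrative progressive-growing schedule, not the repo's main.py.
# Resolution doubles from 4x4 up to 128x128; each resolution gets a
# fade-in phase (alpha ramps 0 -> 1) and then a stabilization phase.
def schedule(images_seen, images_per_phase=600000, start_res=4, max_res=128):
    """Return (resolution, alpha) for the current training progress."""
    phase = images_seen // images_per_phase
    stage = (phase + 1) // 2                 # 0 -> 4x4, 1 -> 8x8, ...
    res = start_res * (2 ** stage)
    fading = (phase % 2 == 1) and res <= max_res
    res = min(res, max_res)
    if fading:
        alpha = (images_seen % images_per_phase) / float(images_per_phase)
    else:
        alpha = 1.0
    return res, alpha

For example, schedule(0) gives (4, 1.0), while schedule(650000) falls early in the 8*8 fade-in phase and returns a small alpha.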

sampling

  • Sampling is performed during training. The sampled results are written to the directory named sample.
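As a rough sketch of how such sample grids are typically assembled (this helper is an assumption, not the repo's actual sampling code):

# Hypothetical grid-saving helper; tiles a batch of images into one file.
import os
import numpy as np
import scipy.misc

def save_grid(images, path, rows=8, cols=8):
    """Tile a batch of HxWxC images into a rows x cols grid and save it."""
    h, w, c = images[0].shape
    grid = np.zeros((rows * h, cols * w, c), dtype=images[0].dtype)
    for idx, img in enumerate(images[:rows * cols]):
        r, col = idx // cols, idx % cols
        grid[r * h:(r + 1) * h, col * w:(col + 1) * w, :] = img
    scipy.misc.imsave(path, grid)

# e.g. save_grid(samples, os.path.join('sample', 'train_128x128.png'))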

Experiments

  • The results are shown below; the model produces good-quality samples at both resolutions.

  • Two resolutions are shown, 64*64 and 128*128. In the 64*64 images, the first four columns are sampled data and the remaining four columns are real images; the layout is the same for the 128*128 images.

[sample grids: 64*64 resolution]
[sample grids: 128*128 resolution]

Reference

This repo was implemented with reference to github-pengge/PyTorch-progressive_growing_of_gans.

Contacts

Email: computerscienceyyz@163.com. Feel free to get in touch if you find something wrong or run into problems.
