The proposed Generative Adversarial Network (GAN) is a conditional GAN (cGAN) based on CycleGAN. The main difference is that we pass type labels so that the discriminator compares each generated Pokémon against a real Pokémon belonging to the same animal type.
We also used StyleGAN2 as a baseline for comparison with the proposed GAN. The image below shows the generated results.
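Conceptually, the conditioning amounts to feeding the type label to the discriminator alongside the image. Below is a minimal NumPy sketch of one common way to do this (broadcasting a one-hot label into extra input channels); the function names and the type count are illustrative assumptions, not the repository's actual code:

```python
import numpy as np

def make_label_maps(labels, num_types, height, width):
    """Broadcast one-hot type labels into per-pixel feature maps.

    labels: (batch,) integer type ids. Hypothetical helper for illustration.
    Returns an array of shape (batch, num_types, height, width).
    """
    one_hot = np.eye(num_types)[labels]  # (batch, num_types)
    return np.broadcast_to(
        one_hot[:, :, None, None],
        (labels.shape[0], num_types, height, width),
    ).copy()

def conditional_disc_input(images, labels, num_types):
    """Stack label maps onto the image channels before the discriminator.

    images: (batch, 3, H, W) array; output has 3 + num_types channels.
    """
    _, _, h, w = images.shape
    maps = make_label_maps(labels, num_types, h, w)
    return np.concatenate([images, maps], axis=1)
```

With this kind of input, the discriminator sees both the image and its type, so it can judge real/fake pairs of the same type rather than images alone.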
Link to dataset for CycleGAN: HERE
Link to dataset for StyleGAN: HERE
Link to models: HERE
For the cGAN, `dataset.json` is required; delete this file when using the normal (unconditional) GAN.
Move the dataset to the following folders:

- StyleGAN: DO NOT unzip `pkmn_label.zip`; move it to `./stylegan/datasets`.
- CycleGAN: Unzip `dataset_cyclegan.zip`. Move `trainA` and `trainB` to `./cyclegan/train_datasets`. To test images, create another folder called `testA`.
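The dataset placement above can be scripted. The following is an illustrative sketch (the helper name is ours, not the repository's; it assumes you run it from the directory containing the two downloaded zip files):

```python
import shutil
from pathlib import Path

def prepare_datasets(root="."):
    """Place the downloaded archives as described above (hypothetical helper)."""
    root = Path(root)

    # StyleGAN expects the *zipped* dataset under ./stylegan/datasets
    stylegan_data = root / "stylegan" / "datasets"
    stylegan_data.mkdir(parents=True, exist_ok=True)
    shutil.move(str(root / "pkmn_label.zip"), str(stylegan_data / "pkmn_label.zip"))

    # CycleGAN expects trainA/trainB unzipped under ./cyclegan/train_datasets
    shutil.unpack_archive(str(root / "dataset_cyclegan.zip"), str(root))
    cyclegan_data = root / "cyclegan" / "train_datasets"
    cyclegan_data.mkdir(parents=True, exist_ok=True)
    for split in ("trainA", "trainB"):
        shutil.move(str(root / split), str(cyclegan_data / split))

    # testA holds the images used for testing the trained CycleGAN
    (cyclegan_data / "testA").mkdir(exist_ok=True)
```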
Move the models to the following folders:

- StyleGAN: `stylegan_conditional.pkl` and `stylegan_unconditional.pkl` to `./stylegan/training_runs`.
- CycleGAN: `animal2pkmn_cond`, `animal2pkmn_new`, and `animal2pkmn_xavier` to `./cyclegan/checkpoints`.
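After both moves, the expected layout can be sanity-checked with a short script (illustrative; the paths are the ones listed above, the helper name is an assumption):

```python
from pathlib import Path

# Expected locations of the pretrained models after the moves above
EXPECTED = [
    "stylegan/training_runs/stylegan_conditional.pkl",
    "stylegan/training_runs/stylegan_unconditional.pkl",
    "cyclegan/checkpoints/animal2pkmn_cond",
    "cyclegan/checkpoints/animal2pkmn_new",
    "cyclegan/checkpoints/animal2pkmn_xavier",
]

def missing_models(root="."):
    """Return the expected model paths that are not present under root."""
    root = Path(root)
    return [p for p in EXPECTED if not (root / p).exists()]
```

An empty list from `missing_models()` means everything is in place for the notebooks.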
Refer to the respective Jupyter notebooks, `StyleGAN.ipynb` and `CycleGAN.ipynb`.
Karras, T., Aittala, M., Hellsten, J., Laine, S., Lehtinen, J., & Aila, T. (2020). Training Generative Adversarial Networks with Limited Data. Proc. NeurIPS. Available at: https://github.com/NVlabs/stylegan2-ada-pytorch.
Zhu, J.-Y., Park, T., Isola, P., & Efros, A. A. (2017). Unpaired Image-to-Image Translation using Cycle-Consistent Adversarial Networks. Computer Vision (ICCV), 2017 IEEE International Conference On. Available at: https://github.com/junyanz/pytorch-CycleGAN-and-pix2pix
PokeAPI, The RESTful Pokémon API. Available at: https://github.com/PokeAPI/pokeapi