TransGAN: Two Pure Transformers Can Make One Strong GAN, and That Can Scale Up

Code for the NeurIPS 2021 paper "TransGAN: Two Pure Transformers Can Make One Strong GAN, and That Can Scale Up" by Yifan Jiang, Shiyu Chang, and Zhangyang Wang.

Implementation

  • Gradient checkpointing via torch.utils.checkpoint (see the sketch after this list)
  • 16-bit (mixed) precision training
  • Distributed training (faster!)
  • IS/FID evaluation
  • Gradient accumulation
  • Stronger data augmentation
  • Self-modulation
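
Most of these pieces are standard PyTorch patterns. Below is a minimal, hypothetical sketch (not the repo's actual code) of how gradient checkpointing, 16-bit AMP training, gradient accumulation, and self-modulation typically fit together. All module names, shapes, and the placeholder loss are assumptions, and distributed training (e.g. DistributedDataParallel) is omitted for brevity.

```python
import torch
import torch.nn as nn
from torch.utils.checkpoint import checkpoint


class SelfModulatedLayerNorm(nn.Module):
    # Self-modulation: LayerNorm whose scale/shift are predicted from the
    # latent code z instead of being fixed learnable constants.
    def __init__(self, dim, z_dim):
        super().__init__()
        self.norm = nn.LayerNorm(dim, elementwise_affine=False)
        self.gamma = nn.Linear(z_dim, dim)
        self.beta = nn.Linear(z_dim, dim)

    def forward(self, x, z):
        # x: (batch, tokens, dim), z: (batch, z_dim)
        return self.gamma(z).unsqueeze(1) * self.norm(x) + self.beta(z).unsqueeze(1)


class Block(nn.Module):
    # A hypothetical transformer-style block using the self-modulated norm.
    def __init__(self, dim, z_dim):
        super().__init__()
        self.ln = SelfModulatedLayerNorm(dim, z_dim)
        self.mlp = nn.Sequential(nn.Linear(dim, 4 * dim), nn.GELU(), nn.Linear(4 * dim, dim))

    def forward(self, x, z):
        return x + self.mlp(self.ln(x, z))


class TinyGenerator(nn.Module):
    def __init__(self, dim=64, z_dim=128, tokens=16, depth=4):
        super().__init__()
        self.tokens, self.dim = tokens, dim
        self.inp = nn.Linear(z_dim, tokens * dim)
        self.blocks = nn.ModuleList(Block(dim, z_dim) for _ in range(depth))

    def forward(self, z):
        x = self.inp(z).view(z.size(0), self.tokens, self.dim)
        for blk in self.blocks:
            # Gradient checkpointing: drop this block's activations in the
            # forward pass and recompute them during backward, trading
            # compute for memory (requires PyTorch >= 1.11 for use_reentrant).
            x = checkpoint(blk, x, z, use_reentrant=False)
        return x


def train(gen, steps=100, batch=8, z_dim=128, accum=4, device="cpu"):
    opt = torch.optim.Adam(gen.parameters(), lr=1e-4)
    scaler = torch.cuda.amp.GradScaler(enabled=(device == "cuda"))  # 16-bit AMP
    opt.zero_grad()
    for step in range(steps):
        z = torch.randn(batch, z_dim, device=device)
        with torch.cuda.amp.autocast(enabled=(device == "cuda")):
            fake = gen(z)
            loss = fake.pow(2).mean()  # placeholder for the real GAN loss
        # Gradient accumulation: scale the loss down and only step the
        # optimizer every `accum` micro-batches.
        scaler.scale(loss / accum).backward()
        if (step + 1) % accum == 0:
            scaler.step(opt)
            scaler.update()
            opt.zero_grad()
```

Note that checkpointing adds roughly one extra forward pass per checkpointed block, so the memory saving comes at a modest compute cost.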

Guidance

CIFAR-10 training script

python exp/cifar_train.py

Evaluation during training is disabled because it triggers a strange bug. Please launch a separate evaluation job in parallel by copying the checkpoint path into the test script.
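
For a quick external sanity check of FID on saved samples, separate from the repo's bundled IS/FID code, something like the pytorch-fid package can be used. This is a sketch under assumptions: the two directory paths are placeholders, and the exact function signature may vary across pytorch-fid versions.

```python
# Hedged sketch: external FID check with the pytorch-fid package
# (pip install pytorch-fid). NOT the repo's bundled evaluation code;
# both directory paths are placeholders you must fill in.
import torch
from pytorch_fid.fid_score import calculate_fid_given_paths

fid = calculate_fid_given_paths(
    ["path/to/real_cifar_images", "path/to/generated_samples"],
    batch_size=50,
    device="cuda" if torch.cuda.is_available() else "cpu",
    dims=2048,  # standard InceptionV3 pool3 feature dimension for FID
)
print(f"FID: {fid:.2f}")
```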

CIFAR-10 test

First download the CIFAR-10 checkpoint and place it in ./cifar_checkpoint, then run the following script.

python exp/cifar_test.py
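
If the test script cannot find the weights, it can help to inspect the downloaded checkpoint first. The file name below is a placeholder and the key layout is an assumption about a typical PyTorch checkpoint, not the repo's documented format.

```python
# Hedged sketch: inspect the downloaded checkpoint. The file name is a
# placeholder; the repo's actual checkpoint name and keys may differ.
import torch

ckpt = torch.load("./cifar_checkpoint/checkpoint.pth", map_location="cpu")
if isinstance(ckpt, dict):
    # e.g. generator/discriminator state dicts, optimizer state, epoch, ...
    print(list(ckpt.keys()))
```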

Main Pipeline

(Figure: overview of TransGAN's main pipeline.)

Representative Visual Results

(Figure: representative visual results on CIFAR-10.)

README to be updated.

Acknowledgement

Codebase adapted from AutoGAN and pytorch-image-models.

Citation

If you find this repo helpful, please cite:

@article{jiang2021transgan,
  title={{TransGAN}: Two Pure Transformers Can Make One Strong {GAN}, and That Can Scale Up},
  author={Jiang, Yifan and Chang, Shiyu and Wang, Zhangyang},
  journal={Advances in Neural Information Processing Systems},
  volume={34},
  year={2021}
}
