
TinyGAN

BigGAN; Knowledge Distillation; Black-Box; Fast Training; 16x compression

Requirements: Python 3.7, PyTorch 1.2.0

This repository contains the official PyTorch implementation of the following paper:

TinyGAN: Distilling BigGAN for Conditional Image Generation (ACCV 2020)
Ting-Yun Chang and Chi-Jen Lu

https://arxiv.org/abs/2009.13829

Abstract: Generative Adversarial Networks (GANs) have become a powerful approach for generative image modeling. However, GANs are notorious for their training instability, especially on large-scale, complex datasets. While the recent work of BigGAN has significantly improved the quality of image generation on ImageNet, it requires a huge model, making it hard to deploy on resource-constrained devices. To reduce the model size, we propose a black-box knowledge distillation framework for compressing GANs, which highlights a stable and efficient training process. Given BigGAN as the teacher network, we manage to train a much smaller student network to mimic its functionality, achieving competitive performance on Inception and FID scores but with the generator having 16 times fewer parameters.
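To make the black-box setting concrete, below is a minimal, illustrative sketch of the core distillation idea: the student generator is trained to reproduce images the teacher (BigGAN) produced for the same noise/class inputs, without ever backpropagating through the teacher. The module and function names are invented for illustration, and the paper's full objective also includes adversarial and feature-level terms not shown here; only a pixel-level distillation loss is sketched.

```python
import torch
import torch.nn as nn

class TinyGenerator(nn.Module):
    """Toy stand-in for the small student generator (illustrative only)."""
    def __init__(self, z_dim=128, n_classes=1000, img_pixels=3 * 8 * 8):
        super().__init__()
        self.embed = nn.Embedding(n_classes, z_dim)
        self.net = nn.Sequential(
            nn.Linear(2 * z_dim, 256), nn.ReLU(),
            nn.Linear(256, img_pixels), nn.Tanh(),
        )

    def forward(self, z, y):
        # Condition on the class label by concatenating its embedding with z.
        return self.net(torch.cat([z, self.embed(y)], dim=1))

def distill_step(student, teacher_images, z, y, opt):
    """One black-box distillation step.

    `teacher_images` are outputs BigGAN generated for the same (z, y)
    pairs; the teacher is only queried, never differentiated through.
    """
    opt.zero_grad()
    fake = student(z, y)
    # Pixel-level distillation loss: match the teacher's images directly.
    loss = nn.functional.l1_loss(fake, teacher_images)
    loss.backward()
    opt.step()
    return loss.item()
```

Because only (input, output) pairs from the teacher are needed, the teacher's weights and architecture can stay opaque, which is what makes the framework black-box.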

The trained model is stored in gan/models (73 MB) and can be downloaded directly from GitHub.

Training

$ bash train.sh

Evaluation

$ bash eval.sh

