AutoGAN-Distiller: Searching to Compress Generative Adversarial Networks

Yonggan Fu, Wuyang Chen, Haotao Wang, Haoran Li, Yingyan Lin, Zhangyang Wang

Accepted at ICML 2020 [Paper Link].


We propose the AutoGAN-Distiller (AGD) framework, one of the first AutoML frameworks dedicated to GAN compression and among the earliest works to explore AutoML for GANs.


  • AGD is established on a specifically designed search space of efficient generator building blocks, leveraging knowledge from state-of-the-art GANs for different tasks.
  • It performs differentiable neural architecture search under the target compression ratio (computational resource constraint), which preserves the original GAN generation quality via the guidance of knowledge distillation.
  • We demonstrate AGD on two representative mobile-based GAN applications: unpaired image translation (using a CycleGAN), and super resolution (using an encoder-decoder GAN).
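The constrained search described above can be sketched as a distillation loss plus a penalty that activates only when a candidate architecture exceeds the computational budget. The function below is an illustrative toy, not AGD's actual code; all names and the penalty weighting are assumptions:

```python
def search_objective(distill_loss, flops, flops_budget, penalty_weight=0.01):
    """Toy sketch of a resource-constrained NAS objective.

    distill_loss: knowledge-distillation loss against the teacher GAN.
    flops / flops_budget: candidate cost vs. the target compression budget.
    The penalty term is zero while the candidate stays within budget.
    """
    overshoot = max(0.0, flops - flops_budget)
    return distill_loss + penalty_weight * overshoot
```

Within budget the objective reduces to the distillation loss alone, so the search is free to trade architecture choices purely for generation quality; only over-budget candidates are pushed back toward the compression target.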

Visualization Results

Unpaired image translation:


Super Resolution:



Unpaired Image Translation

horse2zebra, zebra2horse, summer2winter, winter2summer: Unpaired-dataset

Super Resolution

Training (DIV2K+Flickr2K): SR-training-dataset

Evaluation (Set5, Set14, BSD100, Urban100): SR-eval-dataset



AGD_ST and AGD_SR contain the source code for the unpaired image translation task and the super resolution task, respectively. The code for pretraining, search, training from scratch, and evaluation is in the AGD_ST/search and AGD_SR/search directories.

We use AGD_ST/search as an example. The configurations for pretraining, search, training from scratch, and evaluation live in their respective config files. Please specify the target dataset C.dataset and change the dataset path C.dataset_path in the three config files to the real paths on your PC.
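The two attributes to edit look roughly like this. Only C.dataset and C.dataset_path are named in this README; the surrounding structure below is an assumed minimal sketch, not the repo's actual config file:

```python
from types import SimpleNamespace

# Hypothetical excerpt of one config file; edit these two fields in all
# three config files (pretrain, search, train-from-scratch/eval).
C = SimpleNamespace()
C.dataset = 'horse2zebra'              # target dataset
C.dataset_path = '/path/to/datasets'   # change to the real path on your PC
```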


See env.yml for the complete conda environment. Create a new conda environment:

conda env create -f env.yml
conda activate pytorch

In particular, if the thop package encounters version conflicts, pin the thop version explicitly:

pip install thop==0.0.31.post1912272122
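thop estimates model cost by counting multiply-accumulate operations (MACs) layer by layer; for a single convolution the count follows directly from the layer shape. A pure-Python sketch of that formula (not thop's own code):

```python
def conv2d_macs(c_in, c_out, kernel, h_out, w_out):
    # Each of the c_out * h_out * w_out output elements needs
    # c_in * kernel * kernel multiply-accumulates (bias ignored).
    return c_in * c_out * kernel * kernel * h_out * w_out
```

For example, a 3x3 convolution from 3 to 64 channels producing a 32x32 output costs conv2d_macs(3, 64, 3, 32, 32) = 1,769,472 MACs; summing such per-layer counts over a generator gives the cost that the compression budget constrains.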

Step 1: Pretrain the Supernet

  • Switch to the search directory:
cd AGD_ST/search
  • Set C.pretrain = True in the config file

  • Start to pretrain:


The checkpoints during pretraining are saved at ./ckpt/pretrain.

Step 2: Search

  • Set C.pretrain = 'ckpt/pretrain' in the config file

  • Start to search:


Step 3: Train the derived network from scratch

  • Set C.load_path = 'ckpt/search' in the config file

  • Start to train from scratch:


Step 4: Eval

  • Set C.load_path = 'ckpt/search' and C.ckpt = 'ckpt/finetune/' in the config file
  • Start to evaluate on the testing dataset:

The result images are saved at ./output/eval/.

Two Differences in the Super Resolution Task

1st Difference

Please download the checkpoint of the original ESRGAN (the teacher model) from pretrained ESRGAN and move it to the directory AGD_SR/search/ESRGAN/.

2nd Difference

Step 3 is split into two stages: first pretrain the derived architecture with only the content loss, then finetune with the perceptual loss:

  • Pretrain: Set C.pretrain = True in the config file

  • Finetune: Set C.pretrain = 'ckpt/finetune_pretrain/' in the config file
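The two stages optimize different losses: stage one a pixel-wise content loss, stage two a perceptual loss computed in a pretrained network's feature space (ESRGAN-style, typically VGG features). A minimal sketch using hypothetical flat lists in place of image tensors and feature maps:

```python
def l1_loss(pred, target):
    # Mean absolute error; applied pixel-wise as the content loss and,
    # in stage two, on pretrained-network features as a perceptual loss.
    return sum(abs(p - t) for p, t in zip(pred, target)) / len(pred)

def two_stage_losses(sr_pixels, hr_pixels, sr_feats, hr_feats):
    content = l1_loss(sr_pixels, hr_pixels)    # stage 1 (pretrain)
    perceptual = l1_loss(sr_feats, hr_feats)   # stage 2 (finetune)
    return content, perceptual
```

Pretraining on the content loss alone gives a stable, PSNR-oriented starting point; finetuning on the perceptual loss then recovers the sharper textures the teacher ESRGAN produces.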

Pretrained Models

Pretrained models are provided at pretrained AGD.

To evaluate the pretrained models, please copy the network architecture definition and pretrained weights to the corresponding directories:

cp ckpt/search/
cp ckpt/finetune/

then do the evaluation following step 4.

Our Related Work

Please also check our concurrent work on a unified optimization framework combining model distillation, channel pruning and quantization for GAN compression:

Haotao Wang, Shupeng Gui, Haichuan Yang, Ji Liu, and Zhangyang Wang. "All-in-One GAN Compression by Unified Optimization." ECCV, 2020. (Spotlight) [pdf] [code]

