
Gift from Iterative Network Pruning

This repository contains the code for reproducing the results of the paper Paying more Attention to Snapshots of Iterative Pruning: Improving Model Compression via Ensemble Distillation (BMVC 2020).


In short, the paper proposes to leverage the snapshots of iterative pruning to construct an ensemble and distill knowledge from it. To encourage diversity among snapshots, we retrain the pruned networks with a One-cycle learning-rate schedule, so each snapshot is encouraged to converge to a different local optimum.
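As a rough illustration of the One-cycle idea, the learning rate warms up to a peak and then anneals back down within a single retraining cycle. The sketch below is a self-contained approximation; the function name and default hyperparameters are illustrative, not taken from the repository:

```python
import math

def one_cycle_lr(step, total_steps, max_lr=0.1, div_factor=25.0, pct_start=0.3):
    """Cosine-shaped one-cycle schedule (illustrative, not the repo's code).

    Warms up from max_lr / div_factor to max_lr over the first pct_start of
    training, then anneals back down to the base rate.
    """
    warmup_steps = int(total_steps * pct_start)
    base_lr = max_lr / div_factor
    if step < warmup_steps:
        t = step / max(1, warmup_steps)            # warm-up phase: 0 -> 1
        return base_lr + (max_lr - base_lr) * 0.5 * (1 - math.cos(math.pi * t))
    t = (step - warmup_steps) / max(1, total_steps - warmup_steps)
    return base_lr + (max_lr - base_lr) * 0.5 * (1 + math.cos(math.pi * t))
```

The large peak learning rate is what pushes each retrained snapshot away from the previous solution, which is the source of ensemble diversity.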

The algorithm is summarized below:

  1. Train the baseline network to completion.
  2. Prune redundant weights (based on some criterion).
  3. Retrain with a One-cycle learning-rate schedule.
  4. Repeat steps 2 and 3 until the desired compression ratio is reached.
  5. Distill knowledge from the ensemble of snapshots to the desired network.
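The pruning loop above can be sketched as follows. This is a minimal illustration with plain Python lists standing in for network weights; `prune_by_magnitude`, `iterative_prune`, and the `retrain` callback are hypothetical placeholders, not the repository's API:

```python
def prune_by_magnitude(weights, prune_frac):
    """Zero out the smallest-magnitude fraction of the remaining weights."""
    alive = sorted(abs(w) for w in weights if w != 0.0)
    k = int(len(alive) * prune_frac)
    if k == 0:
        return list(weights)
    threshold = alive[k - 1]
    return [0.0 if (w != 0.0 and abs(w) <= threshold) else w
            for w in weights]

def iterative_prune(weights, rounds, prune_frac, retrain):
    """Steps 2-4: alternate pruning and retraining, keeping each snapshot."""
    snapshots = []
    for _ in range(rounds):
        weights = prune_by_magnitude(weights, prune_frac)
        weights = retrain(weights)        # One-cycle retraining in the paper
        snapshots.append(list(weights))   # snapshot joins the teacher ensemble
    return snapshots
```

In the paper, the collected snapshots then act as an ensemble of teachers for step 5 (knowledge distillation into the final pruned network).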

How to run

Please check out example.ipynb for detailed instructions on reproducing the results on CIFAR. Instructions for running experiments on Tiny-ImageNet may be added later.

We also provide scripts for iterative pruning and knowledge distillation (see Sec. 5 in the Colab example). Disclaimer: you may have to set the checkpoint_paths variable to the appropriate paths (i.e. cifar/filter_pruning/, cifar/weight_pruning/, ... depending on your chosen method/dataset).
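For example, checkpoint_paths might be pointed at the filter-pruning snapshots like this (the glob pattern and the .pth extension are assumptions about the layout; adjust them to your chosen method and dataset):

```python
from glob import glob

# Hypothetical layout: collect the snapshot checkpoints produced by
# filter pruning. Swap the directory for cifar/weight_pruning/ etc.
# depending on which method/dataset you ran.
checkpoint_paths = sorted(glob("cifar/filter_pruning/*.pth"))
```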


Results

[Table: results on CIFAR-10 and CIFAR-100]

[Table: results on Tiny-ImageNet]

PFEC and MWP stand for Pruning Filters for Efficient ConvNets and Learning both Weights and Connections for Efficient Neural Networks, respectively.
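For instance, the PFEC criterion ranks convolutional filters by the L1 norm of their weights and removes the lowest-ranked ones. A minimal sketch, with each filter represented as a flat list of weights (the function name is illustrative, not the repository's code):

```python
def rank_filters_by_l1(filters):
    """PFEC-style criterion: order filters by ascending L1 norm.

    Filters at the front of the returned index list are the best
    candidates for removal.
    """
    norms = [sum(abs(w) for w in f) for f in filters]
    return sorted(range(len(filters)), key=lambda i: norms[i])
```

MWP, by contrast, prunes individual weights by magnitude rather than whole filters, which yields unstructured sparsity.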


The code is largely adapted from Eric-mingjie's repository.