Soft Filter Pruning for Accelerating Deep Convolutional Neural Networks

The PyTorch implementation of our IJCAI 2018 paper. This implementation is based on the ResNeXt-DenseNet codebase.
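The core idea of soft filter pruning is that, after each training epoch, the filters with the smallest L2 norms are zeroed but stay in the network and keep receiving gradient updates, so they may recover in later epochs. A minimal sketch of one pruning step (the function name and the assumption that `keep_rate` is the fraction of filters kept are ours, not this repository's API):

```python
import torch

def soft_prune_conv(weight: torch.Tensor, keep_rate: float) -> torch.Tensor:
    """Zero the filters with the smallest L2 norms (soft pruning:
    zeroed filters remain in the model and can recover later)."""
    n_filters = weight.shape[0]
    n_keep = int(n_filters * keep_rate)
    # L2 norm of each output filter
    norms = weight.view(n_filters, -1).norm(p=2, dim=1)
    # indices of the filters to zero (smallest norms first)
    prune_idx = norms.argsort()[: n_filters - n_keep]
    mask = torch.ones(n_filters, device=weight.device)
    mask[prune_idx] = 0.0
    return weight * mask.view(-1, 1, 1, 1)
```

In the hard-pruning alternative, the zeroed filters would be removed permanently; keeping them trainable is what lets the model be trained from scratch while being pruned.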

Table of Contents

Requirements

  • Python 3.6
  • PyTorch 0.3.1
  • TorchVision 0.3.0

Models and log files

The trained models with log files can be found in Google Drive.

The pruned model without zeros: Release page.

Training ImageNet

Usage of Pruning Training

We train each model from scratch by default. If you wish to fine-tune from a pre-trained model, use the options --use_pretrain --lr 0.01.

Run pruning training for ResNet (depth 152/101/50/34/18) on ImageNet. Here --layer_begin and --layer_end are the indices of the first and last convolutional layers to prune, and --layer_inter 3 steps over the interleaved BatchNorm parameters so that only convolutional layers are selected:

python pruning_train.py -a resnet152 --save_dir ./snapshots/resnet152-rate-0.7 --rate 0.7 --layer_begin 0 --layer_end 462 --layer_inter 3  /path/to/Imagenet2012

python pruning_train.py -a resnet101 --save_dir ./snapshots/resnet101-rate-0.7 --rate 0.7 --layer_begin 0 --layer_end 309 --layer_inter 3  /path/to/Imagenet2012

python pruning_train.py -a resnet50  --save_dir ./snapshots/resnet50-rate-0.7 --rate 0.7 --layer_begin 0 --layer_end 156 --layer_inter 3  /path/to/Imagenet2012

python pruning_train.py -a resnet34  --save_dir ./snapshots/resnet34-rate-0.7 --rate 0.7 --layer_begin 0 --layer_end 105 --layer_inter 3  /path/to/Imagenet2012

python pruning_train.py -a resnet18  --save_dir ./snapshots/resnet18-rate-0.7 --rate 0.7 --layer_begin 0 --layer_end 57 --layer_inter 3  /path/to/Imagenet2012
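The --layer_end values above (57, 105, 156, 309, 462) together with --layer_inter 3 suggest that parameter tensors are enumerated in order and every third index is a convolutional weight. A sketch of the index selection (the helper name is ours):

```python
def pruned_param_indices(layer_begin: int, layer_end: int, layer_inter: int):
    # parameter indices to prune: conv weights only, stepping over the
    # interleaved BatchNorm weight/bias tensors that follow each conv
    return list(range(layer_begin, layer_end + 1, layer_inter))
```

For ResNet-18 this selects indices 0, 3, ..., 57: twenty convolutional weight tensors.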

Usage of Initial with Pruned Model

We use an unpruned model for initialization by default. If you wish to initialize from a pruned model, use the options --use_sparse --sparse path_to_pruned_model.

Usage of Normal Training

Run ResNet-50 baseline training (100 epochs):

python original_train.py -a resnet50 --save_dir ./snapshots/resnet50-baseline  /path/to/Imagenet2012 --workers 36

Inference the pruned model with zeros

sh scripts/inference_resnet.sh

Inference the pruned model without zeros

sh scripts/infer_pruned.sh

The pruned model without zeros can be downloaded from the Release page.
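Removing the zeroed filters yields a physically smaller model. A sketch of identifying which filters survive in a soft-pruned conv weight (a hypothetical helper, not the script's actual code):

```python
import torch

def surviving_filters(weight: torch.Tensor):
    # indices of the output filters whose weights are not all zero;
    # these are the filters kept in the compact ("without zeros") model
    norms = weight.view(weight.shape[0], -1).abs().sum(dim=1)
    return torch.nonzero(norms).flatten().tolist()
```

The compact model is then built by copying only these filters (and the matching input channels of the next layer) into smaller Conv2d modules.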

Scripts to reproduce the results in our paper

To train the ImageNet model with / without pruning, see the directory scripts (we use 8 GPUs for training).

Training Cifar-10

sh scripts/cifar10_resnet.sh

Please note that the hyper-parameter layer_end differs between ResNets of different depths.

Notes

Torchvision Version

We use torchvision 0.3.0. If your torchvision version is 0.2.0, then transforms.RandomResizedCrop should be replaced with transforms.RandomSizedCrop and transforms.Resize with transforms.Scale.

Why we train for 100 epochs

Training for 100 epochs (rather than the common 90) improves accuracy slightly.

Process of ImageNet dataset

We follow the Facebook pre-processing of ImageNet. Two subfolders ("train" and "val") are expected under "/path/to/ImageNet2012". The corresponding code is here.
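The expected layout is the standard one for torchvision's ImageFolder, with one subfolder per class (the synset names below are illustrative):

```
/path/to/ImageNet2012/
├── train/
│   ├── n01440764/
│   │   └── *.JPEG
│   └── ...
└── val/
    ├── n01440764/
    │   └── *.JPEG
    └── ...
```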

FLOPs Calculation

Refer to the file.
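For reference, the FLOPs of a single Conv2d layer are commonly counted as one multiply-accumulate per kernel element per output position. A sketch of this count (the helper name is ours):

```python
def conv2d_flops(c_in: int, c_out: int, k: int, h_out: int, w_out: int) -> int:
    # multiply-accumulates for one k x k Conv2d layer, ignoring bias
    return c_in * c_out * k * k * h_out * w_out

# e.g. the first 7x7 conv of ResNet on a 224x224 input (112x112 output):
# conv2d_flops(3, 64, 7, 112, 112) == 118013952  (~0.118 G multiply-adds)
```

Pruning a fraction of filters in one layer reduces both that layer's c_out and the next layer's c_in, which is why the FLOPs reduction compounds across consecutive layers.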

Citation

@inproceedings{he2018soft,
  title     = {Soft Filter Pruning for Accelerating Deep Convolutional Neural Networks},
  author    = {He, Yang and Kang, Guoliang and Dong, Xuanyi and Fu, Yanwei and Yang, Yi},
  booktitle = {International Joint Conference on Artificial Intelligence (IJCAI)},
  pages     = {2234--2240},
  year      = {2018}
}