
PyTorch-Deep-Compression

This is my own implementation and understanding of the paper Deep Compression, developed in June/July 2017 (except Huffman coding =D ); some simple reports were produced in November. The weight-sharing stage still needs to be optimized: it is extremely slow, but it does not require many epochs to converge.

My tasks to solve

Learn PyTorch at the low level (gradient modification) and the high level (large networks). Learn a very efficient network optimization technique (2015). Currently, there is a newer optimization method that I would like to learn =) MorphNet

Usage

All parameters are detailed in main.py; just run:

python main.py

The recommended number of epochs for the pruning stage is 25, while 5 epochs are enough for the sharing stage.
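For reference, the pruning stage follows the paper's magnitude-based scheme: weights below a threshold are zeroed, and their gradients are masked during fine-tuning so that pruned connections stay at zero. Below is a minimal sketch of that idea; the function names and the std-based threshold are illustrative assumptions, not the exact code in main.py.

```python
import torch
import torch.nn as nn

def magnitude_prune(layer: nn.Module, quality: float = 1.0) -> torch.Tensor:
    """Zero out weights whose magnitude is below quality * std(weights).

    Illustrative sketch: `quality` is a hypothetical hyperparameter.
    Returns the binary mask so pruned connections can be kept at zero.
    """
    with torch.no_grad():
        w = layer.weight
        mask = (w.abs() > quality * w.std()).float()
        w.mul_(mask)  # prune: small-magnitude weights become exactly zero
    return mask

def mask_gradients(layer: nn.Module, mask: torch.Tensor) -> None:
    """Call after loss.backward(): zeroes the gradients of pruned weights
    so only the surviving connections are fine-tuned (the
    gradient-modification part)."""
    if layer.weight.grad is not None:
        layer.weight.grad.mul_(mask)
```

In such a scheme, mask_gradients would be called after every backward pass throughout the 25 pruning epochs.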

Updated results

Trained models were based on VGG19 and its custom variants; however, the results presented here are for VGG19 (with batch normalization) trained on CIFAR10. The experiments were re-run to verify that PyTorch 1.0.1 handles the variable and gradient manipulations correctly.

Results

The results reported for VGG19 in this repository show that sharing weights with k clusters keeps accuracy almost unchanged, as the original paper argued. All optimization runs were trained for 25 epochs in total. The number of pruning iterations was set to 25 as well (just for convenience).

Accuracy (%) of VGG19_BN on CIFAR10:

| Network  | Original | Pruned 25 | Shared k=4 | Shared k=9 | Shared k=13 | Shared k=35 |
|----------|----------|-----------|------------|------------|-------------|-------------|
| VGG19_BN | 92.22    | 92.18     | 90.93      | 91.86      | 92.23       | (soon =D)   |
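The Shared k=... columns correspond to the weight-sharing stage, in which the surviving (nonzero) weights of each layer are clustered into k centroids and every weight is replaced by its centroid, as in the paper's quantization step. Here is a minimal sketch of that idea, using scikit-learn's KMeans purely for illustration; the repository may cluster differently, and the per-cluster gradient fine-tuning of the centroids is omitted.

```python
import torch
from sklearn.cluster import KMeans

def share_weights(weight: torch.Tensor, k: int = 13) -> torch.Tensor:
    """Quantize a layer's weights to k shared values via k-means.

    Illustrative sketch only: the actual sharing stage also fine-tunes
    the centroids by accumulating gradients per cluster.
    """
    flat = weight.detach().cpu().numpy().ravel()
    nonzero = flat != 0                       # pruned weights stay at zero
    km = KMeans(n_clusters=k, n_init=10).fit(flat[nonzero].reshape(-1, 1))
    quantized = flat.copy()
    quantized[nonzero] = km.cluster_centers_[km.labels_, 0]
    return torch.from_numpy(quantized).view_as(weight)
```

Running k-means over every layer's weights is a likely reason the sharing stage is so slow, while only a few epochs are needed afterwards because just k values per layer remain to be tuned.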

Figures

Some visual results of the pruned weights are shown below:

Releases

Release 1.2

Usage updated...

Release 1.1

Presentation and Figures were uploaded.

Release 1.0

The code now works with PyTorch 1.0.1. The trained models differ from the original ones reported in the presentation.
