[ECCV18] Constraint-Aware Deep Neural Network Compression

Given a real-time operational constraint, this library automatically compresses a network to satisfy the constraint while preserving accuracy. It is built on top of SkimCaffe, which implements direct sparse convolution operations and delivers an effective speedup given a sparse input network. The framework can be applied to different constraint types (latency, memory size), different networks (AlexNet, ResNet, GoogLeNet), and different datasets (ImageNet, DTD).
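As a purely illustrative sketch (not this library's API), the core idea can be hinted at in a few lines: sparsify the model only as much as needed to meet a resource budget, after which a real pipeline would fine-tune to recover accuracy. The function `prune_to_constraint` below is a hypothetical stand-in that enforces a mock memory constraint via global magnitude pruning:

```python
# Toy sketch of constraint-aware compression (hypothetical, not the
# repository's actual implementation): prune the smallest-magnitude
# weights until a mock memory budget (a nonzero count) is satisfied.

def prune_to_constraint(weights, max_nonzeros):
    """Zero out the smallest-magnitude entries of a weight matrix
    (given as a list of rows) until at most `max_nonzeros` survive,
    standing in for a memory-size constraint."""
    magnitudes = sorted((abs(w) for row in weights for w in row), reverse=True)
    nonzeros = sum(1 for m in magnitudes if m != 0)
    if nonzeros <= max_nonzeros:
        return weights  # constraint already satisfied, nothing to prune
    threshold = magnitudes[max_nonzeros]  # (max_nonzeros+1)-th largest magnitude
    return [[w if abs(w) > threshold else 0.0 for w in row] for row in weights]

layer = [[0.9, -0.1, 0.5], [0.05, -0.7, 0.2]]
pruned = prune_to_constraint(layer, max_nonzeros=3)
print(pruned)  # only 0.9, -0.7 and 0.5 survive
```

In the actual framework the budget is not a simple nonzero count: the constraint (latency or memory size) is evaluated on the compressed model, which is why it builds on SkimCaffe's sparse kernels.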

For more technical details, please refer to the paper.

Framework Overview


  1. First follow the build instructions in SkimCaffe to build SkimCaffe.
  2. Modify the cfp.config file to suit your own needs (constraint values).
  3. Run training in the root directory:
  4. To visualize the training result, specify the output directory:
python2 pruning/ [OUTPUT_DIR]


Figure: Visualization of sampled data points in the first exponential cooling step
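The figure refers to an exponential cooling step. As a hedged illustration (the initial temperature, decay factor, and step count below are illustrative, not values from the paper or cfp.config), an exponential cooling schedule simply decays a temperature geometrically:

```python
# Illustrative exponential cooling schedule (values are examples only):
# each cooling step multiplies the current temperature by a fixed
# decay factor alpha in (0, 1), i.e. T_k = t0 * alpha**k.

def cooling_schedule(t0, alpha, steps):
    """Return temperatures T_k = t0 * alpha**k for k = 0..steps-1."""
    return [t0 * alpha ** k for k in range(steps)]

print(cooling_schedule(1.0, 0.5, 4))  # [1.0, 0.5, 0.25, 0.125]
```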



If you use this code or ideas from the paper in your research, please cite our paper:

@inproceedings{chen2018constraint,
  author    = {Chen, Changan and Tung, Frederick and Vedula, Naveen and Mori, Greg},
  title     = {Constraint-Aware Deep Neural Network Compression},
  booktitle = {The European Conference on Computer Vision (ECCV)},
  month     = {September},
  year      = {2018}
}