
CondenseNet TensorFlow

TensorFlow implementation of CondenseNet: An Efficient DenseNet using Learned Group Convolutions. The code is tested on CIFAR-10; the inference phase is not implemented yet.

Model architecture

The official PyTorch implementation by @ShichenLiu is available here.




  • Go to the data/ folder and run python2 --data-dir=./cifar-10-data. This script is borrowed directly from the official TensorFlow repo and has to be run with Python 2.7+.


Use default parameters:


Check out tunable hyper-parameters:

python --help

Other parameters, including stages, groups, condense factor, and growth rate, are defined in
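As a rough sketch of how such hyper-parameters could be exposed, here is a minimal argparse setup. The flag names and defaults below are assumptions based on the paper's CIFAR configuration (stages 14-14-14, growth rates 8-16-32, 4 groups, condense factor 4), not necessarily this repo's actual flags:

```python
import argparse

def build_parser():
    # Hypothetical flag names -- run the repo's own --help for the real ones.
    parser = argparse.ArgumentParser(description='CondenseNet on CIFAR-10 (sketch)')
    parser.add_argument('--data-dir', default='./cifar-10-data',
                        help='directory holding the CIFAR-10 data')
    parser.add_argument('--stages', type=int, nargs='+', default=[14, 14, 14],
                        help='number of dense layers in each stage')
    parser.add_argument('--growth', type=int, nargs='+', default=[8, 16, 32],
                        help='growth rate of each stage')
    parser.add_argument('--groups', type=int, default=4,
                        help='number of groups in the learned group convolutions')
    parser.add_argument('--condense-factor', type=int, default=4,
                        help='condense factor C: each group keeps 1/C of its input channels')
    return parser

args = build_parser().parse_args([])  # empty argv -> use the defaults
print(args.stages, args.growth, args.groups, args.condense_factor)
```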


  • Training for 300 epochs with the default settings reaches a test accuracy of 93.389% (the paper reports 94.94%). There may be details I missed; feel free to point them out.
  • All default parameter settings follow the paper and the official PyTorch implementation.
  • The current implementations of standard group convolution and learned group convolution are very inefficient (a series of reshape, transpose, and concat ops); help building a more efficient graph is welcome.
  • The evaluation phase (index select) has not been implemented yet; help here is welcome as well :D
  • Issues are welcome!
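To illustrate the split/transform/concat pattern the notes above call inefficient: a standard group convolution restricts each group of output channels to its own slice of input channels. A minimal NumPy sketch using 1x1 convolutions (which reduce to per-group matrix multiplies); the shapes and names are illustrative, not this repo's code:

```python
import numpy as np

def group_conv_1x1(x, weights, groups):
    """Emulate a 1x1 group convolution: split channels -> per-group matmul -> concat.

    x:       (N, H, W, C_in) input feature map
    weights: list of `groups` arrays, each (C_in // groups, C_out // groups)
    """
    c_in = x.shape[-1]
    assert c_in % groups == 0
    outs = []
    for g, w_g in zip(np.split(x, groups, axis=-1), weights):
        # Each group only sees its own contiguous slice of input channels.
        outs.append(g @ w_g)
    # Stitch the group outputs back together along the channel axis.
    return np.concatenate(outs, axis=-1)

# Tiny example: 8 input channels, 4 output channels, 2 groups.
x = np.random.randn(1, 4, 4, 8)
weights = [np.random.randn(4, 2) for _ in range(2)]
y = group_conv_1x1(x, weights, groups=2)
print(y.shape)  # (1, 4, 4, 4)
```

In a real graph these split/concat ops (plus transposes for channel shuffling between layers) add memory traffic, which is why a fused grouped-convolution kernel would be preferable.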
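The missing evaluation phase boils down to an index select: after condensation, each group keeps only the input channels it learned to use, so inference can gather those channels and apply a small dense transform per group. A hedged NumPy sketch of this idea (the index layout is an assumption based on the paper, not this repo's data format):

```python
import numpy as np

def index_select_then_conv(x, kept_indices, weights):
    """Inference-time learned group convolution (sketch, 1x1 case).

    x:            (N, H, W, C_in) feature map
    kept_indices: list of index arrays, one per group, naming the input
                  channels that survived condensation
    weights:      list of (len(kept_indices[g]), C_out_g) matrices
    """
    outs = []
    for idx, w_g in zip(kept_indices, weights):
        selected = np.take(x, idx, axis=-1)  # gather the surviving channels
        outs.append(selected @ w_g)
    return np.concatenate(outs, axis=-1)

# Tiny example: 4 input channels, 2 groups, each keeping 2 channels.
x = np.array([[[[0., 1., 2., 3.]]]])            # shape (1, 1, 1, 4)
kept = [np.array([0, 2]), np.array([1, 3])]      # learned channel indices
weights = [np.ones((2, 1)), np.ones((2, 1))]
y = index_select_then_conv(x, kept, weights)
print(y)  # [[[[2. 4.]]]]
```

In TensorFlow the gather step would map to tf.gather on the channel axis; the point is that inference never touches the pruned weights at all.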

