
Striving for Simplicity: The All Convolutional Net (All-CNN-C)


chituma110/ALL-CNN


ALL-CNN

This is a Caffe implementation of the All-CNN-C model for CIFAR-10 from the paper Striving for Simplicity: The All Convolutional Net by Jost Tobias Springenberg, Alexey Dosovitskiy, Thomas Brox, and Martin Riedmiller, accepted as a workshop contribution at ICLR 2015.

(Figure: All-CNN-C network architecture.)

The best saved model, from iteration 52000, achieves 90.25% accuracy on the test set. It was trained with Caffe at commit 5a201dd960840c319cefd9fa9e2a40d2c76ddd73.

Dataset

Training used globally contrast-normalized and ZCA-whitened CIFAR-10 without any data augmentation. Both the training and test sets were created with the Pylearn2 library using the make_cifar10_gcn_whitened.py script, which outputs train.pkl and test.pkl files. These were then converted to LMDB databases with random, unique keys (see Creating an LMDB database in Python).
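The two preprocessing steps above can be sketched in NumPy. This is a minimal illustration, not the Pylearn2 implementation; the function names are mine, and the scale of 55 and the epsilon values are assumptions (55 is the value commonly used in Pylearn2's CIFAR-10 recipe).

```python
import numpy as np

def global_contrast_normalize(X, scale=55.0, eps=1e-8):
    """Per-image GCN: subtract each image's mean, then rescale it to a
    fixed L2 norm. X has shape (n_images, n_pixels)."""
    X = X - X.mean(axis=1, keepdims=True)
    norms = np.sqrt((X ** 2).sum(axis=1, keepdims=True))
    return scale * X / np.maximum(norms, eps)

def zca_whiten(X, eps=1e-2):
    """Fit ZCA whitening on X (n_images, n_pixels): decorrelate pixels
    while staying close to the original image space. Returns the
    whitened data plus the mean and transform needed for the test set."""
    mean = X.mean(axis=0)
    Xc = X - mean
    cov = Xc.T @ Xc / Xc.shape[0]
    U, S, _ = np.linalg.svd(cov)
    W = U @ np.diag(1.0 / np.sqrt(S + eps)) @ U.T
    return Xc @ W, mean, W
```

The test set must be transformed with the mean and whitening matrix fitted on the training set, never refitted on test data.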

Here are the links to download preprocessed and ready for training/testing CIFAR-10 LMDB databases:

Training

caffe train -solver ALL_CNN_C_solver.prototxt -gpu all |& tee ALL_CNN_C.log
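The contents of ALL_CNN_C_solver.prototxt are not reproduced here. As a rough sketch, a Caffe solver for this setup might look like the following; the field names are standard Caffe solver options, but every value is illustrative (loosely following the paper's SGD setup with momentum 0.9 and weight decay 0.001), not necessarily what the repo uses.

```
net: "ALL_CNN_C_train_val.prototxt"
test_iter: 100          # assumes a test batch size of 100 over the 10k test images
test_interval: 1000
base_lr: 0.05           # illustrative; the paper searched over {0.25, 0.1, 0.05, 0.01}
lr_policy: "multistep"  # drop the learning rate at fixed iterations
gamma: 0.1
stepvalue: 35000        # illustrative schedule
stepvalue: 45000
stepvalue: 52500
momentum: 0.9
weight_decay: 0.001
max_iter: 60000
snapshot: 4000
snapshot_prefix: "ALL_CNN_C"
solver_mode: GPU
```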

I found the training unstable; you may need to run it more than once to reach a sub-10% error rate.

On an NVIDIA TITAN X GPU, training took about 2.5 hours.

(Figures: training loss and test accuracy curves over the course of training.)
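The loss and accuracy curves were plotted from the training log. The test-set accuracy values that Caffe prints (lines of the form "Test net output #0: accuracy = ...") can be pulled out with standard tools; the log file name here assumes the tee command shown above.

```shell
# List every test accuracy Caffe logged, one value per line
grep -o 'accuracy = [0-9.]*' ALL_CNN_C.log | awk '{print $3}'
```

The same pattern with "loss = " extracts the loss values for plotting.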

Testing

caffe test -model ALL_CNN_C_train_val.prototxt -weights ALL_CNN_C_iter_52000.caffemodel

