nutszebra/squeeze_net

What's this

Implementation of SqueezeNet [1] in Chainer (https://arxiv.org/abs/1602.07360).

Dependencies

Clone the repository and initialize its submodules:
git clone https://github.com/nutszebra/squeeze_net.git
cd squeeze_net
git submodule init
git submodule update

How to run

python main.py -p ./ -e 300 -b 64 -g 0 -s 1 -trb 1 -teb 1 -lr 0.1

Details about my implementation

My SqueezeNet uses simple bypass, and most network parameters are the same as in [1]. However, the implementation differs slightly from the original [1] in the following ways.

  • Fire module
    As reported in [2], the BN-ReLU-Conv ordering works well for residual networks, so each Fire module is composed of three BN-ReLU-Conv layers (see the first sketch after this list).
  • Optimization
    Optimization and hyperparameters are the same as in [3].
  • Data augmentation
    Train: each picture is resized so that its side length is randomly drawn from [124, 132], then a 122x122 patch is extracted at a random position and normalized locally. Horizontal flipping is applied with probability 0.5 (see the second sketch after this list).
    Test: pictures are resized to 128x128 and normalized locally. Accuracy is computed from a single image per sample (no multi-crop averaging).
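
The following is a minimal sketch of such a pre-activation Fire module in Chainer. The class names, channel arguments, and the use of chainer.Chain with init_scope are my own illustration under the description above, not the repository's actual code.

import chainer
import chainer.functions as F
import chainer.links as L


class BNReLUConv(chainer.Chain):
    """BN -> ReLU -> Conv, the pre-activation ordering reported in [2]."""

    def __init__(self, in_ch, out_ch, ksize, pad):
        super(BNReLUConv, self).__init__()
        with self.init_scope():
            self.bn = L.BatchNormalization(in_ch)
            self.conv = L.Convolution2D(in_ch, out_ch, ksize, pad=pad)

    def __call__(self, x):
        return self.conv(F.relu(self.bn(x)))


class Fire(chainer.Chain):
    """Squeeze (1x1) then expand (1x1 and 3x3), with simple bypass."""

    def __init__(self, in_ch, squeeze_ch, expand1_ch, expand3_ch):
        super(Fire, self).__init__()
        with self.init_scope():
            self.squeeze = BNReLUConv(in_ch, squeeze_ch, 1, 0)
            self.expand1 = BNReLUConv(squeeze_ch, expand1_ch, 1, 0)
            self.expand3 = BNReLUConv(squeeze_ch, expand3_ch, 3, 1)

    def __call__(self, x):
        s = self.squeeze(x)
        h = F.concat((self.expand1(s), self.expand3(s)), axis=1)
        # simple bypass: add the input when the channel counts match
        if x.shape[1] == h.shape[1]:
            h = h + x
        return h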
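
And a minimal sketch of the augmentation described above, using numpy and PIL. The function names and the per-image (local) normalization scheme are assumptions, not the repository's exact preprocessing code.

import numpy as np
from PIL import Image


def _normalize_locally(img):
    # per-image (local) normalization: zero mean, unit variance (assumed scheme)
    x = np.asarray(img, dtype=np.float32).transpose(2, 0, 1)  # HWC -> CHW
    return (x - x.mean()) / (x.std() + 1e-8)


def augment_train(pil_img, rng=np.random):
    # resize to a random side length in [124, 132]
    size = rng.randint(124, 133)
    img = pil_img.resize((size, size), Image.BILINEAR)
    # extract a 122x122 patch at a random position
    x0 = rng.randint(0, size - 122 + 1)
    y0 = rng.randint(0, size - 122 + 1)
    img = img.crop((x0, y0, x0 + 122, y0 + 122))
    # horizontal flip with probability 0.5
    if rng.rand() < 0.5:
        img = img.transpose(Image.FLIP_LEFT_RIGHT)
    return _normalize_locally(img)


def preprocess_test(pil_img):
    # test time: resize to 128x128 and normalize locally (single image per sample)
    return _normalize_locally(pil_img.resize((128, 128), Image.BILINEAR))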

CIFAR-10 result

network                                 total accuracy (%)
AlexNet without data augmentation [4]   82
SqueezeNet [1]                          92.63

Plots of the loss and total accuracy are included in the repository.

References

[1] SqueezeNet: AlexNet-level accuracy with 50x fewer parameters and <0.5MB model size. https://arxiv.org/abs/1602.07360
[2] Identity Mappings in Deep Residual Networks. https://arxiv.org/abs/1603.05027
[3] Densely Connected Convolutional Networks. https://arxiv.org/abs/1608.06993
[4] AlexNet without data augmentation
