Train CIFAR10 with PyTorch

I'm playing with PyTorch on the CIFAR10 dataset.

Pros & cons

Pros:

  • Built-in data loading and augmentation, very nice! (See the data-loading sketch after this list.)
  • Training is fast, maybe even a little bit faster.
  • Very memory efficient!

Cons:

  • No progress bar, sad :(
  • No built-in log.
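
As an illustration of the built-in loading and augmentation, here is a minimal sketch of a CIFAR10 pipeline with torchvision. The batch sizes and normalization statistics are typical choices and are not guaranteed to match what main.py uses exactly.

```python
# Sketch of a CIFAR10 pipeline using torchvision's built-in dataset
# and augmentation transforms (values are common defaults).
import torch
import torchvision
import torchvision.transforms as transforms

# Standard CIFAR10 augmentation: pad-and-crop plus horizontal flip.
transform_train = transforms.Compose([
    transforms.RandomCrop(32, padding=4),
    transforms.RandomHorizontalFlip(),
    transforms.ToTensor(),
    # Per-channel mean/std commonly quoted for CIFAR10.
    transforms.Normalize((0.4914, 0.4822, 0.4465),
                         (0.2023, 0.1994, 0.2010)),
])

transform_test = transforms.Compose([
    transforms.ToTensor(),
    transforms.Normalize((0.4914, 0.4822, 0.4465),
                         (0.2023, 0.1994, 0.2010)),
])

trainset = torchvision.datasets.CIFAR10(
    root='./data', train=True, download=True, transform=transform_train)
trainloader = torch.utils.data.DataLoader(
    trainset, batch_size=128, shuffle=True, num_workers=2)

testset = torchvision.datasets.CIFAR10(
    root='./data', train=False, download=True, transform=transform_test)
testloader = torch.utils.data.DataLoader(
    testset, batch_size=100, shuffle=False, num_workers=2)
```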

Accuracy

| Model            | Acc.   |
| ---------------- | ------ |
| VGG16            | 92.64% |
| ResNet18         | 93.02% |
| ResNet50         | 93.62% |
| ResNet101        | 93.75% |
| MobileNetV2      | 94.43% |
| ResNeXt29(32x4d) | 94.73% |
| ResNeXt29(2x64d) | 94.82% |
| DenseNet121      | 95.04% |
| PreActResNet18   | 95.11% |
| DPN92            | 95.16% |

Learning rate adjustment

I manually change the learning rate during training (an equivalent scheduler sketch follows the list):

  • 0.1 for epoch [0,150)
  • 0.01 for epoch [150,250)
  • 0.001 for epoch [250,350)
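
For reference, the same step schedule can also be written with PyTorch's MultiStepLR scheduler. This is only a sketch of an equivalent schedule, not what main.py does (the repo restarts training with a new --lr instead); the model and optimizer hyperparameters below are placeholders.

```python
# Equivalent step schedule expressed with torch.optim.lr_scheduler.MultiStepLR.
# Sketch only: the repo itself restarts training with a different --lr value.
import torch.nn as nn
import torch.optim as optim

net = nn.Linear(3 * 32 * 32, 10)  # stand-in model for illustration
optimizer = optim.SGD(net.parameters(), lr=0.1,
                      momentum=0.9, weight_decay=5e-4)  # common choices

# lr = 0.1 for epochs [0,150), 0.01 for [150,250), 0.001 for [250,350)
scheduler = optim.lr_scheduler.MultiStepLR(
    optimizer, milestones=[150, 250], gamma=0.1)

for epoch in range(350):
    # train_one_epoch(net, optimizer, ...)  # training loop omitted
    scheduler.step()
```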

Resume training with `python main.py --resume --lr=0.01`.
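
Resuming usually just means reloading a saved checkpoint before training continues. The snippet below is a rough sketch of that pattern; the checkpoint path and the dictionary keys ('net', 'acc', 'epoch') are assumptions for illustration and may not match main.py exactly.

```python
# Rough sketch of a save/resume pattern: persist model weights plus the
# best accuracy and epoch counters, then reload them when --resume is set.
# The path and dict keys here are assumed for illustration.
import torch

def save_checkpoint(net, acc, epoch, path='./checkpoint/ckpt.pth'):
    torch.save({'net': net.state_dict(), 'acc': acc, 'epoch': epoch}, path)

def load_checkpoint(net, path='./checkpoint/ckpt.pth'):
    checkpoint = torch.load(path)
    net.load_state_dict(checkpoint['net'])
    return checkpoint['acc'], checkpoint['epoch']
```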