Snapshot Ensembles: Train 1, Get M for Free
This repository contains the Torch code for the paper Snapshot Ensembles: Train 1, Get M for Free.
Snapshot Ensembling is a method for obtaining an ensemble of multiple neural networks at no additional training cost. This is achieved by letting a single network converge to several local minima along its optimization path and saving the model parameters at each of them. The repeated rapid convergence is realized using multiple learning rate annealing cycles.
Figure 1: Left: Illustration of SGD optimization with a typical learning rate schedule. The model converges to a minimum at the end of training. Right: Illustration of Snapshot Ensembling optimization. The model undergoes several learning rate annealing cycles, converging to and escaping from multiple local minima. We take a snapshot at each minimum for test time ensembling.
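Within each cycle the learning rate follows a shifted-cosine annealing schedule that starts at the initial rate and decays toward zero, after which it is reset to begin the next cycle. Below is a minimal Lua sketch of that schedule; the function name, the per-epoch granularity, and the example loop are illustrative assumptions, not code taken from this repository's train.lua.

```lua
-- Sketch of the cyclic cosine annealing schedule used by Snapshot Ensembles.
-- alpha0  : initial learning rate (e.g. 0.2)
-- epoch   : current epoch, 1-indexed
-- nEpochs : total training budget B (e.g. 200)
-- nCycles : number of snapshots M (e.g. 5)
local function snapshotLR(alpha0, epoch, nEpochs, nCycles)
   local cycleLen = math.ceil(nEpochs / nCycles)   -- epochs per annealing cycle
   local t = (epoch - 1) % cycleLen                -- position within the current cycle
   return alpha0 / 2 * (math.cos(math.pi * t / cycleLen) + 1)
end

-- Example: inspect the schedule for B = 200 epochs, M = 5 cycles, alpha0 = 0.2
-- (the same settings as the training command below)
for _, epoch in ipairs({1, 20, 40, 41, 200}) do
   print(string.format('epoch %3d  lr %.4f', epoch, snapshotLR(0.2, epoch, 200, 5)))
end
```

At the end of each cycle the learning rate is close to zero, which is when a snapshot of the model parameters is saved for test-time ensembling.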
- Install Torch ResNet (https://github.com/facebook/fb.resnet.torch);
- Clone the files of this repository into the `fb.resnet.torch/` directory. Note that you need to replace `train.lua` with the one from this repository;
- An example command to train a Snapshot Ensemble with ResNet-110 (B = 200 epochs, M = 5 cycles, initial learning rate alpha = 0.2) on CIFAR-100:
th main.lua -netType resnet -depth 110 -dataset cifar100 -batchSize 64 -nEpochs 200 -lrShape cosine -nCycles 5 -LR 0.2 -save checkpoints/
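At test time the saved snapshots are combined by averaging their softmax outputs. The following is a minimal Torch sketch of that averaging; the checkpoint filenames (snapshot_%d.t7) and the ensemblePredict helper are assumptions for illustration and do not correspond to specific files produced by the command above.

```lua
-- Sketch of test-time snapshot ensembling: average the softmax outputs
-- of the M saved snapshot models. Paths and naming are assumptions.
require 'torch'
require 'nn'

local nCycles = 5
local snapshots = {}
for m = 1, nCycles do
   -- assumed naming convention; adapt to however the checkpoints were saved
   snapshots[m] = torch.load(string.format('checkpoints/snapshot_%d.t7', m))
   snapshots[m]:evaluate()
end

local softmax = nn.SoftMax()

-- Average class probabilities over all snapshots for one input batch
local function ensemblePredict(input)
   local avg
   for m = 1, nCycles do
      local prob = softmax:forward(snapshots[m]:forward(input)):clone()
      avg = avg and avg:add(prob) or prob
   end
   return avg:div(nCycles)
end
```

Averaging all M snapshots is the simplest variant; the paper also evaluates ensembling only the last few snapshots.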
Contact: [gh349, yl2363] at cornell.edu. Any discussions, suggestions, and questions are welcome!