Lookahead PyTorch

This repository contains a PyTorch implementation of the Lookahead Optimizer from the paper

Lookahead Optimizer: k steps forward, 1 step back

by Michael R. Zhang, James Lucas, Geoffrey Hinton and Jimmy Ba.
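
In brief, Lookahead keeps two sets of weights: fast weights, which an inner optimizer (here Adam) updates for k steps, and slow weights, which then take "1 step back" toward the fast weights via slow ← slow + α · (fast − slow), after which the fast weights are reset to the slow weights. The sketch below illustrates this update rule on a toy model; it is illustrative plain-PyTorch code, not this repository's implementation.

import torch
import torch.nn as nn
import torch.optim as optim

# Toy setup purely to illustrate the Lookahead update rule.
model = nn.Linear(10, 2)
inner_optimizer = optim.Adam(model.parameters(), lr=0.001)       # updates the fast weights
k, alpha = 5, 0.5
slow_weights = [p.clone().detach() for p in model.parameters()]  # the slow copy

for step in range(1, 101):
    x, y = torch.randn(32, 10), torch.randint(0, 2, (32,))       # dummy batch
    loss = nn.functional.cross_entropy(model(x), y)
    loss.backward()
    inner_optimizer.step()               # "k steps forward" on the fast weights
    inner_optimizer.zero_grad()
    if step % k == 0:                    # "1 step back" every k steps
        with torch.no_grad():
            for slow, fast in zip(slow_weights, model.parameters()):
                slow.add_(fast - slow, alpha=alpha)   # slow += alpha * (fast - slow)
                fast.copy_(slow)                      # reset fast weights to slow weights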

Dependencies

  • PyTorch
  • torchvision
  • matplotlib
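
All three are available from PyPI under the same names, so a typical setup is:

pip install torch torchvision matplotlib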

Usage

The code in this repository implements training with both Lookahead and plain Adam, with an example on the CIFAR-10 dataset.

To use Lookahead, wrap a base optimizer as follows:

import torch.optim as optim
from optimizer import Lookahead

base_optimizer = optim.Adam(model.parameters(), lr=0.001)             # inner optimizer for the fast weights
optimizer = Lookahead(base_optimizer=base_optimizer, k=5, alpha=0.5)  # sync every k steps with step size alpha
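
Lookahead exposes the standard optimizer interface, so the rest of the training loop is unchanged: call step() and zero_grad() on the wrapper, and the slow weights are synchronized every k steps internally. A minimal training step, assuming model, criterion, and train_loader are defined as usual:

for inputs, targets in train_loader:
    optimizer.zero_grad()
    loss = criterion(model(inputs), targets)
    loss.backward()
    optimizer.step()   # inner Adam step; slow and fast weights sync every k steps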

Example

To reproduce the results below, train ResNet18 on the CIFAR-10 dataset with each optimizer:

# use adam
python run.py --optimizer=adam

# use lookahead 
python run.py --optimizer=lookahead

Results

Training loss of Adam and Lookahead with ResNet18 on CIFAR-10 (plot in png/).

Validation loss of Adam and Lookahead with ResNet18 on CIFAR-10 (plot in png/).

Validation accuracy of Adam and Lookahead with ResNet18 on CIFAR-10 (plot in png/).
