pytorch-gradual-warmup-lr

Gradually warm up (increase) the learning rate of a PyTorch optimizer, as proposed in 'Accurate, Large Minibatch SGD: Training ImageNet in 1 Hour'.

(Example TensorBoard plot of the resulting learning-rate schedule.)

Example: gradual warm-up for 100 epochs, followed by cosine annealing.
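During the warm-up phase the learning rate is increased linearly from the optimizer's initial value up to multiplier times that value over total_epoch epochs, after which the after_scheduler takes over. Below is a minimal sketch of that linear rule; base_lr, the example numbers, and the helper name warmup_lr are illustrative, and the scheduler's exact interpolation may differ slightly.

def warmup_lr(base_lr, multiplier, total_epoch, epoch):
    # Linear interpolation from base_lr (epoch 0) to base_lr * multiplier (epoch total_epoch).
    return base_lr * ((multiplier - 1.0) * min(epoch, total_epoch) / total_epoch + 1.0)

# e.g. base_lr=0.1, multiplier=8, total_epoch=10 -> 0.1 at epoch 0, 0.8 at epoch 10
for epoch in range(11):
    print(epoch, round(warmup_lr(0.1, 8, 10, epoch), 3))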

Install

$ pip install git+https://github.com/ildoonet/pytorch-gradual-warmup-lr.git

Usage

import torch
from warmup_scheduler import GradualWarmupScheduler

# optimizer is any torch.optim optimizer (e.g. SGD over your model's parameters)
scheduler_cosine = torch.optim.lr_scheduler.CosineAnnealingLR(optimizer, max_epoch)
scheduler_warmup = GradualWarmupScheduler(optimizer, multiplier=8, total_epoch=10, after_scheduler=scheduler_cosine)

for epoch in range(train_epoch):
    scheduler_warmup.step()     # 10-epoch warm-up, then follows after_scheduler (cosine annealing)
    ...                         # train for one epoch, calling optimizer.step() inside
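To sanity-check the schedule, the current learning rate can be read back from the optimizer after each step. The snippet below is a self-contained sketch; the dummy parameter, the base learning rate of 0.1, and the 20-epoch loop are arbitrary choices for illustration only.

import torch
from warmup_scheduler import GradualWarmupScheduler

# Dummy parameter/optimizer just to inspect the schedule.
param = torch.nn.Parameter(torch.zeros(1))
optimizer = torch.optim.SGD([param], lr=0.1)

scheduler_cosine = torch.optim.lr_scheduler.CosineAnnealingLR(optimizer, T_max=10)
scheduler_warmup = GradualWarmupScheduler(optimizer, multiplier=8, total_epoch=10, after_scheduler=scheduler_cosine)

for epoch in range(20):
    optimizer.step()                              # normally: run one training epoch here
    scheduler_warmup.step()
    print(epoch, optimizer.param_groups[0]['lr'])  # warm-up for 10 epochs, then cosine decay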