Gradually-Warmup Learning Rate Scheduler for PyTorch


Gradually warms up (increases) the learning rate for a PyTorch optimizer, as proposed in 'Accurate, Large Minibatch SGD: Training ImageNet in 1 Hour'.

[Figure: example TensorBoard learning-rate curves]

Example: gradually warm up the learning rate over the first epochs of training, then switch to cosine annealing for the rest.


$ pip install git+


import torch
from warmup_scheduler import GradualWarmupScheduler

# optimizer, max_epoch, and train_epoch are assumed to be defined elsewhere
scheduler_cosine = torch.optim.lr_scheduler.CosineAnnealingLR(optimizer, T_max=max_epoch)
scheduler_warmup = GradualWarmupScheduler(optimizer, multiplier=8, total_epoch=10, after_scheduler=scheduler_cosine)

for epoch in range(train_epoch):
    scheduler_warmup.step()  # warm up for the first 10 epochs, then defer to after_scheduler
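
During warmup the learning rate ramps linearly from the optimizer's base rate up to `base_lr * multiplier` over `total_epoch` epochs. The helper below is a hypothetical sketch of that ramp for illustration, not the library's implementation:

```python
def warmup_lr(base_lr, multiplier, total_epoch, epoch):
    """Linearly scale the LR from base_lr to base_lr * multiplier
    over total_epoch epochs; clamp after warmup ends."""
    progress = min(epoch, total_epoch) / total_epoch
    return base_lr * ((multiplier - 1.0) * progress + 1.0)

# With base_lr=0.1, multiplier=8, total_epoch=10:
# epoch 0  -> 0.1   (base rate)
# epoch 10 -> 0.8   (base rate * multiplier); after_scheduler takes over from here
```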