
PyTorch_CosineAnnealingWithRestartsLR (WIP)

Sets the learning rate of each parameter group using a cosine annealing schedule with warm restarts.

This repository contains the core implementation of SGDR: Stochastic Gradient Descent with Warm Restarts. Please refer to the original paper (https://arxiv.org/pdf/1608.03983.pdf) for more details.
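The schedule from the paper can be sketched in plain Python: within each cycle the learning rate follows a half-cosine from `eta_max` down to `eta_min`, and at the end of a cycle it restarts at `eta_max`, with each cycle `T_mult` times longer than the last. The parameter names and default values below are illustrative assumptions, not taken from this repository's code:

```python
import math

def cosine_annealing_with_restarts(epoch, eta_max=0.1, eta_min=0.0,
                                   T_0=10, T_mult=2):
    """Learning rate at `epoch` under SGDR (Loshchilov & Hutter, 2016).

    This is a minimal sketch; eta_max, eta_min, T_0 (length of the first
    cycle) and T_mult (cycle growth factor) are assumed hyperparameters.
    """
    # Locate the current cycle: cycle i has length T_0 * T_mult**i,
    # so subtract completed cycles until `t` falls inside one.
    T_i, t = T_0, epoch
    while t >= T_i:
        t -= T_i
        T_i *= T_mult
    # Cosine annealing within the current cycle (Eq. 5 in the paper).
    return eta_min + 0.5 * (eta_max - eta_min) * (1 + math.cos(math.pi * t / T_i))
```

For example, with the defaults above the rate starts at 0.1, decays to near 0.0 over the first 10 epochs, then jumps back to 0.1 at epoch 10 and anneals again over the next 20 epochs.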
