Learning Rate Scheduler


This repository contains a collection of custom learning rate schedulers. These scheduler classes can be used to decay the learning rate during training.

Installation

pip install numpy matplotlib

Available Schedulers

  • Linear Decay Scheduler
  • Exponential Decay Scheduler
  • Step Decay Scheduler
  • Polynomial Decay Scheduler

Explanation

Linear Decay Scheduler:

The learning rate is decayed linearly from the initial learning rate (lr_max) to the minimum learning rate (lr_min) over a specified number of epochs. The formula for the learning rate is given below:

lr = lr_min + (lr_max - lr_min) * (1 - epoch / epochs)
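
A minimal sketch of how this might look in code (the class name `LinearDecayScheduler` and the `get_lr` method are illustrative assumptions, not necessarily the repository's actual API):

```python
class LinearDecayScheduler:
    """Decays the learning rate linearly from lr_max to lr_min over `epochs` epochs."""

    def __init__(self, lr_max, lr_min, epochs):
        self.lr_max = lr_max
        self.lr_min = lr_min
        self.epochs = epochs

    def get_lr(self, epoch):
        # The remaining fraction (1 - epoch / epochs) shrinks from 1.0 to 0.0,
        # so the returned value moves from lr_max down to lr_min.
        return self.lr_min + (self.lr_max - self.lr_min) * (1 - epoch / self.epochs)
```

For example, with lr_max=0.1, lr_min=0.01, and epochs=100, get_lr(50) returns 0.055, exactly halfway between the two bounds.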

Exponential Decay Scheduler:

The learning rate is decayed exponentially from the initial learning rate toward the minimum learning rate over a specified number of epochs, at a rate controlled by decay_rate. The formula for the learning rate is given below:

lr = lr_min + (lr_max - lr_min) * (decay_rate) ^ (epoch / epochs)
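
A sketch under the same assumed interface (decay_rate here is a hypothetical constructor argument with a default of 0.1):

```python
class ExponentialDecayScheduler:
    """Decays the learning rate exponentially from lr_max toward lr_min."""

    def __init__(self, lr_max, lr_min, epochs, decay_rate=0.1):
        self.lr_max = lr_max
        self.lr_min = lr_min
        self.epochs = epochs
        self.decay_rate = decay_rate

    def get_lr(self, epoch):
        # decay_rate ** (epoch / epochs) falls from 1.0 at epoch 0
        # to decay_rate at the final epoch.
        return self.lr_min + (self.lr_max - self.lr_min) * self.decay_rate ** (epoch / self.epochs)
```

Note that at the final epoch the learning rate is lr_min + (lr_max - lr_min) * decay_rate, so it approaches lr_min but only reaches it in the limit decay_rate → 0.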

Step Decay Scheduler:

The learning rate is decayed by a factor of gamma every step_size epochs. The formula for the learning rate is given below:

lr = lr_max * gamma ^ floor(epoch / step_size)
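
A sketch of the stepwise version; the floor (integer division in Python) is what makes the schedule piecewise constant rather than a smooth exponential:

```python
class StepDecayScheduler:
    """Multiplies the learning rate by gamma once every step_size epochs."""

    def __init__(self, lr_max, step_size, gamma=0.5):
        self.lr_max = lr_max
        self.step_size = step_size
        self.gamma = gamma

    def get_lr(self, epoch):
        # epoch // step_size counts how many full decay steps have elapsed.
        return self.lr_max * self.gamma ** (epoch // self.step_size)
```

With lr_max=0.1, step_size=10, and gamma=0.5, epochs 0-9 use 0.1, epochs 10-19 use 0.05, epochs 20-29 use 0.025, and so on.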

Polynomial Decay Scheduler:

The learning rate is decayed polynomially from the initial learning rate over a specified number of epochs, with the power parameter controlling the shape of the decay curve. The formula for the learning rate is given below:

lr = lr_max * (1 - epoch / epochs) ^ (power)
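
A sketch along the same lines (power is an assumed constructor argument):

```python
class PolynomialDecayScheduler:
    """Decays the learning rate polynomially from lr_max to zero over `epochs` epochs."""

    def __init__(self, lr_max, epochs, power=2.0):
        self.lr_max = lr_max
        self.epochs = epochs
        self.power = power

    def get_lr(self, epoch):
        # power=1 reproduces linear decay; larger powers decay faster early on.
        return self.lr_max * (1 - epoch / self.epochs) ** self.power
```

Since matplotlib is installed above, any of these sketches can be sanity-checked by plotting its curve:

```python
import matplotlib.pyplot as plt

epochs = 100
sched = PolynomialDecayScheduler(lr_max=0.1, epochs=epochs, power=2.0)
plt.plot(range(epochs), [sched.get_lr(e) for e in range(epochs)])
plt.xlabel("epoch")
plt.ylabel("learning rate")
plt.show()
```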

TODO

  • Add more schedulers
  • Create examples for each scheduler
  • Add documentation

Contributing

Everyone is welcome to contribute to this repository.
