Learning Rate Schedulers update the learning rate over the course of training. The learning rate can be updated after each optimizer update via step_update, or at epoch boundaries via step.
fairseq.optim.lr_scheduler
fairseq.optim.lr_scheduler.FairseqLRScheduler
fairseq.optim.lr_scheduler.cosine_lr_scheduler.CosineSchedule
fairseq.optim.lr_scheduler.fixed_schedule.FixedSchedule
fairseq.optim.lr_scheduler.inverse_square_root_schedule.InverseSquareRootSchedule
fairseq.optim.lr_scheduler.reduce_lr_on_plateau.ReduceLROnPlateau
fairseq.optim.lr_scheduler.triangular_lr_scheduler.TriangularSchedule
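To illustrate the step_update interface, here is a minimal standalone sketch of an inverse-square-root schedule with linear warmup (the kind of schedule InverseSquareRootSchedule implements). This is not fairseq's implementation; the class name, parameter names, and defaults below are illustrative assumptions.

```python
class InverseSquareRootSketch:
    """Sketch: warm up linearly to a peak lr, then decay as 1/sqrt(num_updates).

    Illustrative only -- not fairseq's InverseSquareRootSchedule.
    """

    def __init__(self, lr=5e-4, warmup_updates=4000, warmup_init_lr=1e-7):
        self.warmup_updates = warmup_updates
        self.warmup_init_lr = warmup_init_lr
        # Linear warmup increment applied at each update.
        self.lr_step = (lr - warmup_init_lr) / warmup_updates
        # Decay factor chosen so the curve is continuous at the end of warmup.
        self.decay_factor = lr * warmup_updates ** 0.5
        self.lr = warmup_init_lr

    def step_update(self, num_updates):
        """Update the learning rate after each optimizer update."""
        if num_updates < self.warmup_updates:
            self.lr = self.warmup_init_lr + num_updates * self.lr_step
        else:
            self.lr = self.decay_factor * num_updates ** -0.5
        return self.lr


sched = InverseSquareRootSketch(lr=5e-4, warmup_updates=4000)
print(sched.step_update(0))      # warmup start: warmup_init_lr
print(sched.step_update(4000))   # end of warmup: peak lr (5e-4)
print(sched.step_update(16000))  # decayed: 5e-4 / sqrt(16000/4000) = 2.5e-4
```

Schedulers that adjust the rate per epoch (e.g. fixed or plateau-based schedules) would instead do their work in step, called once at each epoch boundary.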