
Allow trainer to work with multiple learning rates #2641

Merged: 7 commits merged into master from multiple_lr_trainer on Feb 25, 2022

Conversation

plonerma
Collaborator

This PR enables the train function of ModelTrainer to work with multiple learning rates.

Previously, the train function assumed a single learning rate, and when an LR scheduler (such as OneCycleLR) was used, group-specific learning rates were overwritten.
With this PR, you can pass an optimizer instance with multiple parameter groups, each with its own learning rate, and still use scheduling without the group-specific learning rates being overwritten.

See the PyTorch documentation on per-parameter options for details on how to initialize an optimizer with multiple learning rates.
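
As a rough sketch of the intended usage (the corpus, embedding model, learning-rate values, output path, and the parameter-name prefix used to split the groups are all illustrative assumptions, and the exact flair API may differ by version):

```python
import torch
from flair.datasets import UD_ENGLISH
from flair.embeddings import TransformerWordEmbeddings
from flair.models import SequenceTagger
from flair.trainers import ModelTrainer

# Small corpus and tagger purely for illustration.
corpus = UD_ENGLISH().downsample(0.1)
label_dict = corpus.make_label_dictionary(label_type="upos")

tagger = SequenceTagger(
    hidden_size=128,
    embeddings=TransformerWordEmbeddings("distilbert-base-uncased", fine_tune=True),
    tag_dictionary=label_dict,
    tag_type="upos",
)

# Split the model's parameters into two groups with different learning rates
# (see the PyTorch per-parameter options documentation): a small LR for the
# transformer embeddings and a larger LR for the remaining task-specific layers.
embedding_params = [p for n, p in tagger.named_parameters() if n.startswith("embeddings")]
other_params = [p for n, p in tagger.named_parameters() if not n.startswith("embeddings")]

optimizer = torch.optim.AdamW(
    [
        {"params": embedding_params, "lr": 1e-5},
        {"params": other_params, "lr": 1e-3},
    ]
)

# With this PR, the pre-built optimizer instance can be passed to train(), and the
# group-specific learning rates are preserved when a scheduler such as OneCycleLR is used.
trainer = ModelTrainer(tagger, corpus)
trainer.train("resources/taggers/multi_lr_example", optimizer=optimizer, max_epochs=3)
```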

@alanakbik
Collaborator

@plonerma thanks for adding this!

@alanakbik alanakbik merged commit 0490121 into master Feb 25, 2022
@alanakbik alanakbik deleted the multiple_lr_trainer branch February 25, 2022 14:06