Update base learning rate scheduler #3167

Merged 2 commits into dev on Jan 8, 2023

Conversation

@eb8680 (Member) commented on Dec 20, 2022

Resolves #3166

PyTorch recently renamed the base class that pyro.optim uses to identify and wrap the learning rate schedulers in torch.optim from _LRScheduler to LRScheduler, so the wrappers silently failed to be created.

This PR adds a backwards-compatible check to the wrapping logic to handle newer PyTorch releases.
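
A minimal sketch of the kind of backwards-compatible check involved, assuming only that newer PyTorch releases expose the public name LRScheduler while older releases provide only the private _LRScheduler; the helper name _is_lr_scheduler below is hypothetical and illustrative, not the exact diff from this PR:

```python
import torch.optim.lr_scheduler as lr_scheduler

# Prefer the public base class exposed by newer PyTorch releases,
# falling back to the legacy private name on older releases.
if hasattr(lr_scheduler, "LRScheduler"):
    _LRSchedulerBase = lr_scheduler.LRScheduler
else:
    _LRSchedulerBase = lr_scheduler._LRScheduler


def _is_lr_scheduler(obj) -> bool:
    """Return True if obj is a learning rate scheduler class.

    Hypothetical helper illustrating the wrapping logic's check.
    """
    return isinstance(obj, type) and issubclass(obj, _LRSchedulerBase)
```

Dispatching on hasattr rather than pinning a PyTorch version keeps the same wrapping logic working on both old and new releases.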

@fritzo (Member) previously approved these changes on Dec 20, 2022 and left a comment:

Thanks for fixing this!

@fritzo (Member) commented on Dec 20, 2022

Could you run make format and try again? It looks like black wants to simplify the code now that one line is shorter.

fritzo merged commit 19e32df into dev on Jan 8, 2023
@simonangerbauer commented on Mar 15, 2023

Hi @fritzo,
any idea when this will be released?
PyTorch 2.0.0 has been released, so this change is now necessary to keep the LRScheduler wrappers compatible.

Development

Successfully merging this pull request may close these issues.

bug with OneCycleLR on Apple Silicon
3 participants