optim.lr_scheduler.CyclicLR (master only: not released) is buggy when not using momentum #19003
Comments
Let me know if you'd like me to work on this 👨‍💻
@Youssefares sure thing, go for it. Thank you!
This bug is still present in the official PyTorch 1.1.0, released Apr 30, 2019. It's a serious bug: CyclicLR is a headline feature of the new PyTorch, and this bug causes CyclicLR to fail on any optimizer that doesn't support momentum. It's also a fairly easy fix: just indent two lines of code.
When will this bug be fixed? |
This bug is still present. Is anyone working to fix this ? If not, maybe I can work on it. |
@soumith and @Youssefares |
@memahesh |
I think you are correct: if you indent the second- and third-to-last lines in the `__init__`, it works.
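The indent fix described above can be sketched with a simplified toy class. This mirrors the shape of CyclicLR's `__init__` but is not the upstream implementation; `ToyCyclicLR` and `FakeAdam` are illustrative names:

```python
class ToyCyclicLR:
    """Toy sketch of the guarded momentum handling in CyclicLR.__init__."""
    def __init__(self, optimizer, cycle_momentum=True, base_momentum=0.8):
        self.cycle_momentum = cycle_momentum
        if cycle_momentum:
            if 'momentum' not in optimizer.defaults:
                raise ValueError(
                    'optimizer must support momentum when cycle_momentum=True')
            for group in optimizer.param_groups:
                group['momentum'] = base_momentum
            # Before the fix, the next line was un-indented and ran
            # unconditionally, raising KeyError: 'momentum' for
            # momentum-less optimizers such as Adam.
            self.base_momentums = [g['momentum'] for g in optimizer.param_groups]


class FakeAdam:
    """Stand-in for an optimizer without a 'momentum' hyperparameter."""
    def __init__(self):
        self.defaults = {'lr': 1e-3}
        self.param_groups = [{'lr': 1e-3}]


# With the guard in place, cycle_momentum=False never touches 'momentum':
sched = ToyCyclicLR(FakeAdam(), cycle_momentum=False)  # no KeyError
```

With the two momentum lines inside the `if cycle_momentum:` branch, optimizers without a momentum key are never queried for one.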
Fixed in #20401. |
Hmm. Is anyone getting this error in |
I use PyTorch 1.4, but I still get the same error when using Adam without momentum. What I do now instead is use SGD with momentum ...
Oh, okay. So basically there are some compatibility issues with the |
Exactly. At least it seems so. I hope this will be fixed at some point. :-) |
The bug exists in pytorch 1.6 as well. |
The bug exists in PyTorch 1.7.0 😞 |
Does anyone know why this issue is closed even though the bug exists in the latest stable release (1.7.0)? Or is there a solution for this other than setting
Any updates for this? |
Is someone currently working on this? (This issue is not solved.)
The bug exists in PyTorch 1.9.0 |
@gchanan Please reopen this issue. It is not fixed. |
Does someone know, or can explain, how to fix it? I can try to make a PR.
The bug exists in PyTorch 2.0.0 |
Issue description
If I use an optimizer like Adam with no momentum and follow the message of line 578 below, passing `cycle_momentum=False`, line 584 throws a KeyError because the key `'momentum'` is not set.

pytorch/torch/optim/lr_scheduler.py, lines 576 to 585 at 173f224
Code example
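The original code snippet was not preserved here; a minimal reproduction along the lines of the description might look like the following (the `try`/`except` makes the outcome explicit, since fixed releases construct the scheduler cleanly while affected versions raise `KeyError: 'momentum'`):

```python
import torch
from torch import optim

model = torch.nn.Linear(4, 2)
opt = optim.Adam(model.parameters())  # Adam exposes betas, not a 'momentum' key

try:
    # cycle_momentum=False should skip all momentum handling, but the
    # buggy __init__ still read group['momentum'] unconditionally.
    sched = optim.lr_scheduler.CyclicLR(
        opt, base_lr=1e-4, max_lr=1e-2, cycle_momentum=False)
    sched.step()
    outcome = "ok"        # fixed versions reach this point
except KeyError:
    outcome = "KeyError"  # affected versions fail on group['momentum']

print(outcome)
```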
System Info
In my setup, I am using the stable version of PyTorch, but I copied CyclicLR and its related imports over from master and am using it in my project.