Skip scheduler.step() if optimizer.step() isn't called in the iteration #9923
scheduler.step() called in the wrong order with precision=16
When the warning is raised

This issue only happens when configure_optimizers returns a scheduler with a step-based interval:

    def configure_optimizers(self):
        optimizer = ...
        scheduler = {
            "scheduler": ...,
            "interval": "step",
            "frequency": 1,  # another small number may also cause this issue.
        }
        return {"optimizer": optimizer, "lr_scheduler": scheduler}

Cause of the warning

EDIT (2021-10-28): native amp skips optimizer.step() when it detects inf/NaN gradients (pytorch/pytorch#44511).
What does this PR do?
Fixes #5558
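As a rough illustration of the approach in the title, the sketch below records whether GradScaler.step() actually ran optimizer.step() and only then steps the scheduler. It is a hypothetical standalone helper, not the actual Lightning implementation; the class and method names are made up.

```python
import torch


class SchedulerStepGuard:
    """Hypothetical helper: remember whether the last GradScaler.step() really
    ran optimizer.step(), and skip scheduler.step() for that iteration otherwise."""

    def __init__(self, scaler: torch.cuda.amp.GradScaler) -> None:
        self.scaler = scaler
        self.optimizer_stepped = False

    def optimizer_step(self, optimizer: torch.optim.Optimizer) -> None:
        scale_before = self.scaler.get_scale()
        self.scaler.step(optimizer)  # may be skipped on inf/NaN gradients
        self.scaler.update()
        # A reduced scale after update() indicates the optimizer step was skipped.
        self.optimizer_stepped = self.scaler.get_scale() >= scale_before

    def scheduler_step(self, scheduler) -> None:
        if self.optimizer_stepped:
            scheduler.step()
        # else: skip this iteration so scheduler.step() never precedes optimizer.step()
```

In a training loop, one would call optimizer_step() every iteration and scheduler_step() wherever a bare scheduler.step() was used before.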
Does your PR introduce any breaking changes? If yes, please list them.
Before submitting
PR review
Anyone in the community is welcome to review the PR.
Before you start reviewing, make sure you have read the review guidelines.
Did you have fun?
Make sure you had fun coding 🙃
cc @carmocca @justusschock @awaelchli @akihironitta