Bug description

When training with native AMP and a LR scheduler, we get a warning indicating that a LR step has been taken even though the optimizer step was skipped (which is expected at the beginning of training with native AMP):

/usr/local/lib/python3.8/dist-packages/torch/optim/lr_scheduler.py:138: UserWarning: Detected call of `lr_scheduler.step()` before `optimizer.step()`. In PyTorch 1.1.0 and later, you should call them in the opposite order: `optimizer.step()` before `lr_scheduler.step()`. Failure to do this will result in PyTorch skipping the first value of the learning rate schedule. See more details at https://pytorch.org/docs/stable/optim.html#how-to-adjust-learning-rate
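To illustrate why the warning fires, here is a minimal torch-free sketch: with native AMP, `GradScaler` skips `optimizer.step()` when it detects inf/NaN gradients (typical in the first iterations while the scale calibrates), so `lr_scheduler.step()` ends up running before any optimizer step. `ToyOptimizer`, `ToyScheduler`, and `grads_have_inf` are hypothetical stand-ins for the real PyTorch classes, not the actual implementation.

```python
import warnings

class ToyOptimizer:
    """Stand-in for torch.optim.Optimizer; mirrors its private _step_count."""
    def __init__(self):
        self._step_count = 0

    def step(self):
        self._step_count += 1

class ToyScheduler:
    """Stand-in for a LR scheduler; mirrors the ordering check in
    torch/optim/lr_scheduler.py that produces the UserWarning."""
    def __init__(self, optimizer):
        self.optimizer = optimizer

    def step(self):
        if self.optimizer._step_count < 1:
            warnings.warn(
                "Detected call of `lr_scheduler.step()` before `optimizer.step()`."
            )

opt = ToyOptimizer()
sched = ToyScheduler(opt)

grads_have_inf = True  # GradScaler found inf/NaN grads, so it skips the step
if not grads_have_inf:
    opt.step()  # skipped on this iteration
sched.step()    # scheduler steps anyway -> UserWarning is emitted
```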
This can be fixed by wrapping these lines
https://github.com/Lightning-AI/lightning/blob/574a9516012b4ab778254055c537f5d57e8e694f/src/pytorch_lightning/core/module.py#L1589-L1592
in `if hasattr(optimizer, '_step_count') and optimizer._step_count > 0:`. A fix is proposed in #16229.
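The proposed guard can be sketched as follows. `StubOptimizer`, `StubScheduler`, and `maybe_step_scheduler` are hypothetical stand-ins used only to show the behavior; the real fix would apply the same condition around the scheduler call in `module.py`. Note the `hasattr` check, since `_step_count` is a private attribute of `torch.optim.Optimizer` that custom optimizers may lack.

```python
class StubOptimizer:
    """Stand-in for torch.optim.Optimizer with its private _step_count."""
    def __init__(self):
        self._step_count = 0

    def step(self):
        self._step_count += 1

class StubScheduler:
    """Stand-in scheduler that just counts how often it was stepped."""
    def __init__(self):
        self.steps = 0

    def step(self):
        self.steps += 1

def maybe_step_scheduler(optimizer, scheduler):
    # The guard from the proposed fix: skip scheduler.step() until the
    # optimizer has taken at least one (non-skipped) step.
    if hasattr(optimizer, '_step_count') and optimizer._step_count > 0:
        scheduler.step()

opt, sched = StubOptimizer(), StubScheduler()
maybe_step_scheduler(opt, sched)  # optimizer step was skipped -> scheduler untouched
opt.step()                        # first real optimizer step
maybe_step_scheduler(opt, sched)  # now the scheduler advances
```

With this guard in place, the scheduler only advances after a real optimizer step, so the warning above no longer fires when AMP skips the first steps.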
How to reproduce the bug
Error messages and logs
Environment
No response
More info
No response