Fix #4375: Always use trainer.global_step for step #4376
Conversation
Codecov Report
@@           Coverage Diff           @@
##           master   #4376   +/-   ##
======================================
  Coverage      93%      93%
======================================
  Files         117      117
  Lines        9010     9010
======================================
  Hits         8383     8383
  Misses        627      627
Is there a problem logging it with the epoch in the case of the epoch logging_interval? I am asking because it might not align with the global_step of other logs recorded at the epoch level: those are created at epoch end, while this one is created at epoch start.
I think it is right to log the learning rate at the beginning of the epoch with the
@moi90 would you be able to take a look at the failing tests?
It seems that the failing tests are fixed by the changes merged from master.
Tested. LGTM!
LGTM
Great, thanks for merging!
What does this PR do?
This PR changes `LearningRateMonitor` to consistently use `step=trainer.global_step`, independently of `logging_interval`.

Fixes #4375
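The behavior this PR enforces can be illustrated with a minimal sketch. The classes below are simplified, hypothetical stand-ins, not Lightning's actual `LearningRateMonitor` or `Trainer`: the point is only that whether the callback fires per step or per epoch, the learning rate is recorded against `trainer.global_step`, so curves produced under different `logging_interval` settings share the same x-axis.

```python
# Hedged sketch (assumption: FakeTrainer and SimpleLRMonitor are
# illustrative stand-ins, not the library's implementation).

class FakeTrainer:
    """Hypothetical trainer exposing only what this sketch needs."""
    def __init__(self, lr=0.1):
        self.global_step = 0  # number of optimizer steps taken so far
        self.lr = lr

class SimpleLRMonitor:
    def __init__(self, logging_interval="step"):
        self.logging_interval = logging_interval  # "step" or "epoch"
        self.history = []  # recorded (global_step, lr) pairs

    def _log(self, trainer):
        # The fix in this PR: always key the record to global_step,
        # never to the epoch index, regardless of logging_interval.
        self.history.append((trainer.global_step, trainer.lr))

    def on_train_batch_start(self, trainer):
        if self.logging_interval == "step":
            self._log(trainer)

    def on_train_epoch_start(self, trainer):
        if self.logging_interval == "epoch":
            self._log(trainer)

monitor = SimpleLRMonitor(logging_interval="epoch")
trainer = FakeTrainer()
monitor.on_train_epoch_start(trainer)  # epoch 0 starts at global_step 0
trainer.global_step = 5                # five optimizer steps later...
monitor.on_train_epoch_start(trainer)  # epoch 1 is logged at global_step 5
```

Note the caveat discussed above still applies: epoch-level metrics logged at epoch *end* will carry a later `global_step` than a learning rate logged at epoch *start*.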
Before submitting
PR review
Anyone in the community is free to review the PR once the tests have passed.
If we didn't discuss your PR in GitHub issues, there's a high chance it will not be merged.