Fix lr scheduler state not being dumped to checkpoint in deepspeed strategy #11307
Conversation
Force-pushed from 8b7865c to 682e06d.
Thanks for fixing this @awaelchli. I assume, based on the logic, the scheduler will be loaded if it is found in the state dict elsewhere in the code.
I'm not sure that's the case, since the DeepSpeed plugin has
Yes, you are right, this needs to be taken care of too.
Right. I tried with the current commits. Though
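For context, here is a minimal sketch (plain PyTorch, not Lightning or DeepSpeed code) of the load-side situation being discussed: a restore path that only touches the scheduler when its state is present in the checkpoint will silently skip restoration if the save side never wrote it. The "lr_schedulers" key mirrors Lightning's checkpoint layout but is an assumption here.

```python
import torch
from torch.optim import SGD
from torch.optim.lr_scheduler import StepLR

model = torch.nn.Linear(2, 2)
optimizer = SGD(model.parameters(), lr=0.1)
scheduler = StepLR(optimizer, step_size=10)

# A checkpoint like the one produced on master with the DeepSpeed strategy:
# the scheduler state is simply missing (key name is an assumption).
checkpoint = {
    "state_dict": model.state_dict(),
    "optimizer_states": [optimizer.state_dict()],
}

# A conditional restore path skips the scheduler here, so after resuming
# the LR schedule would restart from step 0 instead of where it left off.
if "lr_schedulers" in checkpoint:
    scheduler.load_state_dict(checkpoint["lr_schedulers"][0])
```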
Force-pushed from 26766bb to 395d62a.
Codecov Report
@@           Coverage Diff            @@
##           master   #11307    +/-  ##
========================================
- Coverage      92%      88%      -4%
========================================
  Files         194      194
  Lines       16965    16965
========================================
- Hits        15593    14960     -633
- Misses       1372     2005     +633
What does this PR do?
Part of #11188
The added assertion fails on master.
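The following is a minimal sketch of the kind of check described, in plain PyTorch rather than the actual test added in this PR: after saving, the scheduler's state dict should be present in the checkpoint so that resuming continues the schedule. The "lr_schedulers" key and the checkpoint layout follow Lightning's convention but are assumptions in this standalone example.

```python
import torch
from torch.optim import SGD
from torch.optim.lr_scheduler import StepLR

model = torch.nn.Linear(2, 2)
optimizer = SGD(model.parameters(), lr=0.1)
scheduler = StepLR(optimizer, step_size=10)

# Advance the schedule a few steps so there is real state to persist.
for _ in range(5):
    optimizer.step()
    scheduler.step()

# Save a checkpoint that includes the scheduler state; the fix ensures the
# equivalent entry is written when checkpointing with the DeepSpeed strategy.
checkpoint = {
    "state_dict": model.state_dict(),
    "optimizer_states": [optimizer.state_dict()],
    "lr_schedulers": [scheduler.state_dict()],
}
torch.save(checkpoint, "checkpoint.ckpt")

# The assertion: the scheduler state must survive the save/load round trip.
loaded = torch.load("checkpoint.ckpt")
assert "lr_schedulers" in loaded
assert loaded["lr_schedulers"][0]["last_epoch"] == scheduler.state_dict()["last_epoch"]
```

Without the fix, a checkpoint written by the DeepSpeed strategy would fail the first assertion, which is why the added test fails on master.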
Before submitting
PR review
Anyone in the community is free to review the PR once the tests have passed.
Before you start reviewing, make sure you have read the review guidelines.
Did you have fun?
I made sure I had fun coding 🙃
Part of #1 (it's a lie, this is just here to avoid the noisy GitHub bot)
cc @Borda @SeanNaren @awaelchli @rohitgr7