[Fix] ZeroRedundancyOptimizer ambiguous error with param groups when pytorch < 1.12.0 #818
Conversation
Codecov Report
Base: 78.66% // Head: 78.86% // Increases project coverage by +0.19%.
Additional details and impacted files:
@@ Coverage Diff @@
## main #818 +/- ##
==========================================
+ Coverage 78.66% 78.86% +0.19%
==========================================
Files 128 128
Lines 9348 9368 +20
Branches 1848 1857 +9
==========================================
+ Hits 7354 7388 +34
+ Misses 1679 1666 -13
+ Partials 315 314 -1
Please update unit tests in
LGTM
Force-pushed from 1228931 to 11a8770.
Force-pushed from 11a8770 to b2ac889.
We should also test whether the overridden function state_dict works.
…pytorch < 1.12.0 (open-mmlab#818)
* fix zero_optimizer error with param groups when pytorch < 1.12.0
* add docstring
* fix docstring
* add unittest
* change ut to use a valid paramwise_cfg
* modify ut
* fix as comments
Thanks for your contribution; we appreciate it a lot. The following instructions will make your pull request healthier and help it get feedback more easily. If you do not understand some items, don't worry: just open the pull request and seek help from the maintainers.
Motivation
See issue #778
Modification
Add a check for param groups and the torch version, and give users a clearer error message.
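A minimal sketch of such a guard, assuming the behavior described in the issue: ZeroRedundancyOptimizer only accepts parameter groups from PyTorch 1.12.0 onwards, so older versions should fail with an explicit message instead of an ambiguous downstream error. Helper names and the exact message here are illustrative, not the actual implementation:

```python
def supports_param_groups(torch_version: str) -> bool:
    """Return True if this PyTorch version accepts param groups in
    ZeroRedundancyOptimizer (supported from 1.12.0 onwards)."""
    # Strip local build tags such as '+cu113' before parsing.
    major, minor = (int(x)
                    for x in torch_version.split('+')[0].split('.')[:2])
    return (major, minor) >= (1, 12)


def check_params(params, torch_version: str) -> None:
    """Raise a clear TypeError, instead of an ambiguous downstream
    error, when param groups are used on an unsupported version."""
    # Following torch.optim conventions, param groups are passed as a
    # list of dicts; a flat iterable of tensors is always fine.
    has_param_groups = (isinstance(params, list)
                        and all(isinstance(g, dict) for g in params))
    if has_param_groups and not supports_param_groups(torch_version):
        raise TypeError(
            'ZeroRedundancyOptimizer with parameter groups requires '
            f'PyTorch >= 1.12.0, but got {torch_version}. Please pass '
            'a flat iterable of parameters or upgrade PyTorch.')
```

In the real code the version would come from `torch.__version__`; it is taken as an argument here only to keep the sketch self-contained.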
BC-breaking (Optional)
No
Use cases (Optional)
Same as before
Checklist