[FSDP][optim_state_dict] Skip the parameter if the parameter does not belong to the current FSDP instance #112804
Conversation
… belong to the current FSDP instance
Skip the FSDP-managed parameter if the parameter is not managed by the current FSDP instance. This can happen when not all FSDP instances have all the parameters, for example with FSDP combined with some MPMD-style parallelism. Differential Revision: [D50562170](https://our.internmc.facebook.com/intern/diff/D50562170/) [ghstack-poisoned]
🔗 Helpful Links
🧪 See artifacts and rendered test results at hud.pytorch.org/pr/112804
Note: Links to docs will display an error until the docs builds have been completed.
✅ You can merge normally! (3 Unrelated Failures) As of commit 53239d9 with merge base 623a311:
FLAKY - The following job failed but was likely due to flakiness present on trunk.
UNSTABLE - The following jobs failed but were likely due to flakiness present on trunk and have been marked as unstable.
This comment was automatically generated by Dr. CI and updates every 15 minutes.
… belong to the current FSDP instance
Skip the FSDP-managed parameter if the parameter is not managed by the current FSDP instance. This can happen when not all FSDP instances have all the parameters, for example with FSDP combined with some MPMD-style parallelism. Differential Revision: [D50562170](https://our.internmc.facebook.com/intern/diff/D50562170/) ghstack-source-id: 206330212 Pull Request resolved: #112804
LGTM!
…er does not belong to the current FSDP instance"
Skip the FSDP-managed parameter if the parameter is not managed by the current FSDP instance. This can happen when not all FSDP instances have all the parameters, for example with FSDP combined with some MPMD-style parallelism. Differential Revision: [D50562170](https://our.internmc.facebook.com/intern/diff/D50562170/) [ghstack-poisoned]
… belong to the current FSDP instance
Pull Request resolved: #112804
Skip the FSDP-managed parameter if the parameter is not managed by the current FSDP instance. This can happen when not all FSDP instances have all the parameters, for example with FSDP combined with some MPMD-style parallelism. ghstack-source-id: 206443558 @exported-using-ghexport Differential Revision: [D50562170](https://our.internmc.facebook.com/intern/diff/D50562170/)
@pytorchbot merge (Initiating merge automatically since Phabricator Diff has merged)
Merge started. Your change will be merged once all checks pass (ETA 0-4 hours). Learn more about merging in the wiki. Questions? Feedback? Please reach out to the PyTorch DevX Team.
… belong to the current FSDP instance (pytorch#112804)
Skip the FSDP-managed parameter if the parameter is not managed by the current FSDP instance. This can happen when not all FSDP instances have all the parameters, for example with FSDP combined with some MPMD-style parallelism. Differential Revision: [D50562170](https://our.internmc.facebook.com/intern/diff/D50562170/) Pull Request resolved: pytorch#112804 Approved by: https://github.com/wz337
Stack from ghstack (oldest at bottom):
Skip the FSDP-managed parameter if the parameter is not managed by the current FSDP instance. This can happen when not all FSDP instances have all the parameters, for example with FSDP combined with some MPMD-style parallelism.
Differential Revision: D50562170
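A minimal sketch of the behavior this PR describes, stripped of FSDP internals: when collecting optimizer state, a parameter whose fully qualified name is not managed by the current FSDP instance is skipped instead of raising. The function and variable names below (`gather_optim_state`, `managed_params`, `optim_state`) are illustrative assumptions, not the actual FSDP `_optim_state_dict` API.

```python
def gather_optim_state(optim_state, managed_params):
    """Keep only the optimizer state for parameters this instance manages.

    optim_state: dict mapping fully qualified parameter name -> state dict
    managed_params: set of parameter names owned by the current FSDP instance
    """
    gathered = {}
    for fqn, state in optim_state.items():
        if fqn not in managed_params:
            # Parameter belongs to a different FSDP instance (e.g. under
            # MPMD-style parallelism, where not every instance has every
            # parameter); skip it rather than erroring out.
            continue
        gathered[fqn] = state
    return gathered


# Example: this instance manages only "layer0.weight", so "layer1.weight"
# is silently skipped.
state = {"layer0.weight": {"step": 1}, "layer1.weight": {"step": 1}}
print(gather_optim_state(state, {"layer0.weight"}))
```

Before this change, encountering a parameter not owned by the current instance would have been treated as an error; after it, such parameters are simply omitted from that instance's optimizer state dict.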