[FSDP] Allow to use TorchDispatch with FSDP #88014
Conversation
[ghstack-poisoned]
🔗 Helpful Links: 🧪 See artifacts and rendered test results at hud.pytorch.org/pr/88014
Note: links to docs will display an error until the docs builds have been completed. ✅ No failures as of commit b594741. This comment was automatically generated by Dr. CI and updates every 15 minutes.
ghstack-source-id: 22c02de0773c4106e1941f21025abbd70e9a68ca Pull Request resolved: #88014
Is this for landing, or is this just for experimentation? I am concerned that memory logs from using this will be inaccurate because memory will be incorrectly freed early without `record_stream()`.
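The early-free concern above can be illustrated with a toy model of a stream-aware caching allocator. This is purely illustrative (the `ToyCachingAllocator` class and its behavior are invented for this sketch, not the real CUDA caching allocator): without `record_stream`, a block freed on the CPU side can be handed back to the pool while an async stream is still reading it.

```python
class ToyCachingAllocator:
    """Toy model of a stream-aware caching allocator (illustrative only)."""

    def __init__(self):
        # block -> set of streams that were recorded as still using it
        self.pending = {}

    def record_stream(self, block, stream):
        # Mark the block as in use by an asynchronous stream.
        self.pending.setdefault(block, set()).add(stream)

    def free(self, block):
        # Safe to reuse immediately only if no async stream was recorded.
        return "reused" if block not in self.pending else "deferred"


alloc = ToyCachingAllocator()
# Without record_stream: the block is reused immediately, which in the real
# allocator could corrupt data a stream is still reading.
print(alloc.free("blockA"))  # reused
# With record_stream: reuse is deferred until the recorded stream finishes.
alloc.record_stream("blockB", "copy_stream")
print(alloc.free("blockB"))  # deferred
```

If a TorchDispatch mode swallows the `record_stream()` call, the allocator never learns about the async use, which is why skipping it makes memory logs look smaller than they really are.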
Looks good to me; will let Andrew or Rohan accept.
@awgu
Thanks for the explanation! I definitely misunderstood.
This looks good to me. Do you know if the Core team is planning to add support for `record_stream()` so that this is a temporary fix?
Saw a TODO for TorchDispatch to support a class block-list filter. If that is implemented, users can just use the block list to filter it out.
Sounds good to me! Also, sorry, I am landing some PRs from my stack, so there may be some rebase conflicts :(
Add `_no_dispatch_record_stream` to disable TorchDispatch before calling `record_stream()`. [ghstack-poisoned]
@pytorchbot merge
Merge started. Your change will be merged once all checks pass (ETA 0-4 hours). Learn more about merging in the wiki. Questions? Feedback? Please reach out to the PyTorch DevX Team.
Add `_no_dispatch_record_stream` to disable TorchDispatch before calling `record_stream()`. Pull Request resolved: pytorch#88014 Approved by: https://github.com/awgu
Stack from ghstack (oldest at bottom):

Add `_no_dispatch_record_stream` to disable TorchDispatch before calling `record_stream()`.