[c10d] C10d release to torch.distributed for PT1 #11405
Conversation
Force-pushed from 1aea8d8 to f8ce61d
Force-pushed from 642ded8 to 83ceacc
@pytorchbot retest this please
Force-pushed from b54ba1a to 864bc53
@pytorchbot retest this please
1 similar comment
@pytorchbot retest this please
Force-pushed from b060db9 to f028081
@pietern This PR should be good to go. Please review/stamp.
teng-li has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.
Please make sure to merge the gradient normalization change from here: #11109
Force-pushed from f028081 to 60834e1
Force-pushed from 60834e1 to c1f4720
teng-li has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.
Summary:
- The old `torch.distributed` will go to `torch.distributed.deprecated`.
- The old DDP will go to `torch.nn.parallel.deprecated`.
- `torch.nn.parallel.DDP` will now use c10d DDP.
- `torch.distributed` will now use the c10d frontend API.

Pull Request resolved: pytorch#11405
Reviewed By: pietern
Differential Revision: D9733733
Pulled By: teng-li
fbshipit-source-id: d6a3f3e73f8d3a7fcb1f4baef53c78063b8cbb08
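The migration described above means user code keeps calling `torch.distributed` and `torch.nn.parallel.DistributedDataParallel`, but the c10d backend now sits underneath. A minimal single-process sketch of that post-PR API surface follows; the `gloo` backend and file-based rendezvous are illustrative choices for running without multiple nodes, not something this PR mandates.

```python
import os
import tempfile

import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP

# File-based rendezvous so this sketch can run as a single process.
init_file = os.path.join(tempfile.mkdtemp(), "c10d_init")
dist.init_process_group(
    backend="gloo",                      # CPU-friendly c10d backend
    init_method="file://" + init_file,
    rank=0,
    world_size=1,
)

model = torch.nn.Linear(4, 2)
ddp_model = DDP(model)                   # c10d-backed DDP after this PR
out = ddp_model(torch.randn(3, 4))
print(out.shape)

dist.destroy_process_group()
```

With more than one process, the same code would be launched once per rank with the appropriate `rank`/`world_size`, and c10d handles gradient bucketing and allreduce inside DDP.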
…ibuted doc (#11450)
Summary: This is the new documentation for the c10d release, and it also deprecates the old torch.distributed documentation. This PR depends on #11405 and should only be landed after #11405 is landed.
Pull Request resolved: #11450
Differential Revision: D9765504
Pulled By: teng-li
fbshipit-source-id: 48f38b27b8c270baf389f8e478ea226b9ecc63db
Summary: It's been ~9 months since moving THD to the `torch.distributed.deprecated` namespace (see #11405) and we haven't seen issues related to it, so it's time to remove it. Closes #18967.
Pull Request resolved: #22065
Reviewed By: mrshenli
Differential Revision: D15983669
Pulled By: pietern
fbshipit-source-id: 2a2f5866f9a63040bc7cef3956d5fd215aba7165