[dist_optim] update the doc of DistributedOptimizer #51314
Conversation
updating the doc of DistributedOptimizer to include TorchScript enablement information [ghstack-poisoned]
💊 CI failures summary and remediations: as of commit ee114a2, ci.pytorch.org reports 1 failed job (details on the Dr. CI page).
updating the doc of DistributedOptimizer to include TorchScript enablement information ghstack-source-id: 27bef4af612876d4424b82ede068eafdf9df1461 Pull Request resolved: #51314
Codecov Report
@@ Coverage Diff @@
## gh/wanchaol/158/base #51314 +/- ##
========================================================
- Coverage 80.84% 80.50% -0.35%
========================================================
Files 1931 1931
Lines 210884 210884
========================================================
- Hits 170493 169775 -718
- Misses 40391 41109 +718
torch/distributed/optim/optimizer.py
@@ -150,6 +150,13 @@ class DistributedOptimizer:
    to the latest forward pass executed on a given worker. Also, there is no
    guaranteed ordering across workers.

+    DistributedOptimizer creates the local optimizer with TorchScript enabled
Nit: backticks for DistributedOptimizer
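For context on the documented behavior, here is a minimal sketch of how `DistributedOptimizer` is typically used; the single-process RPC setup (one worker named `"worker0"`, an `RRef` to a local parameter, and the chosen `MASTER_ADDR`/`MASTER_PORT` values) is an illustrative assumption, not part of this PR:

```python
# Sketch only: assumes a PyTorch build with distributed/RPC support.
import os
import torch
import torch.distributed.rpc as rpc
import torch.distributed.autograd as dist_autograd
from torch import optim
from torch.distributed.optim import DistributedOptimizer

# Single-process RPC "cluster" for illustration (assumed setup).
os.environ.setdefault("MASTER_ADDR", "localhost")
os.environ.setdefault("MASTER_PORT", "29500")
rpc.init_rpc("worker0", rank=0, world_size=1)

# Hold a parameter via an RRef (here it lives on the same worker).
param_rref = rpc.RRef(torch.nn.Parameter(torch.randn(2, 2)))

# Per the doc change in this PR, DistributedOptimizer creates the local
# optimizer with TorchScript enabled (for supported optimizers such as
# SGD), so optimizer updates on each worker can avoid the Python GIL.
dist_optim = DistributedOptimizer(optim.SGD, [param_rref], lr=0.05)

with dist_autograd.context() as context_id:
    loss = param_rref.to_here().sum()
    dist_autograd.backward(context_id, [loss])
    dist_optim.step(context_id)

rpc.shutdown()
```

This sketch only exercises one worker; in real use each `RRef` would point at parameters on remote workers, and `step()` fans out RPCs to run each worker's local optimizer.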
updating the doc of DistributedOptimizer to include TorchScript enablement information ghstack-source-id: 2fd88164f11550a21d8bf3e4f119c76e50641204 Pull Request resolved: #51314
Stack from ghstack:
updating the doc of DistributedOptimizer to include TorchScript enablement information
Differential Revision: D26156032