
[dist_optim] update the doc of DistributedOptimizer #51314

Closed
wants to merge 3 commits into from

Conversation

@wanchaol (Contributor) commented Jan 28, 2021

Stack from ghstack:

updating the doc of DistributedOptimizer to include TorchScript enablement information

Differential Revision: D26156032

@facebook-github-bot (Contributor) commented Jan 28, 2021

💊 CI failures summary and remediations

As of commit ee114a2 (more details on the Dr. CI page):


  • 2/2 failures possibly* introduced in this PR
    • 2/2 non-CircleCI failure(s)

ci.pytorch.org: 1 failed



wanchaol added a commit that referenced this pull request Jan 28, 2021
updating the doc of DistributedOptimizer to include TorchScript enablement information

ghstack-source-id: 27bef4af612876d4424b82ede068eafdf9df1461
Pull Request resolved: #51314
@codecov bot commented Jan 29, 2021

Codecov Report

Merging #51314 (ee114a2) into gh/wanchaol/158/base (d035d56) will decrease coverage by 0.34%.
The diff coverage is n/a.

@@                   Coverage Diff                    @@
##           gh/wanchaol/158/base   #51314      +/-   ##
========================================================
- Coverage                 80.84%   80.50%   -0.35%     
========================================================
  Files                      1931     1931              
  Lines                    210884   210884              
========================================================
- Hits                     170493   169775     -718     
- Misses                    40391    41109     +718     

@@ -150,6 +150,13 @@ class DistributedOptimizer:
to the latest forward pass executed on a given worker. Also, there is no
guaranteed ordering across workers.

DistributedOptimizer creates the local optimizer with TorchScript enabled
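The doc line added above says that `DistributedOptimizer` creates the local optimizer with TorchScript enabled. As a minimal usage sketch of where that doc change lands (adapted from the general `torch.distributed.optim` usage pattern; it assumes an RPC framework has already been initialized with a peer named `"worker1"`, so it will not run as a standalone script):

```python
import torch
import torch.distributed.autograd as dist_autograd
import torch.distributed.rpc as rpc
from torch import optim
from torch.distributed.optim import DistributedOptimizer

# Assumption: rpc.init_rpc(...) has already been called on this process
# and a peer worker named "worker1" is up.
with dist_autograd.context() as context_id:
    # Forward pass: create remote tensors owned by worker1.
    rref1 = rpc.remote("worker1", torch.add, args=(torch.ones(2), 3))
    rref2 = rpc.remote("worker1", torch.add, args=(torch.ones(2), 1))
    loss = rref1.to_here() + rref2.to_here()

    # Distributed backward pass within the autograd context.
    dist_autograd.backward(context_id, [loss.sum()])

    # DistributedOptimizer wraps a local optimizer class (here SGD) on each
    # worker that owns parameters; per this PR's doc update, that local
    # optimizer is created with TorchScript enabled.
    dist_optim = DistributedOptimizer(optim.SGD, [rref1, rref2], lr=0.05)
    dist_optim.step(context_id)
```

The optimizer class is passed uninstantiated (`optim.SGD`, not `optim.SGD(...)`) because each remote worker constructs its own local instance over the parameters it owns.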
A reviewer (Member) commented:
Nit: backticks for DistributedOptimizer

wanchaol added a commit that referenced this pull request Jan 29, 2021
updating the doc of DistributedOptimizer to include TorchScript enablement information

ghstack-source-id: 2fd88164f11550a21d8bf3e4f119c76e50641204
Pull Request resolved: #51314
@facebook-github-bot (Contributor) commented:

@wanchaol merged this pull request in 662b6d2.

@facebook-github-bot facebook-github-bot deleted the gh/wanchaol/158/head branch February 2, 2021 15:21
Labels: cla signed, Merged, oncall: distributed

Projects: None yet

Development: Successfully merging this pull request may close these issues: None yet

3 participants