
Prioritize raising error message about unused parameters when rebuild_buckets fails #45933

Closed (merged)
rohan-varma wants to merge 4 commits

Conversation

rohan-varma (Member) commented Oct 6, 2020

Stack from ghstack:

Occasionally users run DDP with models that have unused parameters. In this case we would like to surface an error message telling them to re-run with find_unused_parameters=True. However, a recent change to the rebuild_buckets logic (#44798) made it so that we raise a size mismatch error when this happens, but the information about unused parameters is more useful and is likely the most common cause of this failure. Prefer raising this error over the subsequent size mismatch errors.

Differential Revision: D24151256
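The error-prioritization described above can be sketched in plain Python. This is a hypothetical toy model, not the actual PyTorch C++ reducer: the function names mirror those in the discussion, but the signatures and bodies are invented for illustration only.

```python
# Toy sketch of the PR's idea: when bucket rebuilding fails and a prior
# reduction never finished, diagnose unused parameters first, because that
# message is actionable; only fall back to the size-mismatch error.
# All names and signatures here are hypothetical.

def ensure_prior_reduction_finished(all_params_reduced):
    # If some gradients never arrived last iteration, the likely cause is
    # unused parameters, so raise the actionable error first.
    if not all_params_reduced:
        raise RuntimeError(
            "Expected to have finished reduction in the prior iteration. "
            "This error likely means your model has unused parameters; "
            "re-run with find_unused_parameters=True."
        )

def rebuild_buckets(expected_size, actual_size, all_params_reduced):
    # Prioritize the unused-parameter diagnosis over the size mismatch.
    ensure_prior_reduction_finished(all_params_reduced)
    if expected_size != actual_size:
        raise RuntimeError(
            f"Bucket size mismatch: expected {expected_size}, got {actual_size}"
        )

# With an unfinished reduction, the unused-parameter message wins even
# though the sizes also mismatch:
try:
    rebuild_buckets(10, 8, all_params_reduced=False)
except RuntimeError as e:
    print("find_unused_parameters" in str(e))  # → True
```

The point is purely about ordering: both errors may apply at once, and the check that produces the more useful message runs first.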

facebook-github-bot added the oncall: distributed label Oct 6, 2020
rohan-varma added a commit that referenced this pull request Oct 6, 2020
mrshenli (Contributor) left a comment:
LGTM!

// has unused parameters for example, this will raise an error recommending to
// run with find_unused_parameters=True, instead of the size mismatch exception
// below.
ensure_prior_reduction_finished();
mrshenli (Contributor):
Looks like self.reducer._rebuild_buckets is always invoked. Would it be sufficient to move this call to the front of the if clause above and then remove the other ensure_prior_reduction_finished invocation in prepare_for_backward?

rohan-varma (Member, Author):
Yes, I believe this should be okay. Originally I was concerned about the require_backward_grad_sync flag, but I don't think it's an issue. If we don't require_backward_grad_sync this iteration, the previous iteration's reduction should still have finished, where "previous iteration" means the last iteration that required grad sync, or else the initial state of require_finalize_, which is false.
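The flag interplay described in this reply can be modeled with a small toy class. This is a hypothetical illustration only (the real logic lives in PyTorch's C++ reducer); require_finalize_ is set only when a backward pass requires grad sync, so skipping sync, as DDP's no_sync() context does, leaves the prior-reduction check satisfied.

```python
# Hypothetical toy model of the require_backward_grad_sync /
# require_finalize_ interaction discussed above. Not real PyTorch code.

class ToyReducer:
    def __init__(self):
        self.require_finalize_ = False  # initial state: nothing pending

    def backward(self, require_backward_grad_sync):
        if require_backward_grad_sync:
            self.require_finalize_ = True  # a reduction is now pending...
            self.finalize()                # ...and normally completes here

    def finalize(self):
        self.require_finalize_ = False

    def ensure_prior_reduction_finished(self):
        if self.require_finalize_:
            raise RuntimeError("prior reduction not finished")

r = ToyReducer()
r.backward(require_backward_grad_sync=False)  # e.g. inside no_sync()
r.ensure_prior_reduction_finished()           # passes: nothing was pending
```

This is why moving the ensure_prior_reduction_finished call earlier is safe: an iteration that skips grad sync never sets the pending flag, so the check trivially passes.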

codecov bot commented Oct 7, 2020

Codecov Report

No coverage uploaded for pull request base (gh/rohan-varma/180/base@8a1e100). The diff coverage is 26.31%.
@@                    Coverage Diff                     @@
##             gh/rohan-varma/180/base   #45933   +/-   ##
==========================================================
  Coverage                           ?   68.25%           
==========================================================
  Files                              ?      410           
  Lines                              ?    53251           
  Branches                           ?        0           
==========================================================
  Hits                               ?    36349           
  Misses                             ?    16902           
  Partials                           ?        0           
Impacted Files Coverage Δ
.../testing/_internal/distributed/distributed_test.py 29.52% <26.31%> (ø)

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data. Last update 8a1e100...f650e34.

dr-ci bot commented Oct 8, 2020

💊 CI failures summary and remediations

As of commit ec9826a: none of the CI failures appear to be your fault. 💚

❄️ 1 failure tentatively classified as flaky, but reruns have not yet been triggered to confirm: CircleCI build pytorch_macos_10_13_py3_test, step "Update Homebrew", which failed with:

fatal: Could not read from remote repository.
Please make sure you have the correct access rights
and the repository exists.

facebook-github-bot (Contributor):
This pull request has been merged in 62554a3.

facebook-github-bot deleted the gh/rohan-varma/180/head branch October 13, 2020 14:18
Labels: Merged · oncall: distributed

3 participants