
[FSDP] Option to keep grads in lower precision #85062

Closed
Wants to merge 1 commit

Conversation

@rohan-varma (Member) commented Sep 15, 2022

Stack from ghstack (oldest at bottom):

Rehash of a similar PR from a month ago that went stale. Adds a config to FSDP mixed precision (MP) so that gradients can be kept in lower precision, to support optimizers such as AnyPrecisionOptimizer, which expects grads in bf16.
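For context, in recent PyTorch releases this option surfaces as a flag on FSDP's `MixedPrecision` policy, named `keep_low_precision_grads` (the name used here is the one that eventually shipped; the spelling at the time of this PR may have differed). A minimal sketch of enabling it:

```python
import torch
from torch.distributed.fsdp import MixedPrecision

# Keep params and gradient reduction in bf16; keep_low_precision_grads
# additionally leaves the sharded grads in bf16 after backward, so an
# optimizer that wants bf16 grads (e.g. AnyPrecisionOptimizer) can
# consume them directly instead of getting fp32 copies.
mp_policy = MixedPrecision(
    param_dtype=torch.bfloat16,
    reduce_dtype=torch.bfloat16,
    keep_low_precision_grads=True,  # the option this PR adds
)
print(mp_policy.keep_low_precision_grads)
```

Wrapping then looks like `FSDP(model, mixed_precision=mp_policy)`; without the flag, FSDP casts gradients back to the full-precision param dtype before the optimizer step.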

To do this, for sharded cases, we cannot simply omit the cast back to the full-precision param dtype: otherwise, when setting p.grad = p._saved_grad_shard in finalize_params, autograd throws an error because the grad dtype must match the param dtype when it is set.

As a workaround, we re-cast to the low-precision dtype after setting it. However, this means that for cases using gradient accumulation, p._saved_grad_shard will already be of the reduced dtype, because it is set from p.grad in _prep_grad_for_backward. As a result, we add a check and re-cast there as well.
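The autograd constraint described above can be seen with plain tensors: assigning a `.grad` whose dtype differs from the parameter's is rejected, while re-casting through `.data` after a dtype-matching assignment is not. This is a sketch of the idea only, not the actual FSDP code path:

```python
import torch

p = torch.nn.Parameter(torch.zeros(4, dtype=torch.float32))

# Directly assigning a bf16 grad to an fp32 param is rejected by autograd.
try:
    p.grad = torch.ones(4, dtype=torch.bfloat16)
except RuntimeError as e:
    print("rejected:", e)

# Workaround pattern: assign in the param dtype first, then re-cast,
# bypassing the setter's dtype check via .data.
p.grad = torch.ones(4, dtype=torch.float32)
p.grad.data = p.grad.data.to(torch.bfloat16)
print(p.grad.dtype)  # torch.bfloat16
```

This mirrors why the PR re-casts after the `p.grad = p._saved_grad_shard` assignment rather than skipping the full-precision cast altogether.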

Differential Revision: D39529117

@pytorch-bot pytorch-bot bot commented Sep 15, 2022

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/85062

Note: Links to docs will display an error until the docs builds have been completed.

✅ No Failures, 16 Pending

As of commit 336055f:
💚 Looks good so far! There are no failures yet. 💚

This comment was automatically generated by Dr. CI and updates every 15 minutes.

@pytorch-bot pytorch-bot bot added the release notes: distributed (fsdp) release notes category label Sep 15, 2022
@facebook-github-bot facebook-github-bot added the oncall: distributed Add this issue/PR to distributed oncall triage queue label Sep 15, 2022
rohan-varma added a commit that referenced this pull request Sep 15, 2022
Differential Revision: [D39529117](https://our.internmc.facebook.com/intern/diff/D39529117/)

ghstack-source-id: 167439055
Pull Request resolved: #85062
@rohan-varma (Member, Author) commented:

#85134

@facebook-github-bot facebook-github-bot deleted the gh/rohan-varma/591/head branch June 8, 2023 18:35