
[FSDP][optim_state_dict] Make the new optimizer allgather fusion work with fine-tuning models #110540

Closed · wants to merge 7 commits

Conversation

@fegin (Contributor) commented Oct 4, 2023

Stack from ghstack (oldest at bottom):

With use_orig_params=True, it is possible that some parameters belonging to the same FlatParameter are passed to the optimizer while other parameters are frozen. This PR makes the allgather fusion logic support this case.

Differential Revision: D49922028
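For concreteness, a minimal sketch of the fine-tuning setup this targets (toy model, process-group setup, and hyperparameters are illustrative, not from this PR): a frozen layer and a trainable layer are flattened into the same FlatParameter, only the trainable parameters are handed to the optimizer, and the full optimizer state dict is then gathered.

```python
# Illustrative sketch only (toy model; not code from this PR). With
# use_orig_params=True, FSDP flattens both linear layers into one FlatParameter,
# but only the trainable (non-frozen) parameters are passed to the optimizer.
import os

import torch
import torch.distributed as dist
import torch.nn as nn
from torch.distributed.fsdp import FullyShardedDataParallel as FSDP

# Single-process group just to make the sketch self-contained.
os.environ.setdefault("MASTER_ADDR", "127.0.0.1")
os.environ.setdefault("MASTER_PORT", "29500")
dist.init_process_group("gloo", rank=0, world_size=1)

model = nn.Sequential(nn.Linear(16, 16), nn.Linear(16, 16))
for p in model[0].parameters():          # freeze the first layer only
    p.requires_grad = False

fsdp_model = FSDP(model, use_orig_params=True)
optim = torch.optim.Adam(
    (p for p in fsdp_model.parameters() if p.requires_grad), lr=1e-3
)

# One training step so the optimizer has state for the trainable parameters.
fsdp_model(torch.randn(4, 16)).sum().backward()
optim.step()

# Gathering the optimizer state dict exercises the allgather path that must
# handle FlatParameters whose members are only partially trainable.
osd = FSDP.optim_state_dict(fsdp_model, optim)
```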

@pytorch-bot bot commented Oct 4, 2023

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/110540

Note: Links to docs will display an error until the docs builds have been completed.

✅ You can merge normally! (3 Unrelated Failures)

As of commit 9135b1e with merge base 4069d1d:

FLAKY - The following job failed but was likely due to flakiness present on trunk:

UNSTABLE - The following jobs failed but were likely due to flakiness present on trunk and have been marked as unstable:

This comment was automatically generated by Dr. CI and updates every 15 minutes.

fegin added a commit that referenced this pull request Oct 4, 2023
ghstack-source-id: 202913384
Pull Request resolved: #110540
fegin added a commit that referenced this pull request Oct 4, 2023
ghstack-source-id: 202914577
Pull Request resolved: #110540
fegin added a commit that referenced this pull request Oct 4, 2023
ghstack-source-id: 202915043
Pull Request resolved: #110540
@fegin added the ciflow/trunk and ciflow/periodic labels Oct 4, 2023
fegin added a commit that referenced this pull request Oct 4, 2023
ghstack-source-id: 202917259
Pull Request resolved: #110540
torch/distributed/fsdp/_optim_utils.py — three review threads (outdated, resolved)
# 1.) the rank does not own any part of the original parameter.
# As a result, there is no corresponding optimizer state on
# the rank as well.
# 2.) the parameter is frozen and no optimizer state for the
@awgu (Contributor) commented:
This case can happen when the parameter changes between frozen and non-frozen, so it is currently frozen but still has optimizer state?
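To make case 2.) concrete, here is a purely schematic sketch (illustrative only, not the actual torch/distributed/fsdp/_optim_utils.py logic) of the two reasons a rank can end up with no local optimizer state for an original parameter:

```python
# Schematic illustration of the two "no local optimizer state" cases; the
# function name and signature are hypothetical, not part of the PR.
from typing import Any, Dict, Optional


def local_state_or_none(
    optim_state: Dict[Any, Dict[str, Any]],  # optimizer.state keyed by parameter
    param: Any,
    rank_owns_shard: bool,
) -> Optional[Dict[str, Any]]:
    if not rank_owns_shard:
        # Case 1: this rank holds no shard of the original parameter, so it
        # cannot have any corresponding optimizer state.
        return None
    if param not in optim_state:
        # Case 2: the parameter is frozen (requires_grad=False) and was never
        # given to the optimizer, so there is no state to contribute.
        return None
    return optim_state[param]
```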

@rohan-varma (Member) left a comment:
Similar question to @awgu - what happens when a parameter transitions state from frozen -> nonfrozen, or vice-versa during training?

fegin added a commit that referenced this pull request Oct 4, 2023
ghstack-source-id: 202943644
Pull Request resolved: #110540
@fegin (Contributor, Author) commented Oct 4, 2023

The latest version supports the no_grad -> grad -> no_grad transition and updates the unit test to verify this usage.
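A rough sketch of that transition (hypothetical code, not the actual unit test), reusing the toy fsdp_model from the earlier sketch:

```python
# Hypothetical no_grad -> grad -> no_grad transition. Assumes `fsdp_model` is
# the FSDP-wrapped two-layer nn.Sequential from the earlier sketch
# (use_orig_params=True, process group already initialized).
import torch
from torch.distributed.fsdp import FullyShardedDataParallel as FSDP


def set_layer0_trainable(fsdp_model, trainable: bool) -> None:
    for p in fsdp_model.module[0].parameters():
        p.requires_grad = trainable


def step_and_gather(fsdp_model):
    # Rebuild the optimizer so it tracks exactly the currently trainable params.
    optim = torch.optim.Adam(
        (p for p in fsdp_model.parameters() if p.requires_grad), lr=1e-3
    )
    fsdp_model(torch.randn(4, 16)).sum().backward()
    optim.step()
    optim.zero_grad()
    # The fused allgather must cope with FlatParameter members that have
    # optimizer state in some phases and none in others.
    return FSDP.optim_state_dict(fsdp_model, optim)


set_layer0_trainable(fsdp_model, False)   # phase 1: layer 0 frozen
osd_frozen = step_and_gather(fsdp_model)

set_layer0_trainable(fsdp_model, True)    # phase 2: layer 0 trainable
osd_all = step_and_gather(fsdp_model)

set_layer0_trainable(fsdp_model, False)   # phase 3: layer 0 frozen again
osd_refrozen = step_and_gather(fsdp_model)
```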

fegin added a commit that referenced this pull request Oct 4, 2023
ghstack-source-id: 202944435
Pull Request resolved: #110540
fegin added a commit that referenced this pull request Oct 4, 2023
ghstack-source-id: 202957449
Pull Request resolved: #110540
@facebook-github-bot (Contributor) commented:

@pytorchbot merge

(Initiating merge automatically since Phabricator Diff has merged)

@pytorchmergebot (Collaborator) commented:
Merge started

Your change will be merged once all checks pass (ETA 0-4 Hours).

Learn more about merging in the wiki.

Questions? Feedback? Please reach out to the PyTorch DevX Team

Advanced Debugging: check the merge workflow status here.

@facebook-github-bot deleted the gh/fegin/154/head branch October 9, 2023 14:26
Labels: ciflow/periodic (Trigger jobs ran periodically on master (periodic.yml) on the PR), ciflow/trunk (Trigger trunk jobs on your pull request), Merged, release notes: distributed (fsdp)
Projects: none yet
Linked issues: none yet
5 participants