add private config to temporarily preserve old FSDP guard behavior #142871
Conversation
🔗 Helpful Links
🧪 See artifacts and rendered test results at hud.pytorch.org/pr/142871
Note: Links to docs will display an error until the docs builds have been completed.
❌ 2 New Failures, 1 Unrelated Failure as of commit 9025747 with merge base 04bb82f.
NEW FAILURES - The following jobs have failed:
FLAKY - The following job failed but was likely due to flakiness present on trunk:
This comment was automatically generated by Dr. CI and updates every 15 minutes.
This pull request was exported from Phabricator. Differential Revision: D67096751
@pytorchbot merge -i
Merge started. Your change will be merged while ignoring the following 3 checks: inductor-rocm / rocm6.2-py3.10-inductor / test (inductor, 1, 2, linux.rocm.gpu.2), inductor-rocm / rocm6.2-py3.10-inductor / test (inductor, 2, 2, linux.rocm.gpu.2), pull / linux-jammy-py3.10-clang15-asan / test (default, 5, 6, lf.linux.4xlarge). Learn more about merging in the wiki. Questions? Feedback? Please reach out to the PyTorch DevX Team.
Summary: #138819 changed dynamo guard behavior in a way that caused some performance regression, so this PR temporarily adds a private config that restores the old behavior while we investigate.
Test Plan: CI
Reviewed By: yf225, yanboliang
Differential Revision: D67096751
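For readers who want to A/B the regression, here is a minimal sketch of how a temporary escape-hatch config like this is typically exercised. The flag name `_old_fsdp_guard_behavior` is a hypothetical stand-in (the PR's actual config name does not appear in this thread); `torch._dynamo.config.patch` is an existing dynamo API for scoping config changes.

```python
# Sketch: toggling a temporary escape-hatch flag in torch._dynamo.config.
# The flag name `_old_fsdp_guard_behavior` is a hypothetical stand-in for
# the private config this PR adds; do not rely on it existing.
import torch

@torch.compile
def step(x):
    return x * 2 + 1

# Dynamo configs are module-level attributes, so a process-wide toggle is:
#   torch._dynamo.config._old_fsdp_guard_behavior = True   # hypothetical
#
# torch._dynamo.config.patch scopes the change instead, which is handy for
# A/B benchmarking the guard regression against the new behavior:
#   with torch._dynamo.config.patch(_old_fsdp_guard_behavior=True):
#       step(torch.randn(8))   # compiled with the pre-#138819 guards

step(torch.randn(8))  # default (new) guard behavior
```

Scoping the flag with `config.patch` rather than setting it globally makes it easy to compare guard behavior with and without the old path in the same process.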
cc @voznesenskym @penguinwu @EikanWang @jgong5 @Guobing-Chen @XiaobingSuper @zhuhaozhe @blzheng @wenzhe-nrv @jiayisunx @chenyang78 @kadeng @chauhang @amjames