[export] Make with_effect mark ops as has_side_effect to prevent them from being DCE'd #129680
Conversation
🔗 Helpful Links: 🧪 See artifacts and rendered test results at hud.pytorch.org/pr/129680
Note: Links to docs will display an error until the docs builds have been completed.
❗ 1 Active SEV: there is 1 currently active SEV. If your PR is affected, please view it below.
✅ No failures as of commit 498df79 with merge base 5ceba6a.
This comment was automatically generated by Dr. CI and updates every 15 minutes.
…ct_token" Before the PR, custom ops that doesn't return outputs will gets eliminated after calling `.module()` because the effect_token that keeps the operator not DCEed is removed in remove_effect_token. The reason why we want to remove_effect_token is because we don't want the token to be part of input. However, the causes DCE calls in remove_effect_token itself and the dce calls in unlift to remove the custom op in the graph causing an error in the exported graph. This PR removes the DCE calls in unlift and remove_effect_token to avoid this, which may not be a good idea. The idea behind is that the passes in export should never call eliminate dead code so that we could put remote_effect_token anywhere we want. The alternative is that we should carefully put remove_effect_token as the last transformation to avoid any transformations to eliminate them. But in the case of unlift, we have to remove_effect_token before running unlift in order to keep the signature match. Test Plan: Add a new test pytest test/export/test_torchbind.py -k test_export_inplace_custom_op [ghstack-poisoned]
…ct_token" Before the PR, custom ops that doesn't return outputs will gets eliminated after calling `.module()` because the effect_token that keeps the operator not DCEed is removed in remove_effect_token. The reason why we want to remove_effect_token is because we don't want the token to be part of input. However, the causes DCE calls in remove_effect_token itself and the dce calls in unlift to remove the custom op in the graph causing an error in the exported graph. This PR removes the DCE calls in unlift and remove_effect_token to avoid this, which may not be a good idea. The idea behind is that the passes in export should never call eliminate dead code so that we could put remote_effect_token anywhere we want. The alternative is that we should carefully put remove_effect_token as the last transformation to avoid any transformations to eliminate them. But in the case of unlift, we have to remove_effect_token before running unlift in order to keep the signature match. Test Plan: Add a new test pytest test/export/test_torchbind.py -k test_export_inplace_custom_op [ghstack-poisoned]
…event them from DCEed." Before this PR, custom ops that don't return outputs get eliminated after calling `.module()` because the effect_token that keeps the operator from being DCE'd is removed in remove_effect_token. We want remove_effect_token because we don't want the token to be part of the input. However, this causes the DCE call in remove_effect_token itself and the DCE calls in unlift to remove the custom op from the graph, producing an error in the exported graph. This PR calls has_side_effect in with_effect to make sure graph.eliminate_dead_code doesn't remove these calls by accident. Test Plan: add a new test: pytest test/export/test_torchbind.py -k test_export_inplace_custom_op
@pytorchbot merge
Merge started: Your change will be merged once all checks pass (ETA 0-4 hours). Learn more about merging in the wiki. Questions? Feedback? Please reach out to the PyTorch DevX Team.
@pytorchbot revert -m "breaking internal builds, see D59181183" -c ghfirst
@pytorchbot successfully started a revert job. Check the current status here.
@ydwu4 your PR has been successfully reverted.
…from DCEed. (#129680)" This reverts commit 4b8a5e0. Reverted #129680 on behalf of https://github.com/kit1980 due to breaking internal builds, see D59181183.
@ydwu4 has imported this pull request. If you are a Meta employee, you can view this diff on Phabricator.
…em from DCEed." Before the PR, custom ops that don't return outputs will get eliminated after calling `.module()` because the effect_token that keeps the operator alive is removed in remove_effect_token pass. The reason why we want to remove_effect_token is because we don't want the token to be part of input. However, this causes DCE calls in remove_effect_token itself and the dce calls in unlift to remove the custom op in the graph causing an error in the exported graph. This PR calls has_side_effect in with_effect to make sure graph.eliminate_dead_code doesn't remove the calls by accident. Test Plan: Add a new test pytest test/export/test_torchbind.py -k test_export_inplace_custom_op Differential Revision: [D59498728](https://our.internmc.facebook.com/intern/diff/D59498728) [ghstack-poisoned]
@pytorchbot merge
Merge failed. Reason: This PR has internal changes and must be landed via Phabricator. Details for Dev Infra team: raised by workflow job.
@pytorchbot merge -f "landed internally"
Merge started: Your change will be merged immediately since you used the force (-f) flag, bypassing any CI checks (ETA: 1-5 minutes). Learn more about merging in the wiki. Questions? Feedback? Please reach out to the PyTorch DevX Team.
…ed. (pytorch#129680) Before this PR, custom ops that don't return outputs get eliminated after calling `.module()` because the effect_token that keeps the operator alive is removed in the remove_effect_token pass. We want remove_effect_token because we don't want the token to be part of the input. However, this causes the DCE call in remove_effect_token itself and the DCE calls in unlift to remove the custom op from the graph, producing an error in the exported graph. This PR calls has_side_effect in with_effect to make sure graph.eliminate_dead_code doesn't remove these calls by accident. Test Plan: add a new test: pytest test/export/test_torchbind.py -k test_export_inplace_custom_op Differential Revision: [D59498728](https://our.internmc.facebook.com/intern/diff/D59498728) Pull Request resolved: pytorch#129680 Approved by: https://github.com/angelayi
Stack from ghstack (oldest at bottom):
Before this PR, custom ops that don't return outputs get eliminated after calling `.module()`, because the effect_token that keeps the operator alive is removed in the remove_effect_token pass. We want remove_effect_token because we don't want the token to be part of the input. However, this causes the DCE call in remove_effect_token itself and the DCE calls in unlift to remove the custom op from the graph, producing an error in the exported graph. This PR calls has_side_effect in with_effect to make sure graph.eliminate_dead_code doesn't remove these calls by accident.
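As a rough illustration (not the PR's diff; the `record` callable below is made up), this is the FX mechanism the fix relies on: registering a callable with `torch.fx.node.has_side_effect` makes `Graph.eliminate_dead_code` treat its call nodes as impure and keep them even when their results are unused.

```python
import torch
import torch.fx
from torch.fx.node import has_side_effect


def record(x):
    """Stand-in for an effectful custom op that returns nothing."""
    pass


# Register the callable as side-effectful so FX DCE keeps calls to it.
has_side_effect(record)

g = torch.fx.Graph()
x = g.placeholder("x")
g.call_function(record, (x,))             # result unused: normally dead code
out = g.call_function(torch.add, (x, x))
g.output(out)

g.eliminate_dead_code()
# The record call survives because its target is marked side-effectful.
assert any(n.target is record for n in g.nodes)
```

Per the description above, the PR's change amounts to with_effect doing this registration for the operators it wraps, so that the later eliminate_dead_code calls in remove_effect_token and unlift leave them in the graph.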
Test Plan:
Add a new test: `pytest test/export/test_torchbind.py -k test_export_inplace_custom_op`
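A sketch of what such a test roughly checks (the `_TorchScriptTesting` queue op and the module below are illustrative assumptions based on the torchbind helpers used in test/export/test_torchbind.py, not the exact test code):

```python
import torch
from torch.export import export


class Mod(torch.nn.Module):
    def __init__(self, tq):
        super().__init__()
        self.tq = tq  # a torch.classes ScriptObject, e.g. a tensor queue

    def forward(self, x):
        # Effectful custom op with no outputs; before this PR it was DCE'd
        # out of the unlifted graph produced by ep.module().
        torch.ops._TorchScriptTesting.queue_push(self.tq, x)
        return x + 1


def op_survives_unlift(ep) -> bool:
    gm = ep.module()  # unlifting removes the effect tokens from the signature
    return any(
        node.op == "call_function" and "queue_push" in str(node.target)
        for node in gm.graph.nodes
    )
```

With the torchbind test extension loaded (as test/export/test_torchbind.py does), the test would construct the queue, run `ep = export(Mod(tq), (torch.randn(3),))`, and assert `op_survives_unlift(ep)`.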
Differential Revision: D59498728