Default specialize_int to False #96624
Conversation
Signed-off-by: Edward Z. Yang <ezyang@meta.com> [ghstack-poisoned]
🔗 Helpful Links
🧪 See artifacts and rendered test results at hud.pytorch.org/pr/96624
Note: Links to docs will display an error until the docs builds have been completed.

❌ 3 Failures
As of commit 28ab146:
NEW FAILURES - The following jobs have failed:
BROKEN TRUNK - The following jobs failed but were present on the merge base ba4fb9b:
👉 Rebase onto the `viable/strict` branch to avoid these failures

This comment was automatically generated by Dr. CI and updates every 15 minutes.
Signed-off-by: Edward Z. Yang <ezyang@meta.com> cc soumith voznesenskym penguinwu anijain2305 EikanWang jgong5 Guobing-Chen XiaobingSuper zhuhaozhe blzheng Xia-Weiwen wenzhe-nrv jiayisunx peterbell10 desertfire [ghstack-poisoned]
(config, "dynamic_shapes", True), | ||
(config, "assume_static_by_default", static_default), | ||
(config, "specialize_int", not unspec), | ||
(config, "specialize_int", static_default), |
Question: the parametrization above pairs assume_static_by_default with specialize_int, so both are True or both are False (the default). Is it also conceivable for one to be True while the other is False? The config on line 86 used to be assume_static_by_default = False with specialize_int = True, and now it means both are False.
Yes, I have reduced coverage. This is a calculated decision: assuming static by default is heavily correlated with specializing ints, as both are correlated with export usage. In eager usage, we expect unspec ints and not assuming static by default. There are two configurations that are not tested, but I think testing them is unnecessary, as we will get adequate coverage otherwise.
looks fine to me then
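To make the coverage trade-off described above concrete, here is a hypothetical summary (not from the PR) of the four flag combinations and which ones the parametrized tests now exercise:

```python
# Hypothetical summary of the coverage decision discussed above.
# Each tuple is (assume_static_by_default, specialize_int).
tested = [
    (True, True),    # export-style usage: static by default, ints specialized
    (False, False),  # eager-style usage: dynamic shapes, unspec ints
]
untested = [
    (True, False),   # judged unnecessary: the flags are heavily correlated
    (False, True),   # in practice, so these combinations are rare
]
```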
@pytorchbot merge -f "spurious lint failure on master" |
Merge startedYour change will be merged immediately since you used the force (-f) flag, bypassing any CI checks (ETA: 1-5 minutes). Learn more about merging in the wiki. Questions? Feedback? Please reach out to the PyTorch DevX Team |
Looks like this broke inductor/test_torchinductor_dynamic_shapes.py https://hud.pytorch.org/pytorch/pytorch/commit/1ac8782db244f6cd3d2fd109e3fe94500745e0dd |
@pytorchbot revert -m "Broke inductor/test_torchinductor_dynamic_shapes.py" -c nosignal |
@pytorchbot successfully started a revert job. Check the current status here. |
@ezyang your PR has been successfully reverted. |
This reverts commit 1ac8782. Reverted #96624 on behalf of https://github.com/kit1980 due to Broke inductor/test_torchinductor_dynamic_shapes.py
Signed-off-by: Edward Z. Yang <ezyang@meta.com> cc soumith voznesenskym penguinwu anijain2305 EikanWang jgong5 Guobing-Chen XiaobingSuper zhuhaozhe blzheng Xia-Weiwen wenzhe-nrv jiayisunx peterbell10 desertfire [ghstack-poisoned]
@pytorchbot merge
Merge started. Your change will be merged once all checks pass (ETA: 0-4 hours). Learn more about merging in the wiki. Questions? Feedback? Please reach out to the PyTorch DevX Team.
Merge failed. Reason: 1 mandatory check(s) failed. Dig deeper by viewing the failures on hud.
@pytorchbot merge -f "failures look unrelated"
Merge started. Your change will be merged immediately since you used the force (-f) flag, bypassing any CI checks (ETA: 1-5 minutes). Learn more about merging in the wiki. Questions? Feedback? Please reach out to the PyTorch DevX Team.
Signed-off-by: Edward Z. Yang <ezyang@meta.com> Pull Request resolved: pytorch/pytorch#96624 Approved by: https://github.com/janeyx99
This reverts commit 1ac8782. Reverted pytorch/pytorch#96624 on behalf of https://github.com/kit1980 due to Broke inductor/test_torchinductor_dynamic_shapes.py
Signed-off-by: Edward Z. Yang <ezyang@meta.com> Pull Request resolved: pytorch/pytorch#96624 Approved by: https://github.com/janeyx99
Stack from ghstack (oldest at bottom):
Signed-off-by: Edward Z. Yang ezyang@meta.com
cc @soumith @voznesenskym @penguinwu @anijain2305 @EikanWang @jgong5 @Guobing-Chen @XiaobingSuper @zhuhaozhe @blzheng @Xia-Weiwen @wenzhe-nrv @jiayisunx @peterbell10 @desertfire