[symbolic shapes] if symbol not in var_ranges default to unknown range #127681
Conversation
🔗 Helpful Links: 🧪 See artifacts and rendered test results at hud.pytorch.org/pr/127681
Note: Links to docs will display an error until the docs builds have been completed.
✅ You can merge normally! (2 unrelated failures) As of commit a4564f4 with merge base 2e77916. UNSTABLE: the following jobs failed, but were likely due to flakiness present on trunk and have been marked as unstable.
This comment was automatically generated by Dr. CI and updates every 15 minutes.
This pull request was exported from Phabricator. Differential Revision: D58048558
torch/_inductor/index_propagation.py (Outdated)

        )
        return expr
    except KeyError as e:
        log.warn(str(e))
Use 'log.warning("", exc_info=True)'
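For illustration, a minimal sketch of the suggested pattern (the function and the lookup are hypothetical stand-ins, not the actual index_propagation.py code):

import logging

log = logging.getLogger(__name__)

def propagate_index(expr, replacements):
    try:
        return replacements[expr]  # hypothetical lookup that may raise KeyError
    except KeyError:
        # exc_info=True attaches the active exception and traceback to the log
        # record; log.warn(str(e)) loses both, and log.warn is a deprecated
        # alias for log.warning anyway.
        log.warning("index propagation fell back to the original expr", exc_info=True)
        return expr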
Force-pushed: ec27fea → d06fe3f → a628b72, with the commit message "…nge (pytorch#127681) Summary: Issue: pytorch#127677 Test Plan: ci --- Differential Revision: D58048558"; each push was re-exported from Phabricator (Differential Revision: D58048558).
I would really like to understand #127677 (comment), as this is just a hint that there's another bug lurking underneath.
@lezcano
Sure, let's land this as a hotfix and leave the other issue open.
    vr = self.var_to_range[k]
except KeyError:
    log.warning(f"{k} is not in var_to_range, defaulting to unknown range.")
    vr = ValueRanges(-sys.maxsize, sys.maxsize)
Use self._default_unspecified_value_range()
then, as this one has an off-by-one error.
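For context on the off-by-one, a sketch assuming a 64-bit build where sys.maxsize == 2**63 - 1 (the exact bounds returned by self._default_unspecified_value_range() are not reproduced here):

import sys

int64_min = -(2**63)
lo = -sys.maxsize                      # == -(2**63 - 1) == int64_min + 1
print(lo == int64_min)                 # False: the lower bound is off by one
print(-sys.maxsize - 1 == int64_min)   # True: the actual int64 minimum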
Sounds good, assigned myself to the issue.
Subsequent force-pushes, each with the same commit message and Phabricator export comment: a628b72 → 106e7c5 → 0f7b058, then 83b78c0 → 0a23541 → 934b94c → 7656231 → a4564f4.
@pytorchbot merge -f 'Landed internally' (Initiating merge automatically since Phabricator Diff has merged, using force because this PR might not pass merge_rules.json but landed internally)
Merge started. Your change will be merged immediately since you used the force (-f) flag, bypassing any CI checks (ETA: 1-5 minutes). Learn more about merging in the wiki. Questions? Feedback? Please reach out to the PyTorch DevX Team.
With a recent nightly I've just gotten many of these warnings:
W0605 11:48:51.002000 134494347568960 torch/fx/experimental/symbolic_shapes.py:4417] [0/0] q0 is not in var_ranges, defaulting to unknown range.
W0605 11:48:51.002000 134494347568960 torch/fx/experimental/symbolic_shapes.py:4417] [0/0] q1 is not in var_ranges, defaulting to unknown range.
W0605 11:48:51.013000 134494347568960 torch/fx/experimental/symbolic_shapes.py:4417] [0/0] q0 is not in var_ranges, defaulting to unknown range.
W0605 11:48:51.013000 134494347568960 torch/fx/experimental/symbolic_shapes.py:4417] [0/0] q1 is not in var_ranges, defaulting to unknown range.
W0605 11:48:51.324000 134494347568960 torch/fx/experimental/symbolic_shapes.py:4417] [0/0] z1 is not in var_ranges, defaulting to unknown range.
W0605 11:48:51.324000 134494347568960 torch/fx/experimental/symbolic_shapes.py:4417] [0/0] z0 is not in var_ranges, defaulting to unknown range.
W0605 11:48:51.334000 134494347568960 torch/fx/experimental/symbolic_shapes.py:4417] [0/0] z1 is not in var_ranges, defaulting to unknown range.
W0605 11:48:51.334000 134494347568960 torch/fx/experimental/symbolic_shapes.py:4417] [0/0] z0 is not in var_ranges, defaulting to unknown range.
W0605 11:48:52.669000 134494347568960 torch/fx/experimental/symbolic_shapes.py:4417] [0/0] q0 is not in var_ranges, defaulting to unknown range.
W0605 11:48:52.800000 134494347568960 torch/fx/experimental/symbolic_shapes.py:4417] [0/0] z0 is not in var_ranges, defaulting to unknown range.
W0605 11:48:54.093000 134494347568960 torch/fx/experimental/symbolic_shapes.py:4417] [0/0] q0 is not in var_ranges, defaulting to unknown range.
W0605 11:48:54.093000 134494347568960 torch/fx/experimental/symbolic_shapes.py:4417] [0/0] q1 is not in var_ranges, defaulting to unknown range.
W0605 11:48:54.152000 134494347568960 torch/fx/experimental/symbolic_shapes.py:4417] [0/0] z1 is not in var_ranges, defaulting to unknown range.
W0605 11:48:54.152000 134494347568960 torch/fx/experimental/symbolic_shapes.py:4417] [0/0] z0 is not in var_ranges, defaulting to unknown range.
W0605 11:48:54.734000 134494347568960 torch/fx/experimental/symbolic_shapes.py:4417] [0/0] q0 is not in var_ranges, defaulting to unknown range.
W0605 11:48:54.869000 134494347568960 torch/fx/experimental/symbolic_shapes.py:4417] [0/0] z0 is not in var_ranges, defaulting to unknown range.
W0605 11:48:56.124000 134494347568960 torch/fx/experimental/symbolic_shapes.py:4417] [0/0] q0 is not in var_ranges, defaulting to unknown range.
W0605 11:48:56.124000 134494347568960 torch/fx/experimental/symbolic_shapes.py:4417] [0/0] q1 is not in var_ranges, defaulting to unknown range.
W0605 11:48:56.189000 134494347568960 torch/fx/experimental/symbolic_shapes.py:4417] [0/0] z1 is not in var_ranges, defaulting to unknown range.
W0605 11:48:56.189000 134494347568960 torch/fx/experimental/symbolic_shapes.py:4417] [0/0] z0 is not in var_ranges, defaulting to unknown range.
W0605 11:48:57.112000 134494347568960 torch/fx/experimental/symbolic_shapes.py:4417] [0/0] q0 is not in var_ranges, defaulting to unknown range.
W0605 11:48:57.240000 134494347568960 torch/fx/experimental/symbolic_shapes.py:4417] [0/0] z0 is not in var_ranges, defaulting to unknown range.
W0605 11:49:03.713000 134494347568960 torch/fx/experimental/symbolic_shapes.py:4417] [0/0] y1 is not in var_ranges, defaulting to unknown range.
W0605 11:49:03.713000 134494347568960 torch/fx/experimental/symbolic_shapes.py:4417] [0/0] y0 is not in var_ranges, defaulting to unknown range.
W0605 11:49:03.725000 134494347568960 torch/fx/experimental/symbolic_shapes.py:4417] [0/0] y1 is not in var_ranges, defaulting to unknown range.
W0605 11:49:03.725000 134494347568960 torch/fx/experimental/symbolic_shapes.py:4417] [0/0] y0 is not in var_ranges, defaulting to unknown range.
W0605 11:49:04.647000 134494347568960 torch/fx/experimental/symbolic_shapes.py:4417] [0/0] x0 is not in var_ranges, defaulting to unknown range.
W0605 11:49:05.445000 134494347568960 torch/fx/experimental/symbolic_shapes.py:4417] [0/0] x2 is not in var_ranges, defaulting to unknown range.
W0605 11:49:05.445000 134494347568960 torch/fx/experimental/symbolic_shapes.py:4417] [0/0] x1 is not in var_ranges, defaulting to unknown range.
W0605 11:49:05.923000 134494347568960 torch/fx/experimental/symbolic_shapes.py:4417] [0/0] x0 is not in var_ranges, defaulting to unknown range.
W0605 11:49:06.703000 134494347568960 torch/fx/experimental/symbolic_shapes.py:4417] [0/0] x2 is not in var_ranges, defaulting to unknown range.
W0605 11:49:06.703000 134494347568960 torch/fx/experimental/symbolic_shapes.py:4417] [0/0] x1 is not in var_ranges, defaulting to unknown range.
W0605 11:49:07.186000 134494347568960 torch/fx/experimental/symbolic_shapes.py:4417] [0/0] x0 is not in var_ranges, defaulting to unknown range.
Is there any user action item on these?
@ColinPeppler will look into how to properly fix #127677.
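As a stopgap for anyone hit by the spam in the meantime, the messages can be silenced at the logging level; a sketch, where the logger name is inferred from the file paths in the output above and may differ between builds:

import logging

# Raise the threshold of the emitting module's logger above WARNING.
logging.getLogger("torch.fx.experimental.symbolic_shapes").setLevel(logging.ERROR)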
…(pytorch#127681) Purpose of this PR is to get around this error: pytorch#127677. Differential Revision: D58048558. Pull Request resolved: pytorch#127681. Approved by: https://github.com/lezcano
@ColinPeppler The code is too large to isolate into a minimal standalone repro. I have the full env in another ticket if you want.
Could you try to isolate a smaller repro by putting a breakpoint on this warning and seeing the code that calls it, @bhack, and doing some manual binary search in the resulting fx graph?
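One way to set up that breakpoint without editing PyTorch itself is a logging filter; a sketch, where the logger name and message substring are assumptions taken from the output above:

import logging
import pdb

class BreakOnVarRangesWarning(logging.Filter):
    def filter(self, record):
        # Drop into pdb when the fallback warning fires so the emitting call
        # stack can be inspected; returning True still lets the record through.
        if "is not in var_ranges" in record.getMessage():
            pdb.set_trace()
        return True

logging.getLogger("torch.fx.experimental.symbolic_shapes").addFilter(
    BreakOnVarRangesWarning()
)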
@lezcano A first isolated repro is at #127677 (comment).
try:
    vr = var_ranges[k]
except KeyError:
    log.warning("%s is not in var_ranges, defaulting to unknown range.", k)
May I know why this is a warning here? Would it always warn on unbacked symbols? The cpp template uses some unbacked symbols to build IR nodes for codegen, which would always trigger this warning.
{%- set tile_X = kernel.slice_nd(X, [("m_start", "m_end"), ("k_start", "k_end")]) %}
@ColinPeppler this is annoying too many people. Can you send a patch that removes the warning, or send a fix altogether?
Sorry, if I can't post a fix by end of today, I will remove the warning.
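If the warning is removed rather than fixed, one option is demoting it; a sketch under that assumption, mirroring the names in the diff above:

import logging

log = logging.getLogger(__name__)
k = "q0"  # placeholder symbol name

# DEBUG keeps the diagnostic reachable under verbose logging (e.g. TORCH_LOGS)
# without spamming default WARNING-level output.
log.debug("%s is not in var_ranges, defaulting to unknown range.", k)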
Purpose of this PR is to get around this error: #127677
Differential Revision: D58048558
cc @voznesenskym @penguinwu @EikanWang @jgong5 @Guobing-Chen @XiaobingSuper @zhuhaozhe @blzheng @wenzhe-nrv @jiayisunx @peterbell10 @ipiszy @yf225 @chenyang78 @kadeng @muchulee8 @amjames @desertfire @chauhang