make torch._check understand Eq commutativity #125629
Conversation
🔗 Helpful Links
🧪 See artifacts and rendered test results at hud.pytorch.org/pr/125629
Note: Links to docs will display an error until the docs builds have been completed.
✅ You can merge normally! (1 Unrelated Failure)
As of commit 1ee442d with merge base affd7a9.
BROKEN TRUNK - The following job failed but was present on the merge base.
👉 Rebase onto the `viable/strict` branch to avoid these failures.
This comment was automatically generated by Dr. CI and updates every 15 minutes.
This pull request was exported from Phabricator. Differential Revision: D57014730
3ebc3b8 to a7dced0 (Compare)
Summary: Given `torch._check(a == b)` we can still get a data-dependent error needing `b == a`. Simple fix.

```
def forward(self, x1, x2, x3, y):
    z1 = x1.item()
    z2 = x2.item()
    z3 = x3.item()
    torch._check((z2 + z3) == z1)  # torch._check(z1 == (z2 + z3)) didn't work, now does
    if z2 + z3 == z1:
        return y * 2
    else:
        return y + 3
```

Test Plan: none

Differential Revision: D57014730
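As a rough illustration of why the orientation matters, here is a sketch in plain sympy (outside PyTorch); `u0`, `u1`, `u2` stand in for the unbacked symbols produced by the `.item()` calls:

```
import sympy

u0, u1, u2 = sympy.symbols("u0 u1 u2", integer=True)

recorded = sympy.Eq(u0, u1 + u2)  # what torch._check(z1 == (z2 + z3)) records
queried = sympy.Eq(u1 + u2, u0)   # what the later `if z2 + z3 == z1` guard asks about

# The two Eq objects are mathematically equivalent but not structurally identical,
# so a lookup keyed on the exact expression misses the other orientation.
print(recorded == queried)  # False: sympy does not reorder Eq arguments
```

Recording both orientations (or canonicalising them to the same form) lets either spelling of the check hit the stored fact.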
```
add_expr(e)
# Other relational expressions this expression implies
if isinstance(e, sympy.Eq):
    add_expr(sympy.Eq(e.rhs, e.lhs))
```
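A minimal sketch of the idea behind this hunk, with `add_expr` and `known_exprs` as simplified stand-ins for the real ShapeEnv bookkeeping (not the actual implementation):

```
import sympy

known_exprs = set()

def add_expr(e):
    known_exprs.add(e)
    # An equality is symmetric, so also record the swapped orientation,
    # mirroring the `add_expr(sympy.Eq(e.rhs, e.lhs))` line in the diff above.
    if isinstance(e, sympy.Eq):
        known_exprs.add(sympy.Eq(e.rhs, e.lhs))

u0, u1, u2 = sympy.symbols("u0 u1 u2", integer=True)
add_expr(sympy.Eq(u0, u1 + u2))  # from torch._check(z1 == (z2 + z3))

# A later guard may ask about the flipped form; with both orientations stored,
# the lookup now succeeds.
print(sympy.Eq(u1 + u2, u0) in known_exprs)  # True
```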
btw, also noticed that the dualizing above didn't have an effect, I think canonicalizing after dualizing cancels out. maybe something to follow up in the future
For the code above, we get something like `Eq(u0 - u1 - u2, 0)` to evaluate statically with substitutions like `Eq(u1 + u2 - u0, 0)`, which didn't pattern match.
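Roughly what that mismatch looks like in plain sympy (a sketch; the real code works against the ShapeEnv's canonicalised axioms):

```
import sympy

u0, u1, u2 = sympy.symbols("u0 u1 u2", integer=True)

target = sympy.Eq(u0 - u1 - u2, 0)  # the expression that needs static evaluation
known = sympy.Eq(u1 + u2 - u0, 0)   # the substitution that is actually available

# Mathematically equivalent: the two left-hand sides sum to zero...
print(sympy.simplify(target.lhs + known.lhs))  # 0
# ...but as expression trees they differ, so an exact structural match fails.
print(target == known)  # False
```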
Ugh, right. To kill two birds with one stone, what you can do is, in the case of Eq, take the dual of `a = b` as `-a = -b`. The canonicalisation of this expression would give you `b - a = 0`.
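A small sketch of that suggestion, with a hand-rolled `canonicalise` helper standing in for the real one (assumption: canonicalisation moves everything to the left-hand side):

```
import sympy

def canonicalise(rel):
    # Move everything to the left-hand side and expand.
    return sympy.Eq(sympy.expand(rel.lhs - rel.rhs), 0)

a, b = sympy.symbols("a b", integer=True)
eq = sympy.Eq(a, b)

print(canonicalise(eq))                          # Eq(a - b, 0)
# Taking the dual of a = b as -a = -b and canonicalising yields the flipped form.
print(canonicalise(sympy.Eq(-eq.lhs, -eq.rhs)))  # Eq(-a + b, 0), i.e. b - a = 0
```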
Test please.
@lezcano do you want this as is, or would you prefer we land the two birds directly?
Probably the proposed change, as it's also one line.
Summary: Pull Request resolved: pytorch#125629

Given `torch._check(a == b)` we can still get a data-dependent error needing `b == a`. Simple fix.

```
def forward(self, x1, x2, x3, y):
    z1 = x1.item()
    z2 = x2.item()
    z3 = x3.item()
    torch._check((z2 + z3) == z1)  # torch._check(z1 == (z2 + z3)) didn't work, now does
    if z2 + z3 == z1:
        return y * 2
    else:
        return y + 3
```

Test Plan: added test

Reviewed By: ezyang

Differential Revision: D57014730
a7dced0 to 1ee442d (Compare)
@pytorchbot merge -f 'Landed internally' (Initiating merge automatically since Phabricator Diff has merged, using force because this PR might not pass merge_rules.json but landed internally)
Merge started. Your change will be merged immediately since you used the force (-f) flag, bypassing any CI checks (ETA: 1-5 minutes). Learn more about merging in the wiki. Questions? Feedback? Please reach out to the PyTorch DevX Team.
Summary: Given `torch._check(a == b)` we can still get a data-dependent error needing `b == a`. Simple fix.

Differential Revision: D57014730