
Conversation

avikchaudhuri
Contributor

Summary:
Given `torch._check(a == b)` we can still get a data-dependent error needing `b == a`. Simple fix.

```
def forward(self, x1, x2, x3, y):
    z1 = x1.item()
    z2 = x2.item()
    z3 = x3.item()
    torch._check((z2 + z3) == z1)
    # torch._check(z1 == (z2 + z3)) didn't work, now does
    if z2 + z3 == z1:
        return y * 2
    else:
        return y + 3
```
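
A quick way to see the asymmetry (a standalone sympy sketch, not PyTorch's actual code; the symbol names `u0`/`u1`/`u2` mirror the unbacked symbols discussed below): sympy compares relations structurally, so the two spellings of the same equality are distinct objects, and a naive lookup of one will not find the other.

```python
import sympy

u0, u1, u2 = sympy.symbols("u0 u1 u2", integer=True)

# The two orientations of the same fact:
e1 = sympy.Eq(u0, u1 + u2)  # torch._check(z1 == (z2 + z3))
e2 = sympy.Eq(u1 + u2, u0)  # torch._check((z2 + z3) == z1)

print(e1 == e2)           # False: structurally different objects
print(e1.reversed == e2)  # True: same fact, opposite orientation
```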

Differential Revision: D57014730


pytorch-bot bot commented May 6, 2024

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/125629

Note: Links to docs will display an error until the docs builds have been completed.

✅ You can merge normally! (1 Unrelated Failure)

As of commit 1ee442d with merge base affd7a9:

BROKEN TRUNK - The following job failed but was present on the merge base:

👉 Rebase onto the `viable/strict` branch to avoid these failures

This comment was automatically generated by Dr. CI and updates every 15 minutes.

@pytorch-bot pytorch-bot bot added ciflow/inductor release notes: fx release notes category labels May 6, 2024
@facebook-github-bot
Contributor

This pull request was exported from Phabricator. Differential Revision: D57014730

@avikchaudhuri avikchaudhuri requested review from ezyang and lezcano May 6, 2024 20:26
avikchaudhuri added a commit to avikchaudhuri/pytorch that referenced this pull request May 6, 2024
Test Plan: none

Differential Revision: D57014730

```
add_expr(e)
# Other relational expressions this expression implies
if isinstance(e, sympy.Eq):
    add_expr(sympy.Eq(e.rhs, e.lhs))
```
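
The excerpt above is the whole fix: whenever an `Eq` fact is recorded, its mirrored form is recorded too. A simplified standalone sketch (here `facts` and `add_expr` are toy stand-ins for the real bookkeeping in symbolic shapes):

```python
import sympy

# Toy stand-in for the real bookkeeping: a set of known-true
# relational facts that later queries are matched against.
facts = set()

def add_expr(e):
    facts.add(e)
    # Equality is symmetric, so also record the mirrored form;
    # this is the one-line fix from the diff above.
    if isinstance(e, sympy.Eq):
        facts.add(sympy.Eq(e.rhs, e.lhs))

u0, u1, u2 = sympy.symbols("u0 u1 u2", integer=True)
add_expr(sympy.Eq(u1 + u2, u0))  # torch._check((z2 + z3) == z1)

print(sympy.Eq(u0, u1 + u2) in facts)  # True: the reversed query now hits
```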
Contributor Author

Btw, I also noticed that the dualizing above didn't have an effect; I think canonicalizing after dualizing cancels it out. Maybe something to follow up on in the future.

Contributor Author

For the code above, we get something like `Eq(u0 - u1 - u2, 0)` to evaluate statically with substitutions like `Eq(u1 + u2 - u0, 0)`, which didn't pattern match.

Collaborator

Ugh, right. To kill two birds with one stone, what you can do is, in the case of `Eq`, take the dual of `a = b` as `-a = -b`. The canonicalisation of this expression would give you `b - a = 0`.
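
The suggestion can be sketched with a toy canonicaliser that just moves every term to the left-hand side (the real one in symbolic shapes does more normalisation): negating both sides before canonicalising flips the sign of the canonical form, which is exactly the other orientation of the fact.

```python
import sympy

a, b = sympy.symbols("a b", integer=True)

def canonicalize(e):
    # Toy canonicaliser: move everything to the left-hand side.
    # (A stand-in for the real canonicalisation, which does more.)
    return sympy.Eq(sympy.expand(e.lhs - e.rhs), 0)

e = sympy.Eq(a, b)
dual = sympy.Eq(-e.lhs, -e.rhs)  # the suggested dual: -a = -b

c1 = canonicalize(e)     # Eq(a - b, 0)
c2 = canonicalize(dual)  # Eq(b - a, 0): the sign-flipped canonical form
```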

Contributor

@ezyang ezyang left a comment

Test please.

@lezcano do you want this as is, or would you prefer we land the two birds directly?

@pytorch-bot pytorch-bot bot added the ciflow/trunk Trigger trunk jobs on your pull request label May 8, 2024
@lezcano
Collaborator

lezcano commented May 8, 2024

Probably the proposed change, as it's also one line

Summary:
Pull Request resolved: pytorch#125629


Test Plan: added test

Reviewed By: ezyang

Differential Revision: D57014730

@facebook-github-bot
Contributor

@pytorchbot merge -f 'Landed internally'

(Initiating merge automatically since Phabricator Diff has merged, using force because this PR might not pass merge_rules.json but landed internally)

@pytorchmergebot
Collaborator

Merge started

Your change will be merged immediately since you used the force (`-f`) flag, bypassing any CI checks (ETA: 1-5 minutes). Please use `-f` as a last resort and instead consider `-i`/`--ignore-current` to continue the merge ignoring current failures. This will allow currently pending tests to finish and report signal before the merge.

Learn more about merging in the wiki.

Questions? Feedback? Please reach out to the PyTorch DevX Team

Advanced Debugging
Check the merge workflow status here.


Labels

ciflow/inductor, ciflow/trunk, fb-exported, Merged, release notes: fx


5 participants