Conversation

angelayi (Contributor)
Previously we would specialize on the shape in this if-statement


pytorch-bot bot commented May 29, 2025

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/154656

Note: Links to docs will display an error until the docs builds have been completed.

❌ 4 New Failures, 2 Unrelated Failures

As of commit d4c0853 with merge base aa84c03:

NEW FAILURES - The following jobs have failed:

FLAKY - The following job failed but was likely due to flakiness present on trunk:

UNSTABLE - The following job is marked as unstable, possibly due to flakiness on trunk:

This comment was automatically generated by Dr. CI and updates every 15 minutes.

pianpwk (Contributor) commented May 29, 2025
sym_and takes any number of arguments, so you can write this as a single 3-way sym_and.

Also, why are we expanding on equality instead of inequality now? Wouldn't it be guard_or_true(not sym_and(self.shape[0] == dim1, self.shape[1] == dim2, self.shape[2] == dim3))?
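For concreteness, the shape of the suggested condition can be sketched with toy stand-ins for sym_and and guard_or_true. These stubs are simplifications for illustration only: they handle just concrete booleans, whereas the real PyTorch helpers operate on symbolic expressions.

```python
# Toy stand-ins, NOT the real PyTorch helpers; they only mimic the
# behavior on concrete boolean values.
def sym_and(*args):
    # n-ary logical AND, so three equality checks fit in one call.
    result = True
    for a in args:
        result = result and a
    return result

def guard_or_true(expr):
    # On a concrete value, just return it; on an undecidable symbolic
    # expression, the real helper falls back to True instead of guarding.
    return bool(expr)

shape = (2, 3, 4)
dim1, dim2, dim3 = 2, 3, 5

# The suggested 3-way form: expand whenever the shapes may differ.
needs_expand = guard_or_true(not sym_and(shape[0] == dim1,
                                         shape[1] == dim2,
                                         shape[2] == dim3))
print(needs_expand)  # True here, since shape[2] != dim3
```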

angelayi (Contributor, Author)

oh um.. you're right

angelayi (Contributor, Author)

oh sorry, I think I forgot the not... but guard_or_true didn't work for me

laithsakka (Contributor) commented May 29, 2025

You probably want sym_eq here; this should work for your test case:

    if not guard_or_false(sym_eq(self.shape, (dim1, dim2, dim3))):
        self = self.expand((dim1, dim2, dim3))

Is this what you want here? I verified this works on your example, but I'm not sure whether it has other consequences. Does calling self.expand((dim1, dim2, dim3)) in theory work when self.shape == (dim1, dim2, dim3)?

What did not work with guard_or_true?
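The key property of this pattern can be sketched with concrete stand-ins (the real sym_eq and guard_or_false live in PyTorch's symbolic-shapes machinery; the stubs below are hypothetical simplifications): guard_or_false falls back to False when the comparison can't be decided, so an unknown shape takes the expand branch, which is the safe default because expand is a no-op when the sizes already match.

```python
UNKNOWN = object()  # stand-in for an undecidable symbolic boolean

def sym_eq(a, b):
    # Element-wise tuple equality; UNKNOWN if any element is unknown.
    if any(x is UNKNOWN or y is UNKNOWN for x, y in zip(a, b)):
        return UNKNOWN
    return a == b

def guard_or_false(expr):
    # Falls back to False instead of guarding when undecidable.
    return False if expr is UNKNOWN else bool(expr)

# Known-equal shapes: the condition is True, so expand is skipped.
skip_expand = guard_or_false(sym_eq((2, 3, 4), (2, 3, 4)))

# Undecidable shape: falls back to False, so the expand branch is
# taken, which is safe even if the sizes turn out to already match.
take_expand = not guard_or_false(sym_eq((2, UNKNOWN, 4), (2, 3, 4)))

print(skip_expand, take_expand)  # True True
```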

laithsakka (Contributor)

This should also work:

    if guard_or_true(sym_eq(self.shape, (dim1, dim2, dim3)) == False):
        self = self.expand((dim1, dim2, dim3))

I think we need a sym_not; a plain not will specialize.
cc @pianpwk

laithsakka (Contributor) commented May 29, 2025

It could be:

    def sym_not(a):
        return a == False

Then you can do:

    if guard_or_true(sym_not(sym_eq(self.shape, (dim1, dim2, dim3)))):
        self = self.expand((dim1, dim2, dim3))
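Why the == False trick keeps things symbolic while a plain not does not can be sketched with a minimal symbolic-boolean class (hypothetical, for illustration; real SymBool is far richer): == dispatches to __eq__, which can return another symbolic node, whereas not forces __bool__ and must produce a concrete answer, which is where specialization happens.

```python
class SymBool:
    """Minimal stand-in for a symbolic boolean node."""
    def __init__(self, expr):
        self.expr = expr

    def __eq__(self, other):
        # Stays symbolic: builds a new node instead of evaluating.
        return SymBool(f"Eq({self.expr}, {other})")

    def __bool__(self):
        # `not x` / `if x:` land here and force a concrete answer,
        # which is where specialization would occur.
        raise RuntimeError("would specialize: symbolic value forced to bool")

def sym_not(a):
    # `a == False` routes through __eq__, so the result stays symbolic.
    return a == False

s = SymBool("u0 == 5")
print(sym_not(s).expr)   # Eq(u0 == 5, False) -- still a symbolic node
try:
    not s                # a plain `not` forces __bool__
except RuntimeError as e:
    print(e)
```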

angelayi (Contributor, Author)

> What did not work with guard_or_true?

I think originally doing guard_or_true(not sym_and(self.shape[0] == dim1, self.shape[1] == dim2, self.shape[2] == dim3)) didn't work because of the not in the middle of the expression. Your above examples all work!

angelayi (Contributor, Author)

turns out there already exists torch.sym_not :D

@angelayi angelayi force-pushed the angelayi/baddbmm branch from 8e1dcf5 to 5975843 Compare May 29, 2025 21:20
@angelayi angelayi added the topic: not user facing topic category label May 29, 2025
laithsakka (Contributor) left a comment

left some comments.

@angelayi angelayi force-pushed the angelayi/baddbmm branch from 859fc59 to d4c0853 Compare June 2, 2025 07:01
@angelayi angelayi added the ciflow/trunk Trigger trunk jobs on your pull request label Jun 2, 2025
angelayi (Contributor, Author) commented Jun 2, 2025

@pytorchbot merge -f "can repro failures on main"

pytorchmergebot (Collaborator)

Merge started

Your change will be merged immediately since you used the force (-f) flag, bypassing any CI checks (ETA: 1-5 minutes). Please use -f as last resort and instead consider -i/--ignore-current to continue the merge ignoring current failures. This will allow currently pending tests to finish and report signal before the merge.

Learn more about merging in the wiki.

Questions? Feedback? Please reach out to the PyTorch DevX Team

Advanced Debugging: check the merge workflow status here

iupaikov-amd pushed a commit to ROCm/pytorch that referenced this pull request Jun 4, 2025
Previously we would specialize on the shape in this if-statement
Pull Request resolved: pytorch#154656
Approved by: https://github.com/pianpwk
@github-actions github-actions bot deleted the angelayi/baddbmm branch July 4, 2025 02:21

Labels

ciflow/inductor ciflow/trunk Trigger trunk jobs on your pull request Merged topic: not user facing topic category
