matrix_exp: Make sure _compute_linear_combinations result preserves dim of the input. #81330
Conversation
✅ No Failures (0 Pending) as of commit 036e1ab (more details on the Dr. CI page).
💚 Looks good so far! There are no failures yet. 💚
This comment was automatically generated by Dr. CI. Please report bugs/suggestions to the (internal) Dr. CI Users group.
There was the pesky squeeze that was missing. Thank you @nikitaved!
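To illustrate the kind of fix being discussed, here is a hedged sketch, not the actual `_compute_linear_combinations` implementation: the helper name, coefficient layout, and shapes are assumptions. The point is only that a linear combination built from a stacked buffer can pick up a leading dim of size 1, and a missing squeeze there changes the output's dim relative to the input.

```python
# Illustrative sketch only -- not the real PyTorch internals.
import torch

def linear_combination(stacked, coeffs):
    # stacked: (k, *batch, n, n) matrix powers, coeffs: (1, k) weights
    k = stacked.size(0)
    out = (coeffs @ stacked.reshape(k, -1)).reshape(1, *stacked.shape[1:])
    return out.squeeze(0)  # without this squeeze, out.dim() == input.dim() + 1

A = torch.randn(3, 4, 4)                              # batched input
powers = torch.stack([torch.eye(4).expand_as(A), A, A @ A])
coeffs = torch.tensor([[1.0, 1.0, 0.5]])              # shape (1, 3)
result = linear_combination(powers, coeffs)
assert result.dim() == A.dim()                        # dim of the input is preserved
```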
Could you add a short test to make sure that we're fixing the issue?
The cases that trigger this warning are already in …
Yup, could you add one of those wrappers that make the test fail if a warning is thrown around those operations?
I seem unable to capture this warning with OpInfo, for example:
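(The original snippet was not preserved; the following is a hedged reconstruction of the kind of attempt being described. The OpInfo name "linalg.matrix_exp", the device, and the dtype are assumptions, not the exact code from the comment.)

```python
# Hedged reconstruction, not the original snippet: iterate over the OpInfo
# sample inputs and record any warnings the op raises for each of them.
import warnings
import torch
from torch.testing._internal.common_methods_invocations import op_db

op = next(o for o in op_db if o.name == "linalg.matrix_exp")  # name is an assumption
for sample in op.sample_inputs("cpu", torch.float64):
    with warnings.catch_warnings(record=True) as caught:
        warnings.simplefilter("always")
        op.op(sample.input, *sample.args, **sample.kwargs)
    print(sample.input.shape, [str(w.message) for w in caught])
```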
This is done with …
What I was suggesting was to add a short test in …
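A minimal sketch of such a regression test follows. The file placement, test-case name, and the batched shape are placeholders, not the exact reproducer from #80948.

```python
# Hypothetical regression test sketch: fails if matrix_exp emits any warning
# or changes the dimensionality of its input.
import unittest
import warnings
import torch

class TestMatrixExpRegression(unittest.TestCase):
    def test_matrix_exp_no_warning_and_dim_preserved(self):
        a = torch.randn(3, 4, 4, dtype=torch.float64)  # illustrative batched input
        with warnings.catch_warnings():
            warnings.simplefilter("error")             # turn any warning into a failure
            out = torch.matrix_exp(a)
        self.assertEqual(out.shape, a.shape)

if __name__ == "__main__":
    unittest.main()
```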
6aefb35 to 49bbc36 (Compare)
49bbc36 to 036e1ab (Compare)
with freeze_rng_state():
    torch.manual_seed(42)
Not sure this is strictly necessary, but I decided to play it extra safe.
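For context, a minimal sketch of the seeding pattern under discussion (assuming the test only needs deterministic inputs inside the block): `freeze_rng_state` saves and restores the global RNG state, so the `manual_seed(42)` inside it cannot leak into other tests.

```python
# Sketch of the pattern above; the tensor shape is illustrative.
import torch
from torch.testing._internal.common_utils import freeze_rng_state

before = torch.random.get_rng_state()
with freeze_rng_state():
    torch.manual_seed(42)          # deterministic inputs inside the block only
    a = torch.randn(4, 4, dtype=torch.float64)
    torch.matrix_exp(a)
after = torch.random.get_rng_state()
assert torch.equal(before, after)  # global RNG state is untouched outside the block
```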
Good stuff. Thank you Nik!
@pytorchbot merge -g
@pytorchbot successfully started a merge job. Check the current status here.
Merge failed due to Matched rule superuser, but PR #81330 was not reviewed yet by any of: nimin98, fbbradheintz, kavoor, lc0, shz117, ...
No one can say I didn't try. cc @mruberry
Stamped!
@pytorchbot merge -g
@pytorchbot successfully started a merge job. Check the current status here.
Hey @nikitaved.
I think the problem here is how we parse merge rules, i.e. the following line (pytorch/.github/merge_rules.json, line 102 in 31142f5) would match any LinearAlgebra file in a native/ subfolder, but not a LinearAlgebra file in the native/ folder itself.
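A small illustration of the mismatch. This is hedged: the pattern below is a hypothetical stand-in for the merge_rules.json entry at line 102, and the matcher is assumed to behave like Python's fnmatch rather than a path-aware glob.

```python
# Hypothetical pattern standing in for the merge_rules.json entry. With a
# non-glob matcher such as fnmatch, the "**/" segment must consume a literal
# "/", so files directly under native/ fail to match while files in
# subfolders do match.
from fnmatch import fnmatch

pattern = "aten/src/ATen/native/**/*LinearAlgebra*"
print(fnmatch("aten/src/ATen/native/cuda/LinearAlgebra.cu", pattern))  # True
print(fnmatch("aten/src/ATen/native/LinearAlgebra.cpp", pattern))      # False
```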
matrix_exp: Make sure _compute_linear_combinations result preserves dim of the input. (#81330)
Summary: Fixes #80948. Pull Request resolved: #81330. Approved by: https://github.com/Lezcano, https://github.com/mruberry
Test Plan: contbuild & OSS CI, see https://hud.pytorch.org/commit/pytorch/pytorch/85144e63a92b4a7df0c49bfc5797f24c4eeb831c
Reviewed By: DanilBaibak
Differential Revision: D37813332
Pulled By: DanilBaibak
fbshipit-source-id: f9bccbbd57fb7eea586ce3f564023879e16c29d6
… (#81414)
Summary: #81330 (comment) shouldn't have happened and should have matched the linalg rule. The reason it happened is that we don't treat the patterns like globs. We may follow this up with a BE change to treat the patterns more like globs, which I expected originally.
Pull Request resolved: #81414. Approved by: https://github.com/malfet
Test Plan: contbuild & OSS CI, see https://hud.pytorch.org/commit/pytorch/pytorch/c67a4d8a65ade60eff0f394c74c33b55d9419741
Reviewed By: jeanschmidt
Differential Revision: D37990686
Pulled By: janeyx99
fbshipit-source-id: 89cd472c2b3bc352937906bbd27d883e0317d290
Fixes #80948.