bitwise_and: Port to structured #60813
Conversation
💊 CI failures summary and remediations
As of commit 8817a0e (more details on the Dr. CI page and at hud.pytorch.org/pr/60813):
🕵️ 2 new failures recognized by patterns. The following CI failures do not appear to be due to upstream breakages:
|
@ezyang has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator. |
This one is gonna need an XLA update. |
@ezyang This PR seems to remove the type promotion constraint for bitwise_and, which xla happens to have a test on in https://github.com/pytorch/xla/blob/master/test/test_operations.py#L1208. Is this intentional? Should pt/xla follow the same guidance and remove this kind of type promotion constraint? |
No, I'm guessing this is unintentional, and not exercised in any of PyTorch's native tests. Thanks for identifying the issue. |
Oof but I don't understand how this PR could cause the behavior change :( |
OK, actually this is pretty weird.
This seems wrong. The reason this happens is because of how the output used to be allocated in the functional case: bitwise_and always took on the type of its first argument, but casting was not disabled, so the other input was cast to match. NumPy is willing to promote both ways, so the new behavior actually seems correct. @Freey0 as a follow-up we should add tests for this bug; a minimal sketch of the difference is below.
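A minimal libtorch sketch of the promotion difference described above; the uint8/int16 operand pair is hypothetical, chosen only to make the promotion visible:

```cpp
#include <torch/torch.h>
#include <iostream>

int main() {
  // Hypothetical mixed-dtype operands for illustration.
  auto a = torch::tensor({255}, torch::kUInt8);
  auto b = torch::tensor({3855}, torch::kInt16);

  // Old functional path: the output was allocated with the dtype of the
  // first argument (uint8 here), and the other input was cast to it.
  // New structured path: normal type promotion, matching NumPy, so the
  // result dtype is int16 regardless of argument order.
  auto out = torch::bitwise_and(a, b);
  std::cout << out.dtype() << "\n";  // prints the promoted dtype (int16)
  return 0;
}
```
|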
Ok, I will add a test for this bug in the new PR |
@JackCaoG what do you think of this analysis? |
@ezyang I think it is fair. The original comment on the xla test (by Ailing) also mentioned that this behavior is not aligned with numpy and might change in the future. I guess now is the future 😄. I will update xla to remove this constraint. |
@ezyang I think I fixed the XlaOp casting part to allow this type promotion, but a check that I borrowed from PyTorch failed.
I think this is part of the functional case you mentioned. How does a structured binary op perform type checking for its inputs and output? Does pt/xla need to care about the type checking in this case (not sure if it is always done by the PyTorch structured op)? |
For now, still yes, because you are writing regular XLA kernels and not structured XLA kernels (that was along the lines of what @bdhirsh was working on).
The meta function seems to imply it's normal type promotion. If you implement this the same way that, e.g., add is implemented, does that give you the correct behavior? A sketch of the structured pattern in question follows.
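For reference, here is a sketch of the structured binary-op pattern referred to above, reconstructed from the general pattern used by ops like add in aten/src/ATen/native/BinaryOps.cpp rather than copied from this PR, so treat it as illustrative:

```cpp
// Sketch of a structured binary-op port (general pattern, not verbatim PR code).
namespace at {
namespace meta {

// The meta function runs first: build_binary_op configures a TensorIterator,
// which computes the common dtype via normal type promotion and verifies
// that any user-supplied `out` tensor can hold the promoted result.
TORCH_META_FUNC2(bitwise_and, Tensor) (const Tensor& self, const Tensor& other) {
  build_binary_op(maybe_get_output(), self, other);
}

} // namespace meta

namespace native {

// The impl function runs after the meta function has done all dtype
// checking and output allocation; it only dispatches the device kernel.
TORCH_IMPL_FUNC(bitwise_and_out) (const Tensor& self, const Tensor& other, const Tensor& result) {
  bitwise_and_stub(device_type(), *this);
}

} // namespace native
} // namespace at
```
|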
Differential Revision: [D29449374](https://our.internmc.facebook.com/intern/diff/D29449374)
Also add type promotion test for bugs found by PR #60813
@ezyang has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator. |
Also add type promotion test for bugs found by PR #60813. Differential Revision: [D29592840](https://our.internmc.facebook.com/intern/diff/D29592840)
@ezyang has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator. |
Stack from ghstack:
Differential Revision: D29449374