Fix issue with any/all when all reduce dimensions of input have size 1 #2590
Conversation
LGTM with using xla::Zeros()
Cannot repro the TestNNDeviceTypeXLA.test_GroupNorm_empty_xla failure; will take another look if it fails again.
The test failure might be because the pinned pytorch_pr needs a rebase. I manually applied the patch on the pytorch head, ran all tests with this PR, and all tests passed.
Use the
All tests passed locally.
This is to fix #2585. Since pytorch PR pytorch/pytorch#44790, `any` and `all` can take non-boolean input. With our current lowering, `reduce` won't do anything if all reduce dimensions of the input have size 1. Add an `xla::Select` when all reduce dimensions have size 1 to force the result value to be `1` or `0`.
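A minimal sketch of the failure mode and the fix, using NumPy as a stand-in for the XLA lowering (the `naive_any`/`fixed_any` helpers and the max-based combiner are illustrative assumptions, not the actual pytorch/xla code): when every reduced dimension has size 1, the reduce has nothing to combine, so the raw input element (e.g. `5`) passes through; a final select against zero coerces the result to `1` or `0`.

```python
import numpy as np

def naive_any(x, dims):
    # Reduce-based `any` with an or-style combiner (max of non-negative
    # values stands in here). When all reduced dims have size 1, there is
    # nothing to combine, so the raw element leaks through unchanged.
    out = x
    for d in sorted(dims, reverse=True):
        out = np.maximum.reduce(out, axis=d)
    return out

def fixed_any(x, dims):
    # Sketch of the fix: select 1/0 from the reduced value, mirroring the
    # xla::Select this PR adds (assumption: coercion is `value != 0`).
    out = naive_any(x, dims)
    return np.where(out != 0, 1, 0)

x = np.array([[5]])               # all reduce dimensions have size 1
print(naive_any(x, [0, 1]))       # 5 — non-boolean input value leaks out
print(fixed_any(x, [0, 1]))       # 1 — forced to a boolean-style result
```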