fix: Ivy Failing Test: paddle - activations.relu for all backend #28233
Conversation
PR Compliance Checks
Thank you for your Pull Request! We have run several checks on this pull request in order to make sure it's suitable for merging into this project. The results are listed in the following section.
Issue Reference
In order to be considered for merging, the pull request description must refer to a specific issue number. This is described in our contributing guide and our PR template.
This check is looking for a phrase similar to: "Fixes #XYZ" or "Resolves #XYZ" where XYZ is the issue number that this PR is meant to address.
handle the complex dtype at paddle backend.
I did as you asked and tested again; I found that the error for complex numbers happens in cases where …
No, we should not remove the jax from …
@samthakur587 I did the fix and found the root cause of the problem: the complex128 dtype, which I think is not supported in paddle.
Hi @fleventy-5, can you please resolve the merge conflict so the CI can run?
@samthakur587 done
Before:

def relu(x: paddle.Tensor, /, *, out: Optional[paddle.Tensor] = None) -> paddle.Tensor:

After:

def relu(
    x: paddle.Tensor, /, *, complex_mode="jax", out: Optional[paddle.Tensor] = None
) -> paddle.Tensor:
    if paddle.is_complex(x):
import ivy
import paddle
import jax

x = paddle.to_tensor([1 - 1j, 1 + 1j])
y = jax.numpy.array([1 - 1j, 1 + 1j])

print(ivy.relu(x))  # paddle output: ivy.array([1.+0.j, 1.+1.j])
print(ivy.relu(y))  # jax output: ivy.array([1.-1.j, 1.+1.j])

The paddle backend's logic for the complex dtype is return paddle.complex(F.relu(x.real()), F.relu(x.imag())).
It doesn't pass this case because when the x.imag() part is negative, applying the ReLU
activation function to it returns zero.
That is the wrong logic for the complex dtype in the paddle backend.
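To make the mismatch concrete, here is a minimal sketch in plain Python (no paddle or jax required; the function names are illustrative, not part of either library) contrasting the two complex ReLU conventions discussed above:

```python
def relu_per_component(z: complex) -> complex:
    # Old paddle-backend logic: apply ReLU to the real and imaginary
    # parts independently, i.e. relu(re) + relu(im) * 1j.
    return complex(max(z.real, 0.0), max(z.imag, 0.0))

def relu_lexicographic(z: complex) -> complex:
    # jax-style logic: jax.numpy.maximum compares complex numbers
    # lexicographically (real part first, then imaginary part), so
    # relu(z) = max(z, 0) either keeps the whole value or zeroes it.
    return z if (z.real, z.imag) > (0.0, 0.0) else 0j

# For 1 - 1j, the negative imaginary part is clipped by the old logic,
# while the jax-style version keeps the value intact:
print(relu_per_component(1 - 1j))   # (1+0j)
print(relu_lexicographic(1 - 1j))   # (1-1j)
```

This reproduces the disagreement shown in the snippet above for 1 - 1j, which is why per-component ReLU is the wrong convention when the backends are expected to match jax.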
Hi @fleventy-5, sorry for the late reply. I have added comments on what changes are required.
@samthakur587 I've done the implementation in the paddle backend in the code below,
but there is still an issue with complex128.
Hi @fleventy-5
Thanks for the PR. Merging this now.
@samthakur587 Could you fix the value mismatches for some of the complex value cases that you outlined in paddle? :)
Thank you both
PR Description
500 test cases passed locally for all backends.
Related Issue
Closes #28232
Checklist