
fix: Ivy Failing Test: paddle - activations.relu for all backend #28233

Merged: 6 commits into ivy-llc:main, Feb 20, 2024

Conversation

@marvlyngkhoi (Contributor) commented Feb 9, 2024

PR Description

500 test cases passed locally for all backends.
(Screenshot from 2024-02-09 18-10-42)

Related Issue

Closes #28232

Checklist

  • Did you add a function?
  • Did you add the tests?
  • Did you run your tests and are your tests passing?
  • Did pre-commit not fail on any check?
  • Did you follow the steps we provided?

@github-actions bot left a comment

PR Compliance Checks

Thank you for your Pull Request! We have run several checks on this pull request in order to make sure it's suitable for merging into this project. The results are listed in the following section.

Issue Reference

In order to be considered for merging, the pull request description must refer to a specific issue number. This is described in our contributing guide and our PR template.
This check is looking for a phrase similar to: "Fixes #XYZ" or "Resolves #XYZ" where XYZ is the issue number that this PR is meant to address.

@samthakur587 (Contributor) left a comment

Handle the complex dtype in the paddle backend.

@samthakur587 samthakur587 self-assigned this Feb 9, 2024
@marvlyngkhoi (Contributor, Author) commented Feb 10, 2024

I did as you asked and tested again. I found that the error for the complex numbers happens in cases where complex_mode=st.sampled_from(["jax","split","magnitude"]) is set to "jax" in the main ivy test_activation.py file. Should I add a check in the paddle backend, or should I just remove "jax" from the list of strategies to sample from? @samthakur587

@samthakur587 (Contributor) commented:

> I did as you asked and tested again. I found that the error for the complex numbers happens in cases where complex_mode=st.sampled_from(["jax","split","magnitude"]) is set to "jax" in the main ivy test_activation.py file. Should I add a check in the paddle backend, or should I just remove "jax" from the list of strategies to sample from? @samthakur587

No, we should not remove "jax" from complex_mode=st.sampled_from(["jax","split","magnitude"]). According to _handle_complex_input, which handles the complex data, if complex_mode is "jax" then the function is called via the backend (here paddle), invoking its relu implementation for complex dtypes. As far as I have seen, we don't have a correct implementation for complex dtypes in the paddle backend, so we should try to fix that. @fleventy-5
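For context, the dispatch described above can be sketched in plain Python. This is a hypothetical, simplified stand-in for ivy's _handle_complex_input, using NumPy instead of paddle; the names relu, handle_complex_input, and relu_frontend here are illustrative only, not ivy's actual API:

```python
import numpy as np

def relu(x):
    # Backend relu. np.maximum orders complex numbers lexicographically
    # (real part first, then imaginary part), which happens to match
    # jax-style complex relu, so this works for real and complex input.
    return np.maximum(x, 0)

def handle_complex_input(fn):
    # Hypothetical, simplified analogue of ivy's _handle_complex_input:
    # "split" applies fn to the real and imaginary parts separately, while
    # "jax" hands the complex array straight to the backend fn, which must
    # then implement jax-compatible complex behaviour itself.
    def wrapper(x, *, complex_mode="jax"):
        if np.iscomplexobj(x) and complex_mode == "split":
            return fn(x.real) + 1j * fn(x.imag)
        return fn(x)  # "jax" mode: the backend fn sees the complex input
    return wrapper

relu_frontend = handle_complex_input(relu)

# "jax" mode keeps 1-1j intact; "split" mode zeros its imaginary part
print(relu_frontend(np.array([1 - 1j, 1 + 1j])))
print(relu_frontend(np.array([1 - 1j, 1 + 1j]), complex_mode="split"))
```

This is why removing "jax" from the sampled modes would only hide the bug: in "jax" mode the backend function itself is responsible for correct complex behaviour.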

@marvlyngkhoi (Contributor, Author) commented:

@samthakur587 Did the fix and found the root cause of the problem: the complex128 dtype, which I think is not supported in paddle.
500 cases passed when tested locally for the paddle backend.
(Screenshot from 2024-02-10 21-24-04)
Also tested 200 cases for all backends, just to be sure the fix didn't break the other backends.
(Screenshot from 2024-02-10 21-31-16)

@samthakur587 (Contributor) commented:

Hi @fleventy-5, can you please resolve the merge conflict so the CI can run?

@samthakur587 samthakur587 changed the title Fix Ivy Failing Test: paddle - activations.relu for all backend fix: Ivy Failing Test: paddle - activations.relu for all backend Feb 10, 2024
@marvlyngkhoi (Contributor, Author) commented:

> Hi @fleventy-5, can you please resolve the merge conflict so the CI can run?

@samthakur587 done

The review comment below targets this diff:

    -def relu(x: paddle.Tensor, /, *, out: Optional[paddle.Tensor] = None) -> paddle.Tensor:
    +def relu(
    +    x: paddle.Tensor, /, *, complex_mode="jax", out: Optional[paddle.Tensor] = None
    +) -> paddle.Tensor:
    +    if paddle.is_complex(x):
A Contributor commented on these lines:

    import ivy
    import paddle
    import jax

    x = paddle.to_tensor([1 - 1j, 1 + 1j])
    y = jax.numpy.array([1 - 1j, 1 + 1j])

    # paddle output: ivy.array([1.+0.j, 1.+1.j])
    print(ivy.relu(x))
    # jax output: ivy.array([1.-1.j, 1.+1.j])
    print(ivy.relu(y))

The logic for complex dtypes in paddle is return paddle.complex(F.relu(x.real()), F.relu(x.imag())). It doesn't pass this case because the x.imag() part is negative, and applying the relu activation function to it returns zero. That is the wrong logic for the paddle backend for complex dtypes.
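The mismatch can be reproduced without paddle at all. The following NumPy sketch (illustrative only, not ivy's code) contrasts the per-component logic with jax-style behaviour, under which a complex number is kept if and only if it is lexicographically greater than zero (real part first, then imaginary part):

```python
import numpy as np

z = np.array([1 - 1j, 1 + 1j, -1 + 1j])

# jax-style: keep z when (real, imag) > (0, 0) lexicographically, else 0
jax_style = np.where((z.real > 0) | ((z.real == 0) & (z.imag > 0)), z, 0)

# per-component: relu applied to the real and imaginary parts independently
per_component = np.maximum(z.real, 0) + 1j * np.maximum(z.imag, 0)

# jax-style keeps 1-1j and zeros -1+1j entirely; per-component turns
# 1-1j into 1+0j and leaves a spurious 0+1j behind for -1+1j
print(jax_style)
print(per_component)
```

The first element is exactly the 1-1j case flagged in the review comment above.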

@samthakur587 (Contributor) left a comment

Hi @fleventy-5, sorry for the late reply. I have added a comment describing the required changes.

@marvlyngkhoi (Contributor, Author) commented Feb 18, 2024

@samthakur587 I've done the implementation in the paddle backend in the code below:

    def relu(
        x: paddle.Tensor, /, *, complex_mode="jax", out: Optional[paddle.Tensor] = None
    ) -> paddle.Tensor:
        x_dtype = str(x.dtype).split(".")[1]
        if paddle.is_complex(x):
            if complex_mode == "jax":
                x_real = paddle.real(x)
                x_imag = paddle.imag(x)

                # zero the imaginary part wherever the input would be
                # clamped to zero under jax-style (lexicographic) relu
                condition = paddle.logical_or(
                    paddle.logical_and(x_real <= 0, x_imag < 0), x_real < 0
                )

                x_imag = paddle.where(condition, paddle.zeros_like(x_imag), x_imag)
                return paddle.complex(F.relu(x_real), x_imag)

            return paddle.complex(F.relu(x.real()), F.relu(x.imag()))
        return F.relu(x)

But there is still an issue with complex128:
(Screenshot from 2024-02-18 17-29-21)
When I tried the function by itself, in isolation, using the above definition, it worked fine, as shown below:
(Screenshot from 2024-02-18 17-26-48)
It was also working fine for complex64.
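The masking logic in the snippet above can be sanity-checked against the lexicographic rule in plain NumPy. This is a hedged sketch for verification only; the function names are illustrative, and this is not the paddle implementation itself:

```python
import numpy as np

def relu_complex_jax(z):
    # reference: keep z when it is lexicographically greater than 0
    keep = (z.real > 0) | ((z.real == 0) & (z.imag > 0))
    return np.where(keep, z, 0)

def relu_complex_masked(z):
    # mirrors the paddle snippet: relu the real part, and zero the
    # imaginary part when (real <= 0 and imag < 0) or real < 0
    cond = ((z.real <= 0) & (z.imag < 0)) | (z.real < 0)
    imag = np.where(cond, 0.0, z.imag)
    return np.maximum(z.real, 0) + 1j * imag

# exhaustive check over a small grid of real/imag sign combinations
vals = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])
re, im = np.meshgrid(vals, vals)
z = (re + 1j * im).ravel()
print(np.allclose(relu_complex_jax(z), relu_complex_masked(z)))
```

The two agree on every sign combination, which suggests the remaining complex128 failure is a dtype-support issue in paddle rather than a flaw in the masking condition.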

@Ishticode (Contributor) left a comment

Hi @fleventy-5,
Thanks for the PR. Merging this now.
@samthakur587 Could you fix the value mismatches for some of the complex value cases that you outlined in paddle? :)
Thank you both.

@Ishticode Ishticode merged commit 8a45528 into ivy-llc:main Feb 20, 2024
129 of 141 checks passed
Kacper-W-Kozdon pushed a commit to Kacper-W-Kozdon/ivy that referenced this pull request Feb 22, 2024
@marvlyngkhoi marvlyngkhoi deleted the fix_ivy_activation_relu branch March 1, 2024 12:08
Linked issue: Fix Ivy Failing Test: paddle - activations.relu
5 participants