
PReLU Activation #247

Closed
enver1323 opened this issue Dec 19, 2022 · 3 comments
Labels
feature New feature

Comments

@enver1323
Contributor

Hi @patrick-kidger, I am a big fan of Equinox, thank you for publishing this framework. I am currently working on a project and decided to give PReLU a try. Therefore, I was wondering if it would be possible to add a PReLU activation function to a module such as nn.activations.

My current implementation is as follows:

import equinox as eqx
import jax.numpy as jnp
from jaxtyping import Array

class PReLU(eqx.Module):
    """Parametric ReLU: identity for x >= 0, learnable slope for x < 0."""

    negative_slope: Array

    def __init__(self, alpha):
        # Store the initial slope as a trainable array leaf of the module.
        self.negative_slope = jnp.asarray((alpha,))

    def __call__(self, x):
        return jnp.where(x >= 0, x, self.negative_slope * x)
@patrick-kidger
Owner

Your implementation looks good to me. I'd be happy to accept a PR adding this to Equinox.

@enver1323
Contributor Author

Thank you, @patrick-kidger! I opened PR #249 with the implementation and docs.

@patrick-kidger
Owner

Closing since this now exists on the dev branch, which will shortly be merged to main.
