
Wrong derivative of leaky-relu activation #15

Open
sjh11556 opened this issue Apr 5, 2021 · 1 comment
sjh11556 commented Apr 5, 2021

According to the mod_activation.F90 file, it seems that the derivative function of leaky_relu is the same as that of ReLU: the where block below never uses alpha, and the elsewhere branch returns 0 instead of the alpha slope.

  pure function leaky_relu_prime(x, alpha) result(res)
    ! First derivative of the REctified Linear Unit (RELU) activation function.
    real(rk), intent(in) :: x(:)
    real(rk), intent(in) :: alpha
    real(rk) :: res(size(x))
    where (0.3 * x > 0)
      res = 1
    elsewhere
      res = 0
    end where
  end function leaky_relu_prime
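
For comparison, a minimal sketch of what the corrected derivative could look like, keeping the same interface and the module's rk kind parameter (an illustration only, not the fix adopted upstream):

  pure function leaky_relu_prime(x, alpha) result(res)
    ! First derivative of the leaky ReLU activation function:
    ! 1 for x > 0, alpha otherwise.
    real(rk), intent(in) :: x(:)
    real(rk), intent(in) :: alpha
    real(rk) :: res(size(x))
    where (x > 0)
      res = 1
    elsewhere
      res = alpha
    end where
  end function leaky_relu_prime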
jordanott self-assigned this Apr 16, 2021
milancurcic commented:

FWIW neural-fortran-0.12.0 implements leaky ReLU.
