
Add ELU, PReLU, ThresholdedReLU layers #32

Merged
merged 4 commits into from
Jul 10, 2017

Conversation

plavin (Contributor) commented on Jul 10, 2017

This is based on the Keras documentation here. The PReLU layer in Keras supports more features that we can discuss adding in the future (e.g., weight initialization function, regularization).
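
For reference, a minimal scalar sketch of the element-wise definitions these layers implement, following the Keras docs; the standalone functions and their names are illustrative only, since the actual layers operate on arrays.

#include <cmath>

// ELU: identity for positive inputs, alpha * (exp(x) - 1) otherwise
double elu(double x, double alpha) { return x > 0.0 ? x : alpha * (std::exp(x) - 1.0); }

// PReLU: identity for positive inputs, alpha * x otherwise (alpha is a learned parameter)
double prelu(double x, double alpha) { return x > 0.0 ? x : alpha * x; }

// ThresholdedReLU: identity above the threshold theta, zero otherwise
double thresholded_relu(double x, double theta) { return x > theta ? x : 0.0; }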

{
    auto tmp = max(input, 0.0);
    auto res = expandAs(m_parameters[0], tmp) * tmp;
    // TODO: Determine if doing the max after the mul is preferable

Member commented:
Those would be two different behaviors: doing the max after the multiply behaves differently when input and m_parameters are both negative. Can you double-check what the exact behavior should be?
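
A minimal scalar sketch of the difference being pointed out, using hypothetical values (the layer itself operates on arrays): when both the input and the learned parameter are negative, the two orderings disagree.

#include <algorithm>
#include <cstdio>

int main() {
    double input = -2.0, alpha = -0.5;                 // both negative
    double max_first = alpha * std::max(input, 0.0);   // alpha * max(x, 0) = -0.5 * 0  = 0
    double max_after = std::max(alpha * input, 0.0);   // max(alpha * x, 0) = max(1, 0) = 1
    std::printf("max before mul: %g, max after mul: %g\n", max_first, max_after);
    return 0;
}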

{
    auto mask = input > m_threshold;
    return input * mask;
}

Member commented:
This should probably just be renamed to Threshold?

FloopCZ commented:
I believe Threshold may be misleading, since threshold usually denotes a function R -> {0,1} (e.g., return input > m_threshold;). ThresholdedReLU or ThresholdReLU seems a better fit to me.
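
A minimal sketch of the distinction, reusing the names from the excerpt above:

// A plain threshold function (R -> {0, 1}) would return only the mask:
//     return input > m_threshold;
// ThresholdedReLU instead passes the input through wherever it exceeds the threshold:
//     auto mask = input > m_threshold;
//     return input * mask;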

Member commented:
ThresholdReLU sounds better. Thanks @FloopCZ

};
return Variable(result, {lhs, rhs, mask}, grad_func);
}

A reviewer commented:
This function should be indented by four spaces.

auto res = max(input, m_alpha * (exp(input) - 1));
return res;

auto mask = input >= 0.0;
return (mask * input) + (!mask * m_alpha * exp(input));

Member commented:
Shouldn't this be (!mask * m_alpha * (exp(input) - 1)) ?
https://keras.io/layers/advanced-activations/#elu
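
For reference, a minimal sketch of the corrected expression being suggested here, reusing the input, mask, and m_alpha names from the excerpt above:

auto mask = input >= 0.0;
// ELU per the Keras docs: x for x >= 0, alpha * (exp(x) - 1) otherwise
return (mask * input) + (!mask * m_alpha * (exp(input) - 1));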

plavin (Contributor, Author) commented:
dangit...

pavanky merged commit 65efde8 into arrayfire:master on Jul 10, 2017
pavanky mentioned this pull request on Jul 10, 2017
pavanky modified the milestone: 0.1 on Jul 11, 2017