Add hard tanh layer #540
Comments
I'd like to take this up. Along with the
You can directly implement the
Okay, got it. I'll check it out and start right away. Thanks.
Hello, I was trying to understand the code in order to add this function, and was also reading bias_layer.hpp. On lines 156 and 157 I cannot understand (after extensive Googling) what data::createNVP is doing, or what an NVP is. Could you help me with that? It will help me understand the code better.
Hi there Vinayak. I hope this is helpful. Let me know if I can clarify anything.
The hard tanh function is sometimes preferred over the tanh function since it is computationally cheaper. It does, however, saturate for magnitudes of x greater than 1. The activation of the hard tanh is:

f(x) = -1   if x < -1
f(x) = x    if -1 <= x <= 1
f(x) = 1    if x > 1
It would be great if mlpack provided an implementation of a HardTanHLayer that allows the specification of the min and max values.