Added Neural Tanh Layer and corresponding unit test #3038
Conversation
{
    float64_t sum_j = 0;
    for (int32_t j=0; j<num_inputs; j++)
        sum_j += W(i,j)*W(i,j);
it's not the 80s anymore. please use appropriate SIMD operations to do these things... or just use eigen.
this is a vanilla implementation. please consider using our linalg library and OpenMP at the least.
@vigsterkr I was wondering why you guys didn't use the linalg library for the logistic layer, so I just stuck with the same implementation. I'll modify this one to use it now. Thanks.
namespace shogun
{
/** @brief Neural layer with linear neurons, with a [tanh activation
 * function](http://en.wikipedia.org/wiki/Hyperbolic_function). can be used as a
Can be used .... Capital C
Can you also put a reference to a paper here, where there is some evidence that this is useful? Thanks
Nice patch.
news?
A few things block this from being merged.
The tanh squasher is a very popular alternative to the sigmoid layer. It's widely used, although not as much as before thanks to ReLUs. It's definitely important for a complete neural nets package. :)