
Added Neural Tanh Layer and corresponding unit test #3038

Closed
wants to merge 1 commit

Conversation

arasuarun

The tanh squasher is a very popular alternative to the sigmoid layer. It's still widely used, although less than before thanks to ReLUs. It's definitely important for a complete neural nets package. :)
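For context, the only differences from the existing logistic (sigmoid) layer are the activation's output range and its derivative; a minimal sketch in plain C++ (illustrative, not the PR's code):

#include <cmath>

// Minimal sketch (not Shogun's implementation): the two squashers compared.
// tanh maps a pre-activation z to (-1, 1); the logistic sigmoid maps it to (0, 1).
inline double tanh_activation(double z)
{
    return std::tanh(z);
}

inline double logistic_activation(double z)
{
    return 1.0 / (1.0 + std::exp(-z));
}

// Derivatives needed during backpropagation:
//   d/dz tanh(z)     = 1 - tanh(z)^2
//   d/dz logistic(z) = logistic(z) * (1 - logistic(z))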

{
    // squared L2 norm of row i of the weight matrix W
    float64_t sum_j = 0;
    for (int32_t j=0; j<num_inputs; j++)
        sum_j += W(i,j)*W(i,j);
Member

it's not the 80s anymore. please use appropriate SIMD operations to do these things... or just use Eigen.
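A hedged sketch of that suggestion (illustrative, not the PR's code): the row-wise sum of squares can be left to Eigen, which vectorizes it internally. This assumes the weight matrix is available as, or can be mapped to, an Eigen::MatrixXd (Shogun's SGMatrix can be wrapped with Eigen::Map).

#include <Eigen/Dense>

// Sketch only: compute sum_j W(i,j) * W(i,j) for one row with Eigen.
// Eigen emits SIMD code for squaredNorm(), so no hand-written loop is needed.
double row_squared_norm(const Eigen::MatrixXd& W, int i)
{
    return W.row(i).squaredNorm();
}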

@vigsterkr
Member

this is a vanilla implementation. please consider using our linalg library and OpenMP at the least.
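A hedged sketch of the OpenMP part of that suggestion (illustrative, not the PR's code): the per-neuron loop is embarrassingly parallel, so each row's sum of squared weights can be computed on its own thread. The flat row-major layout here is an assumption made only to keep the example self-contained.

#include <omp.h>
#include <vector>

// Sketch only: compute every row's sum of squared weights in parallel.
// W is assumed to be a flat, row-major array of size num_neurons * num_inputs.
std::vector<double> row_squared_norms(const double* W, int num_neurons, int num_inputs)
{
    std::vector<double> norms(num_neurons);
    #pragma omp parallel for
    for (int i = 0; i < num_neurons; i++)
    {
        double sum_j = 0;
        for (int j = 0; j < num_inputs; j++)
            sum_j += W[i * num_inputs + j] * W[i * num_inputs + j];
        norms[i] = sum_j;   // rows are independent, so no synchronization is needed
    }
    return norms;
}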

@arasuarun
Author

@vigsterkr I was wondering why you guys didn't use the linalg library for the logistic layer so I just stuck with the same implementation. I'll modify this one to use it now. Thanks.

namespace shogun
{
/** @brief Neural layer with linear neurons, with a [tanh activation
* function](http://en.wikipedia.org/wiki/Hyperbolic_function). can be used as a
Member

Can be used .... Capital C

Can you also put a reference to a paper here, where there is some evidence that this is useful? Thanks

@karlnapf
Member

karlnapf commented Mar 5, 2016

Nice patch.
I agree with @vigsterkr: please use linalg, Eigen, and OpenMP to make this stuff fast.

@karlnapf
Member

news?

@arasuarun
Author

@karlnapf if you are okay with it, I think you can merge it now. When refactoring the nn package with linalg and OpenMP, I think it'd be better if all the modules were refactored at once. What do you think?
(this also concerns #3037)

@karlnapf
Member

A few things block this from being merged:

  • Travis is failing; your code does not compile.
  • Shogun's NN code should be refactored one module at a time, not all at once (small PRs!).
  • Put all the OpenMP stuff in already. Once something is merged, it is unlikely to be touched again, so we want to make it as good as we can in the first merge.

@vigsterkr vigsterkr closed this Feb 28, 2017