implement important activation layer types #19

Closed
3 tasks done
MichaelHirn opened this issue Nov 6, 2015 · 2 comments
Comments

@MichaelHirn
Member

The following activation layers have a high priority:

  • Sigmoid
  • ReLU

Other activation layers that would be interesting to implement:

  • TanH
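
For reference, the forward pass of each of these layers is a simple element-wise function. The snippet below is a minimal plain-Rust sketch of the math only; it is not Leaf's layer API and not the Collenchyma-NN kernels, just the functions the layers would compute.

```rust
/// Element-wise reference definitions of the requested activations.
/// This is a sketch of the math, not the actual layer implementation.

fn sigmoid(x: f32) -> f32 {
    // sigmoid(x) = 1 / (1 + e^-x), squashes input into (0, 1)
    1.0 / (1.0 + (-x).exp())
}

fn relu(x: f32) -> f32 {
    // relu(x) = max(0, x)
    x.max(0.0)
}

fn tanh_act(x: f32) -> f32 {
    // tanh(x), squashes input into (-1, 1)
    x.tanh()
}

fn main() {
    let inputs = [-2.0_f32, -0.5, 0.0, 0.5, 2.0];
    for &x in &inputs {
        println!(
            "x = {:5.2}  sigmoid = {:.4}  relu = {:.4}  tanh = {:.4}",
            x,
            sigmoid(x),
            relu(x),
            tanh_act(x)
        );
    }
}
```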
@hobofan
Member

hobofan commented Feb 22, 2016

With #49, the ReLU layer has been added.

@MichaelHirn
Member Author

ReLU, Sigmoid, and TanH are now supported on CPU (native Rust) and GPU (CUDA) via Collenchyma-NN. OpenCL support (GPU and CPU) should be tracked in the NN Plugin directly.

I therefore believe this issue can be closed. If necessary, I am happy to reopen it.
