
Implementing the softmax activation function #3

Closed

Kerollmops opened this issue Dec 21, 2021 · 1 comment

Comments

@Kerollmops

I saw that I will need the softmax activation function for my basic word2vec neural network. The problem is that softmax needs the whole list of output values, while the Activation trait only knows about the current output value. I suppose I must change the trait to pass the whole list of values somehow, but calling it once per output value would not be performant, since the exponential sum would be recomputed every time.
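For illustration, here is a minimal Rust sketch of the trait change described above. The `VecActivation` trait and its `activate` method are invented names for this sketch, not part of this library's API; the point is simply that a vector-level activation can compute the exponential sum once per output vector instead of once per value:

```rust
// Hypothetical vector-level activation trait: the activation sees the whole
// output slice at once, so shared work like the exponential sum is done a
// single time per forward pass.
trait VecActivation {
    fn activate(&self, outputs: &[f32]) -> Vec<f32>;
}

struct Softmax;

impl VecActivation for Softmax {
    fn activate(&self, outputs: &[f32]) -> Vec<f32> {
        // Subtract the maximum before exponentiating for numerical stability.
        let max = outputs.iter().cloned().fold(f32::NEG_INFINITY, f32::max);
        let exps: Vec<f32> = outputs.iter().map(|&x| (x - max).exp()).collect();
        // The exponential sum is computed once for the whole output vector.
        let sum: f32 = exps.iter().sum();
        exps.iter().map(|&e| e / sum).collect()
    }
}

fn main() {
    let probs = Softmax.activate(&[1.0, 2.0, 3.0]);
    println!("{:?}", probs); // approximately [0.090, 0.245, 0.665]
    assert!((probs.iter().sum::<f32>() - 1.0).abs() < 1e-6);
}
```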

c0dearm (Owner) commented May 24, 2022

Hey! Sorry for the late response.

I've just released a new version of the library which basically changes it completely, so this issue is probably not relevant anymore.

Adding new operations/layers should be easy peasy now.

c0dearm closed this as completed May 24, 2022