
softmax_backward missing #472

Open

christopherzimmerman opened this issue Oct 28, 2020 · 1 comment
@christopherzimmerman (Contributor)

For Tensors stored on a CPU, there seems to be no implementation of softmax_backward as referenced here

I looked through the source when that file was added and didn't find it in the code base then either. I'm not that familiar with Nim, so I wanted to make sure I'm not missing an implementation somewhere before I work on a PR.

@mratsim (Owner) commented Oct 28, 2020

Seems like I missed adding this.

Usually you need a fused softmax and negative log-likelihood, which is implemented as softmax cross-entropy here: https://github.com/mratsim/Arraymancer/blob/master/src/arraymancer/nn_primitives/nnp_softmax_cross_entropy.nim, so feel free to implement a standalone softmax.

You can use this test as an example of how to use a numerical gradient for testing: https://github.com/mratsim/Arraymancer/blob/master/tests/nn_primitives/test_nnp_loss.nim

Once the nn_primitives are implemented and tested, I don't require tests for the higher-level neural net layer for now.
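
For reference, a minimal sketch of what a CPU `softmax_backward` nn_primitive could look like, assuming the forward output is cached and using the identity dx = y * (dy - sum(dy * y)) along the feature axis. The proc name and signature here are hypothetical, not an existing Arraymancer API:

```nim
import arraymancer

proc softmax_backward*[T](cached_output, gradient: Tensor[T]): Tensor[T] =
  ## Hypothetical sketch, not the repo's API.
  ## With y = softmax(x) cached from the forward pass and dy the incoming
  ## gradient, the backward pass for a [batch, features] tensor is:
  ##   dx = y *. (dy -. sum(dy *. y, axis = 1))
  let dot = sum(gradient *. cached_output, axis = 1)  # shape [batch, 1]
  result = cached_output *. (gradient -. dot)         # broadcasts over features
```

The numerical-gradient test linked above could then be adapted to check this against a finite-difference approximation of softmax.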
