
Support for CuDNN Softmax forward and backward ops #51

Open
Henry-Chinner opened this issue Apr 16, 2016 · 0 comments
Implementing the softmax's forward function is straightforward in CudArray, but the backward pass suffers performance-wise because it requires multiple kernel launches. The resulting backprop formula also tends to be numerically unstable. Having the softmax's forward and backward passes available in the cudnn module would be a massive help for neural networks, where the softmax is used extensively, especially when the loss function attached to the softmax is not the cross-entropy loss (which avoids computing the Jacobian).
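For reference, a minimal NumPy sketch of what a fused softmax backward computes. The identity dx = y * (dy - sum(dy * y)) collapses the Jacobian-vector product into two elementwise passes and one reduction, which is roughly what a single fused kernel (such as cuDNN's softmax backward) evaluates; the function names here are illustrative, not part of CudArray's API:

```python
import numpy as np

def softmax_forward(x):
    # Numerically stable softmax along the last axis:
    # subtract the row-wise max before exponentiating.
    z = x - x.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def softmax_backward(y, dy):
    # Fused backward pass: dx = y * (dy - sum(dy * y)).
    # This avoids materialising the full Jacobian
    # diag(y) - y y^T and needs only one reduction.
    s = (dy * y).sum(axis=-1, keepdims=True)
    return y * (dy - s)

x = np.random.randn(4, 10)
y = softmax_forward(x)
dx = softmax_backward(y, np.random.randn(4, 10))
# Each row of dx sums to zero, since the softmax output
# is constrained to the simplex.
```

Doing the same with generic array primitives launches a separate kernel for each elementwise op and reduction, which is the overhead the issue describes.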
