
Add LogSoftmax and disable fp16 Softmax #438

Merged
merged 3 commits into master from feature/20190522-logsoftmax-and-fp32-softmax on May 22, 2019

Conversation

@TE-TakuyaNarihira (Contributor) commented May 22, 2019

The PR sony/nnabla-ext-cuda#155 adds a CUDA implementation.

This PR:

  • Adds a LogSoftmax operation, which computes the log of softmax in a numerically stable way (see the sketch below).
  • Disables fp16 Softmax for mixed-precision training.
  • Uses LogSoftmax in SoftmaxCrossEntropy for better numerical stability.

Depends on #437.
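For illustration, here is a minimal NumPy sketch of the standard max-subtraction trick that makes log-softmax numerically stable, and of how a cross-entropy loss can be built on top of it. This is a hand-written example under the usual formulation, not the actual nnabla implementation; the function names are hypothetical.

```python
import numpy as np

def log_softmax(x, axis=-1):
    # Subtract the per-row max before exponentiating so exp() cannot
    # overflow; log(softmax(x)) is then computed as one fused expression
    # instead of log() applied to an already-rounded softmax output.
    shifted = x - np.max(x, axis=axis, keepdims=True)
    return shifted - np.log(np.sum(np.exp(shifted), axis=axis, keepdims=True))

def softmax_cross_entropy(x, t):
    # Cross entropy over logits x of shape (N, C) and integer labels t
    # of shape (N,), built on log_softmax rather than log(softmax(x)).
    log_p = log_softmax(x, axis=-1)
    return -log_p[np.arange(len(t)), t]

# Example: large logits would overflow a naive exp(), but not here.
logits = np.array([[1000.0, 1001.0, 1002.0]])
print(log_softmax(logits))  # finite values, roughly [-2.41, -1.41, -0.41]
```

A naive log(softmax(x)) also underflows to log(0) = -inf whenever a logit sits far below the row maximum, which is why the fused formulation improves the stability of SoftmaxCrossEntropy, especially in reduced precision.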

@TE-TakuyaNarihira TE-TakuyaNarihira force-pushed the feature/20190522-logsoftmax-and-fp32-softmax branch from 4a3b718 to 9494597 May 22, 2019

@TE-AkioHayakawa TE-AkioHayakawa merged commit 18b9e1d into master May 22, 2019
