Softmax op #440
Conversation
It's still missing the methods needed to backpropagate it.
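For the backward pass, recall the standard softmax Jacobian: for s = softmax(x), ∂s_i/∂x_j = s_i(δ_ij − s_j), i.e. J = diag(s) − s·sᵀ. The s·sᵀ term is presumably what the Dot(sm, smT) product in the diffs below is building.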
op_softmax.go (Outdated)

sm.Reshape(inputTensor.Shape().TotalSize(), 1)

smT := sm.Clone().(tensor.Tensor)
smT.Transpose()
You can use smT.T(), which does no work, rather than Transpose, which does a lot of work.
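A minimal sketch of the difference, assuming the Dense API of gorgonia.org/tensor (exact method signatures may vary between versions):

package main

import (
	"fmt"

	"gorgonia.org/tensor"
)

func main() {
	a := tensor.New(tensor.WithShape(2, 3), tensor.WithBacking([]float64{1, 2, 3, 4, 5, 6}))

	// T() is lazy: it only permutes shape/stride metadata, so no data is copied.
	if err := a.T(); err != nil {
		panic(err)
	}
	fmt.Println(a.Shape()) // (3, 2): same backing array, new access pattern

	// Transpose() materializes the transpose set up by T(), physically
	// rearranging the backing data (the expensive path flagged above).
	a.Transpose()
	fmt.Println(a.Shape())
}

In the diff above, replacing smT.Transpose() with smT.T() avoids the data movement entirely.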
op_softmax.go
Outdated
smT := sm.Clone().(tensor.Tensor) | ||
smT.Transpose() | ||
|
||
smDot, err := tensor.Dot(sm, smT) |
Use MatMul. tensor.Dot is a bad habit.
And handle the errors properly.
And add debugging statements.
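A hedged sketch of how the suggested changes might fit together; the outer helper name and the error messages are illustrative, not from the PR:

package main

import (
	"fmt"
	"log"

	"gorgonia.org/tensor"
)

// outer is a hypothetical helper forming s·sᵀ for a column-vector softmax
// output, using tensor.MatMul with explicit error handling as the review asks.
func outer(sm tensor.Tensor) (tensor.Tensor, error) {
	smT := sm.Clone().(tensor.Tensor)
	// lazy transpose (see the earlier T() vs Transpose() comment)
	if err := smT.T(); err != nil {
		return nil, fmt.Errorf("transposing softmax output: %w", err)
	}
	smDot, err := tensor.MatMul(sm, smT)
	if err != nil {
		return nil, fmt.Errorf("MatMul(sm, smT): %w", err)
	}
	return smDot, nil
}

func main() {
	s := tensor.New(tensor.WithShape(3, 1), tensor.WithBacking([]float64{0.2, 0.3, 0.5}))
	out, err := outer(s)
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println(out) // a 3x3 matrix: s·sᵀ
}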
Co-authored-by: David Cuadrado <73729+dcu@users.noreply.github.com>
Updated softmax test to also check for values Co-authored-by: David Cuadrado <73729+dcu@users.noreply.github.com>
Co-authored-by: David Cuadrado <73729+dcu@users.noreply.github.com>
Phew. This LGTM. Once it passes tests, you can merge. Most excellent work @dcu