Calculate dot product 2 times in different directions #404
Comments
I think I found the cause of the bug. I will update this issue with a fix PR and/or further problems I encounter.
This was referenced May 31, 2020
This works in the latest build, after gorgonia/tensor#73 is fixed.
OK, closing. Thanks all.
Hey people. 👋

I'm trying to calculate the dot product of a matrix and a vector. I have to do this twice, in different directions (so the matrix is used as input for two different operations in the graph). Everything is batched, and differentiation is needed. I've tried different solutions using `BatchedMatMul` and `Broadcast` -> `HadamardProd` -> `Sum`, but I'm running into difficulties with each of them.

Here's an example using `BatchedMatMul`:

If I do the `BatchedMatMul`s separately, it works. I know Gorgonia does a lot of in-place magic, but I don't know how to get around it. In my own project this somehow works, but then the differentiation doesn't like adding the gradients of two different shapes.

I'll add code to calculate the gradients when I can get past this error.
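The Gorgonia snippet referenced above isn't shown here, but the underlying math — multiplying the same matrix with a vector "in two different directions" — can be sketched in plain Go. This is an illustrative sketch only (`dotRows` and `dotCols` are hypothetical helper names, not Gorgonia API): one product contracts along the matrix's columns, the other along its rows, which is why the two results (and their gradients) have different shapes.

```go
package main

import "fmt"

// dotRows multiplies an (r×c) matrix by a length-c vector:
// out[i] = Σ_j m[i][j] * v[j], giving a length-r result.
func dotRows(m [][]float64, v []float64) []float64 {
	out := make([]float64, len(m))
	for i, row := range m {
		for j, x := range row {
			out[i] += x * v[j]
		}
	}
	return out
}

// dotCols multiplies a length-r vector by an (r×c) matrix:
// out[j] = Σ_i v[i] * m[i][j], giving a length-c result.
func dotCols(v []float64, m [][]float64) []float64 {
	out := make([]float64, len(m[0]))
	for i, row := range m {
		for j, x := range row {
			out[j] += v[i] * x
		}
	}
	return out
}

func main() {
	m := [][]float64{{1, 2}, {3, 4}}
	fmt.Println(dotRows(m, []float64{1, 1})) // [3 7]
	fmt.Println(dotCols([]float64{1, 1}, m)) // [4 6]
}
```

For a non-square (r×c) matrix the two outputs have lengths r and c respectively, which matches the shape mismatch the differentiation complains about when it tries to accumulate gradients from both paths.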