[BUG] Reduce operations give partly wrong results with 3D tensors. #137
Comments
BTW, it works great with 1D and 2D tensors :)
Thanks for the report @ZJUGuoShuai. We recently moved most reductions to CUB, so I will check for a regression.
Hi @ZJUGuoShuai, it looks like the order isn't quite what we expect, and we should have errored. The way our reductions work is that they reduce over the innermost dimensions of the input, so the output tensor's rank must match the input's remaining outer dimensions. In other words,
By going from a rank-3 tensor to a rank-1 tensor, you are asking it to sum over the inner two dimensions, and the final output would have two values, each the sum of one of the two 3x2 matrices:
It looks like instead you want to just sum over 6 rows of 2 columns each, which would be:
I also had to make a small change, since we did have a regression when using CUB for row-wise reductions; that now takes a slower path and will be fixed next week. You should be able to get the correct answer with the latest code and the example above.
@cliffburdick Thanks, now I know how to use Reduce operations:
Describe the bug
I tried to use matx::sum on 3D tensors, which are common in deep learning (batch_size, sequence_len, embedding_size). The results are partly wrong. I don't know whether I'm misusing it or it's not supported yet.

To Reproduce
Steps to reproduce the behavior: create a tensor mat with shape {2, 3, 2}:

Expected behavior
The results should be:
System details (please complete the following information):