
Clarity of error message in einsum regressed in performance improvements #58380

Open
t-vi opened this issue May 17, 2021 · 2 comments
Labels
module: error checking — Bugs related to incorrect/lacking error checking
module: linear algebra — Issues related to specialized linear algebra operations in PyTorch; includes matrix multiply, matmul
triaged — This issue has been looked at by a team member, and triaged and prioritized into an appropriate module

Comments

@t-vi
Collaborator

t-vi commented May 17, 2021

🐛 Bug

From Sohrab Samimi on the forums:

```python
import torch

first = torch.rand(12, 8192, 2)
weights1 = torch.rand(12, 8192, 2)
torch.einsum('bix,iox->box', first, weights1)
```

gives

```
einsum() operands do not broadcast with remapped shapes [original->remapped]: [12, 8192, 2]->[12, 1, 2, 8192] [12, 8192, 2]->[1, 8192, 2, 12]
```

To which they say:

Can someone explain to me what I am doing wrong?

Clearly, the i-dimensions don't match (i is 8192 in the first operand but 12 in the second), but the error message reads rather opaquely to me.
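For context, the "remapped" shapes in the error are each operand's dimensions permuted into a common label order (here roughly b, o, x, i, with size-1 entries for labels an operand lacks), and broadcasting then fails on the i axis (8192 vs. 12). A minimal sketch of a shape combination that does satisfy `'bix,iox->box'` (the `4` for the o dimension is an arbitrary illustrative choice, not from the original report):

```python
import torch

first = torch.rand(12, 8192, 2)     # b=12, i=8192, x=2
weights1 = torch.rand(8192, 4, 2)   # i=8192 (now matches), o=4, x=2
out = torch.einsum('bix,iox->box', first, weights1)
# Result has shape (b, o, x) = (12, 4, 2): for each b, o, x we sum
# first[b, i, x] * weights1[i, o, x] over the shared index i.
print(out.shape)
```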

Expected behavior

In my opinion, it would be better to print the letter/dimension and term number where the size mismatch happens. I think the old version did this, at least sometimes...
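The kind of check being asked for can be sketched in a few lines of plain Python. This is a hypothetical illustration of reporting the mismatched subscript letter and operand index, not PyTorch's actual implementation (the function name and message format are made up):

```python
def check_einsum_shapes(equation, *shapes):
    """Hypothetical pre-check: report which subscript letter mismatches,
    and in which operand, instead of printing remapped shapes."""
    terms = equation.split('->')[0].split(',')
    sizes = {}  # subscript letter -> size seen so far
    for term_idx, (term, shape) in enumerate(zip(terms, shapes)):
        for letter, size in zip(term, shape):
            seen = sizes.get(letter)
            # Sizes must agree unless one of them is 1 (broadcastable).
            if seen is not None and seen != size and 1 not in (seen, size):
                raise ValueError(
                    f"einsum(): subscript '{letter}' has size {size} in "
                    f"operand {term_idx} but size {seen} in an earlier operand"
                )
            if seen is None or seen == 1:
                sizes[letter] = size

# The example from this issue now names the offending subscript:
# check_einsum_shapes('bix,iox->box', (12, 8192, 2), (12, 8192, 2))
# -> ValueError: einsum(): subscript 'i' has size 12 in operand 1
#    but size 8192 in an earlier operand
```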


cc @jianyuh @nikitaved @pearu @mruberry @heitorschueroff @walterddr @IvanYashchuk @xwang233 @lezcano

@heitorschueroff
Contributor

@t-vi I agree that this error message can be improved. I went with an error message similar to NumPy's, but looking back it's not easy to understand what is going on. I'll work on an improvement as part of the changes to einsum I'm doing. Thanks!

@heitorschueroff heitorschueroff self-assigned this May 17, 2021
@heitorschueroff heitorschueroff added labels: module: error checking (Bugs related to incorrect/lacking error checking), module: linear algebra (Issues related to specialized linear algebra operations in PyTorch; includes matrix multiply, matmul), triaged (This issue has been looked at by a team member, and triaged and prioritized into an appropriate module) — May 17, 2021
@t-vi
Collaborator Author

t-vi commented May 17, 2021

@heitorschueroff Thanks for looking at this!

I really love your perf work on einsum, too!
