[NestedTensor] Call contiguous in linear backward #94317
Conversation
🔗 Helpful Links: 🧪 See artifacts and rendered test results at hud.pytorch.org/pr/94317
✅ No failures as of commit f883b80. This comment was automatically generated by Dr. CI and updates every 15 minutes.
nt = torch.nested.as_nested_tensor([a, b, c])
# This implicitly tests to_padded_tensor grads
d = torch.functional.F.linear(nt, weight, bias)
d = d.transpose(-1, -2).contiguous()
why are we calling contiguous here? :b
can't call to_padded_tensor without it being contiguous, but the upward grad for linear will have been transposed, so that will exercise the new code
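
For context, here is a self-contained version of the snippet under discussion. This is a minimal sketch: the shapes, dtype, and the sum() loss are illustrative assumptions (the actual test wraps this logic in gradcheck).

import torch
import torch.nn.functional as F

# Assumed shapes; each constituent is (seq_len_i, in_features).
a = torch.randn(2, 6, dtype=torch.float64, requires_grad=True)
b = torch.randn(3, 6, dtype=torch.float64, requires_grad=True)
c = torch.randn(4, 6, dtype=torch.float64, requires_grad=True)
weight = torch.randn(5, 6, dtype=torch.float64, requires_grad=True)
bias = torch.randn(5, dtype=torch.float64, requires_grad=True)

nt = torch.nested.as_nested_tensor([a, b, c])
d = F.linear(nt, weight, bias)
d = d.transpose(-1, -2).contiguous()
out = torch.nested.to_padded_tensor(d, 0.0)

# In backward, the transpose makes the incoming grad non-contiguous
# before it reaches linear_backward, exercising the new contiguous() call.
out.sum().backward()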
@pytorchbot merge
Merge started. Your change will be merged once all checks pass (ETA 0-4 hours). Learn more about merging in the wiki. Questions? Feedback? Please reach out to the PyTorch DevX Team.
Fixes #94303
If the upward grad for linear_backward was discontiguous, we would throw a TORCH_CHECK error. This updates the implementation to instead call contiguous() on the grad and changes the check to an internal assert.
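
As a rough illustration of the change at the Python level (the actual fix lives in the C++ nested-tensor backward; the shapes here are assumptions, not taken from the PR):

import torch

nt = torch.nested.nested_tensor([torch.randn(2, 5), torch.randn(3, 5)])
grad = nt.transpose(-1, -2)   # a transposed nested tensor is non-contiguous
assert not grad.is_contiguous()

# Previously linear_backward rejected such a grad via TORCH_CHECK;
# now it normalizes it first, roughly:
grad = grad.contiguous()
assert grad.is_contiguous()   # the check is now an internal assert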
cc @cpuhrsch @jbschlosser @bhosmer @mikaylagawarecki