Make torch.svd return V, not V.conj() for complex inputs #51012
Conversation
This link points to the current documentation.
Continuing to use at::svd to implement pinverse (for now) sounds good.
Just so I'm confident I understand what's going on here:
Is that correct? Follow-up question: how long has torch.svd been returning V and not V^H for complex inputs?
LAPACK and MAGMA return V.transpose() for real inputs and V.conj().transpose() for complex inputs.
torch.svd has been returning V.conj(), not V, since the 3rd of October; this was introduced in PR #45795.
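The distinction being discussed can be illustrated with a small sketch. This uses NumPy (whose `np.linalg.svd` returns V^H directly, matching the LAPACK convention) rather than torch, since torch's return convention is exactly what this PR changes:

```python
import numpy as np

# Illustrative sketch of V vs. V^H for a complex matrix.
rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))

U, S, Vh = np.linalg.svd(A)   # LAPACK-style: third output is V^H = V.conj().T
V = Vh.conj().T               # recover V itself

# Reconstruction must use V^H, i.e. V.conj().T:
assert np.allclose(A, U @ np.diag(S) @ Vh)
assert np.allclose(A, U @ np.diag(S) @ V.conj().T)
# Using only the transpose of V (no conjugate) does NOT reconstruct A:
assert not np.allclose(A, U @ np.diag(S) @ V.T)
```

For real inputs the conjugate is a no-op, which is why the bug only surfaced for complex inputs.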
Got it, thank you for clarifying. This all makes sense now. Looks like this support is part of PyTorch 1.7.1, so this will be a BC-breaking change. Worse, it's a silent correctness issue. I expanded the PR summary with a BC-breaking note reflecting the current state of this PR (please correct me if I'm still missing something). It would be safer if we could disallow complex inputs to torch.svd, even if internally we rely on at::svd with complex inputs. I can't think of any reasonable way to do that, however. Luckily torch.svd is being deprecated and this functionality was never documented, so it may be OK to change this. |
Thanks for the cleanup @IvanYashchuk !
The PR looks good to me.
@albanD has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.
Codecov Report
@@ Coverage Diff @@
## master #51012 +/- ##
==========================================
+ Coverage 80.70% 80.91% +0.21%
==========================================
Files 1926 1926
Lines 210012 210020 +8
==========================================
+ Hits 169485 169933 +448
+ Misses 40527 40087 -440
BC-breaking note:
torch.svd() added support for complex inputs in PyTorch 1.7, but was not documented as doing so. The complex "V" tensor returned was actually the complex conjugate of what's expected. This PR fixes the discrepancy.
Note that this means this PR silently breaks all current users of torch.svd() with complex inputs.
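As a hedged sketch of how the break manifests (again using NumPy as a stand-in for torch, since `np.linalg.svd` returns V^H while `torch.svd` returns V): user code written against the old behavior, which compensated by conjugating the returned tensor, will now conjugate the correct V and produce wrong results.

```python
import numpy as np

# Hypothetical illustration of the BC break, not actual torch code.
rng = np.random.default_rng(1)
A = rng.standard_normal((2, 2)) + 1j * rng.standard_normal((2, 2))
U, S, Vh = np.linalg.svd(A)

V_correct = Vh.conj().T   # what torch.svd returns after this PR
V_buggy = Vh.T            # what it effectively returned before, i.e. V.conj()

# A workaround that "fixed" the old output by conjugating it recovers V_correct
# from V_buggy -- but applied to the new, correct output it reintroduces the bug:
assert np.allclose(V_buggy.conj(), V_correct)
assert not np.allclose(V_correct.conj(), V_correct)
```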
Original PR Summary:
This PR resolves #45821.
The problem was that, when introducing support for complex inputs to `torch.svd`, it was overlooked that LAPACK/MAGMA return the conjugate transpose of the V matrix, not just the transpose of V. So `torch.svd` was silently returning U, S, V.conj() instead of U, S, V.

The behavior of `torch.linalg.pinv`, `torch.pinverse`, and `torch.linalg.svd` (which depend on `torch.svd`) is not changed in this PR.
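To see why pinverse-style functions depend on the SVD routine's convention, here is a minimal sketch of a Moore-Penrose pseudoinverse built on SVD (an assumed illustrative helper, not PyTorch's actual implementation), using NumPy's LAPACK-style `svd`:

```python
import numpy as np

def pinv_via_svd(A, rcond=1e-15):
    """Moore-Penrose pseudoinverse via SVD: A+ = V @ diag(1/S) @ U^H."""
    U, S, Vh = np.linalg.svd(A, full_matrices=False)
    cutoff = rcond * S.max()
    S_inv = np.where(S > cutoff, 1.0 / S, 0.0)
    # Note the conjugate transposes below: using plain .T for a complex A
    # would reproduce exactly the kind of conjugation bug this PR fixes.
    return Vh.conj().T @ np.diag(S_inv) @ U.conj().T

A = np.array([[1 + 1j, 2], [0, 1 - 1j], [3, 0]])
P = pinv_via_svd(A)
assert np.allclose(P, np.linalg.pinv(A))
```

A routine like this only produces the correct result if it knows whether the underlying SVD hands back V or V^H, which is why the conjugation fix had to be coordinated with the callers listed above.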