
Updated derivative rules for complex QR decomposition #48489

Closed

Conversation

IvanYashchuk (Collaborator)

Updated qr_backward to work correctly for complex-valued inputs.
Added torch.qr to the list of complex tests.

The previous implementation of real-valued differentiation used equation 42 from https://arxiv.org/abs/1001.1654.
The new implementation is a bit simpler; for real-valued inputs it gives the same result, and all existing tests still pass.
The derivation of the complex-valued QR differentiation is given at https://giggleliu.github.io/2019/04/02/einsumbp.html; a short sketch of the resulting formula is included below.

Ref. #33152
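
For reference, here is a minimal PyTorch sketch of the copyltu-style formula from the linked derivation, for the reduced QR of a full-rank m x n input with m >= n. The function name qr_backward_sketch and the layout are illustrative only; this is not the code added to FunctionsManual.cpp in this PR.

```python
import torch

def qr_backward_sketch(Q, R, gQ, gR):
    # Backward of A = Q R (reduced QR, A is m x n with m >= n and full rank),
    # given cotangents gQ = dL/dQ and gR = dL/dR.  Illustrative sketch only.
    # M = R gR^H - gQ^H Q
    M = R @ gR.mH - gQ.mH @ Q
    # "copyltu": keep the strict lower triangle of M, mirror its conjugate
    # transpose into the upper triangle, and take the real part on the diagonal.
    diag_re = torch.diag_embed(M.diagonal(dim1=-2, dim2=-1).real.to(M.dtype))
    H = M.tril(-1) + M.tril(-1).mH + diag_re
    # gA = (gQ + Q H) R^{-H}; a triangular solve would be preferable numerically,
    # but an explicit inverse keeps the sketch short.
    return (gQ + Q @ H) @ torch.linalg.inv(R).mH
```

A quick way to sanity-check such a sketch is to compare its output against torch.autograd.grad applied to the QR decomposition with the same cotangents on a random complex128 input.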

IvanYashchuk added the module: numpy, module: linear algebra, and complex_autograd labels on Nov 26, 2020
IvanYashchuk added the module: complex label on Nov 26, 2020
IvanYashchuk removed the request for review from apaszke on Nov 26, 2020, 10:51
dr-ci bot commented on Nov 26, 2020

💊 CI failures summary and remediations

As of commit 630991a (more details on the Dr. CI page):


  • 2/2 failures possibly introduced in this PR
    • 1/2 non-CircleCI failure(s)

1 failure not recognized by patterns:

Job: CircleCI pytorch_linux_xenial_py3_clang5_mobile_build
Step: Set Up CI Environment After attach_workspace
Action: 🔁 rerun

mruberry added the triaged label on Nov 27, 2020
albanD (Collaborator) left a comment

Thanks for looking into this and updating the formula!

There were also discussions about the numerical stability of the current implementation, as well as its limitations for degenerate matrices. Does this new algorithm have the same limitations?

Inline review comments on torch/csrc/autograd/FunctionsManual.cpp (four threads, all resolved; two marked outdated)
IvanYashchuk (Collaborator, Author)

There were also discussions about the numerical stability of the current implementation, as well as its limitations for degenerate matrices. Does this new algorithm have the same limitations?

Yes, I think it carries the same set of limitations as the previous implementation because the new one still doesn't use pivoting or any other techniques for improving numerical stability.
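
To make the limitation concrete: the backward formula has to solve against R^H, so a rank-deficient (degenerate) input yields a singular R and no usable gradient, and an ill-conditioned input amplifies rounding error. For well-conditioned inputs the complex formula can be checked numerically with gradcheck; the snippet below is an illustrative sketch in that spirit, not the actual test added in this PR.

```python
import torch

# Illustrative check (not the test added in this PR): for a well-conditioned
# complex input in double precision, gradcheck compares the analytic QR
# backward against finite differences.  torch.linalg.qr is used here; at the
# time of this PR the tested entry point was torch.qr.
A = torch.randn(4, 3, dtype=torch.complex128, requires_grad=True)
print(torch.autograd.gradcheck(torch.linalg.qr, (A,)))  # expected: True for a full-rank, well-conditioned A

# A rank-deficient input makes R singular; the backward has to solve against
# R^H, so the gradient is not well defined there -- the same limitation as
# before this PR.
```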

codecov bot commented on Dec 2, 2020

Codecov Report

Merging #48489 (02ef819) into master (c7b8f3e) will increase coverage by 55.81%.
The diff coverage is 100.00%.

@@             Coverage Diff             @@
##           master   #48489       +/-   ##
===========================================
+ Coverage   24.94%   80.76%   +55.81%     
===========================================
  Files         467     1867     +1400     
  Lines       58746   201579   +142833     
===========================================
+ Hits        14656   162809   +148153     
+ Misses      44090    38770     -5320     

albanD (Collaborator) left a comment

LGTM thanks!

facebook-github-bot (Contributor) left a comment

@albanD has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.

albanD (Collaborator) commented on Dec 3, 2020

@IvanYashchuk this has a merge conflict with master because the other PR landed first and modified the exact same lines.
Could you rebase on top of master, please?

IvanYashchuk (Collaborator, Author)

@albanD, done.

albanD (Collaborator) commented on Dec 3, 2020

Thanks for the quick rebase! The lint is not happy with the extra space in the test file, though.

IvanYashchuk (Collaborator, Author)

Oh, should be fixed now.

facebook-github-bot (Contributor) left a comment

@albanD has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.

albanD (Collaborator) commented on Dec 3, 2020

Thanks!

IvanYashchuk (Collaborator, Author)

Resolved another merge conflict.

facebook-github-bot (Contributor) left a comment

@albanD has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.

facebook-github-bot (Contributor) left a comment

@albanD has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.

albanD (Collaborator) commented on Dec 10, 2020

@IvanYashchuk sorry about this one, it keeps failing to land and then running into merge conflicts...

Labels
cla signed, complex_autograd, Merged, module: complex, module: linear algebra, module: numpy, open source, triaged
5 participants