Modify symmetric eigendecomposition derivative #23018


Closed
wants to merge 9 commits into from

Conversation


@vishwakftw vishwakftw commented Jul 18, 2019

The derivative of the symmetric eigendecomposition (symeig) was previously returned as a triangular matrix, even though the gradient with respect to a symmetric input should itself be symmetric.

Changelog:

  • Modify the derivative of symeig from a triangular matrix to a symmetric matrix, with the reason documented in a comment.

Test Plan:

  • Existing gradcheck and gradgradcheck tests are ported to test_autograd to verify that the change is correct. The input to symeig is symmetrized before being passed in.
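
For context (an illustration assumed here, not code from this PR): the gradient of a spectral function of a symmetric matrix is a full symmetric matrix, so a triangular derivative disagrees with numerical differentiation over symmetric perturbations, which is what gradcheck effectively probes. A minimal NumPy sketch using f(A) = λ_max(A), whose analytic gradient is v vᵀ for the unit eigenvector v of the largest (assumed simple) eigenvalue:

```python
import numpy as np

rng = np.random.default_rng(0)
B = rng.standard_normal((4, 4))
A = (B + B.T) / 2                  # symmetrize the input, as in the test plan

w, V = np.linalg.eigh(A)           # eigenvalues in ascending order
v = V[:, -1]                       # eigenvector of the largest eigenvalue
grad = np.outer(v, v)              # analytic gradient: symmetric, not triangular

# Central finite differences over symmetric perturbations should agree.
eps = 1e-6
num = np.zeros_like(A)
for i in range(4):
    for j in range(4):
        E = np.zeros_like(A)
        E[i, j] = E[j, i] = eps    # keep the perturbation symmetric
        fp = np.linalg.eigvalsh(A + E)[-1]
        fm = np.linalg.eigvalsh(A - E)[-1]
        # off-diagonal entries are perturbed twice, hence the larger divisor
        num[i, j] = (fp - fm) / ((4 if i != j else 2) * eps)

assert np.allclose(num, grad, atol=1e-5)
```

The numerical gradient is symmetric entry-for-entry, so only a symmetric analytic gradient can match it everywhere.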

@pytorchbot pytorchbot added the labels module: autograd (Related to torch.autograd, and the autograd engine in general), module: internals (Related to internal abstractions in c10 and ATen), and module: tests (Issues related to tests, not the torch.testing module) on Jul 18, 2019
@vishwakftw vishwakftw requested a review from fmassa July 18, 2019 14:18
@vishwakftw

cc: @calincru

@jeffreyksmithjr

I get what was done, @vishwakftw, but I think some folks might benefit from even a single sentence about why it was done. Could you clarify?

@vishwakftw

@jeffreyksmithjr I have added a comment in Functions.cpp about this change. Hope it clarifies the reason for the change.
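
As a rough illustration of what such a backward pass can look like (a NumPy sketch under the standard eigendecomposition-gradient formula, not the actual Functions.cpp code; the sign convention for the off-diagonal factor varies across references, and names here are my own):

```python
import numpy as np

def symeig_backward(A, grad_w, grad_V):
    """Sketch of a symeig backward that returns a symmetric gradient.

    A: symmetric matrix with distinct eigenvalues (assumed)
    grad_w: gradient of the loss w.r.t. the eigenvalues
    grad_V: gradient of the loss w.r.t. the eigenvector matrix
    """
    w, V = np.linalg.eigh(A)
    n = len(w)
    # F[i, j] = 1 / (w[j] - w[i]) off the diagonal, 0 on the diagonal
    diff = w[None, :] - w[:, None]
    off = ~np.eye(n, dtype=bool)
    F = np.zeros((n, n))
    F[off] = 1.0 / diff[off]
    inner = np.diag(grad_w) + F * (V.T @ grad_V)
    G = V @ inner @ V.T
    # The change in this PR, conceptually: symmetrize the result
    # instead of keeping only one triangle.
    return (G + G.T) / 2

# Sanity check: with grad_w = ones and grad_V = 0, the loss is
# sum of eigenvalues = tr(A), whose gradient is the identity.
rng = np.random.default_rng(1)
B = rng.standard_normal((4, 4))
A = (B + B.T) / 2
G = symeig_backward(A, np.ones(4), np.zeros((4, 4)))
assert np.allclose(G, np.eye(4))
```

The trace sanity check makes the point concisely: the correct gradient (the identity) is symmetric, and a triangular result could not match it.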

@calincru

Thanks for solving this, @vishwakftw!

@zhangguanheng66 zhangguanheng66 added the triaged label (This issue has been looked at by a team member, and triaged and prioritized into an appropriate module) on Jul 18, 2019
@vishwakftw

@pytorchbot rebase this please

@vishwakftw

@pytorchbot rebase this please

@vishwakftw

@pytorchbot rebase this please

@vishwakftw

@pytorchbot rebase this please

@vishwakftw

@fmassa this is ready for review.

@vishwakftw

@pytorchbot rebase this please

@vishwakftw

@pytorchbot rebase this please

@ezyang ezyang removed the triaged label (This issue has been looked at by a team member, and triaged and prioritized into an appropriate module) on Aug 16, 2019
@ezyang ezyang removed the request for review from fmassa August 16, 2019 14:39

ezyang commented Aug 16, 2019

@fmassa is on vacation, so I'm unsubscribing him from this issue and kicking this back into triage.

@vishwakftw

Thanks!

@vishwakftw

@pytorchbot rebase this please


@facebook-github-bot facebook-github-bot left a comment

@ezyang has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.

@facebook-github-bot

@ezyang merged this pull request in 9228dd7.
