Dispatch the auxiliary frobenius_norm and nuclear_norm to better implementations and deprecate them #81763
Conversation
Dispatch the auxiliary frobenius_norm and nuclear_norm to better implementations and deprecate them

These functions will become legacy functions. We deprecate them, but we also take this chance to dispatch to a more efficient and consistent implementation. Doing so should help write a conversion rule for these so we can remove them once and for all.

[ghstack-poisoned]
❌ 30 new failures as of commit 3df05c6 (more details on the Dr. CI page).
🕵️ 30 new failures recognized by patterns. The following CI failures do not appear to be due to upstream breakages.
Commit updated (ghstack-source-id: d269c3d). Pull Request resolved: #81763
Regarding BC compatibility: we ran the same test as described in #81761 (comment).
Commit updated (ghstack-source-id: c4c9f3f). Pull Request resolved: #81763
Commit updated (ghstack-source-id: 13dfc6d). Pull Request resolved: #81763
@pytorchbot rebase
@pytorchbot successfully started a rebase job. Check the current status here.
Rebase failed due to:
Raised by https://github.com/pytorch/pytorch/actions/runs/3914524063
@lezcano Could you rebase?
"Dispatch the auxiliary frobenius_norm and nuclear_norm to better implementations and deprecate them"

These functions will become legacy functions. We deprecate them, but we also take this chance to dispatch to a more efficient and consistent implementation. Doing so should help write a conversion rule for these so we can remove them once and for all.

cc @jianyuh @nikitaved @pearu @mruberry @walterddr @IvanYashchuk @xwang233 @lezcano @ezyang @ngimel @peterbell10

Differential Revision: [D42354776](https://our.internmc.facebook.com/intern/diff/D42354776)

[ghstack-poisoned]
Commit updated (ghstack-source-id: f3533f9). Pull Request resolved: #81763
@kit1980 done!
@kit1980 has imported this pull request. If you are a Meta employee, you can view this diff on Phabricator.
@pytorchbot merge (Initiating merge automatically since Phabricator Diff has merged)
Merge started. Your change will be merged once all checks pass (ETA 0-4 hours). Learn more about merging in the wiki. Questions? Feedback? Please reach out to the PyTorch DevX Team.
@pytorchbot revert -m 'This breaks XLA test https://hud.pytorch.org/pytorch/pytorch/commit/34e8eb229db76f7f5eb8f18c062dbd1ee47f8b12' -c landrace
@pytorchbot successfully started a revert job. Check the current status here.
Can't revert a PR that was landed via Phabricator as D42354776. Please revert by going to the internal diff and clicking Unland.
To be reverted once the issue is mitigated https://hud.pytorch.org/failure/%5B%20%20FAILED%20%20%5D%20AtenXlaTensorTest.TestFrobeniusNormInDims Caused by #81763 Pull Request resolved: #92634 Approved by: https://github.com/ZainRizvi
Here is the XLA test that needs to be updated: https://github.com/pytorch/xla/blob/master/test/cpp/test_aten_xla_tensor.cpp#L1887
Or we can just delete the test, as pytorch/xla#3976 did.
Uh, no, we should fix that. Odd that that happened; CI was green...
Right, this is an issue with some new symint stuff. I'll look into that. |
@lezcano don't reopen this PR, we didn't actually revert this. |
Sure thing, will do that on Monday. |
See pytorch/pytorch#81763 (comment) and pytorch/pytorch#81763 (comment) Now FrobeniusNorm is just left for backwards compatibility, and it dispatches to `at::norm`, which dispatches to `xla::norm`. As such, we don't need the `sqrt` counters as they will not be updated. We remove these checks as those from `TestNorm*` do not have them either.
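For context, the reason a dedicated Frobenius-norm kernel (and its `sqrt` counters) becomes redundant once the call dispatches to a generic norm implementation can be sketched numerically. This is an illustrative NumPy sketch of the mathematical identity, not PyTorch's or XLA's actual code:

```python
import numpy as np

# The Frobenius norm is the square root of the sum of squared entries,
# i.e. the vector 2-norm of the flattened matrix. A generic norm
# implementation therefore covers it without a dedicated sqrt path.
A = np.array([[3.0, 0.0], [4.0, 0.0]])

fro = np.linalg.norm(A, ord="fro")   # dedicated Frobenius entry point
generic = np.sqrt((A * A).sum())     # generic sum-of-squares path

assert np.isclose(fro, generic)
assert np.isclose(fro, 5.0)          # sqrt(9 + 16) = 5
```

Since both paths compute the same quantity, counters keyed to the old dedicated kernel will simply stop incrementing after the dispatch change.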
Stack from ghstack (oldest at bottom):
These functions will become legacy functions. We deprecate them, but we also
take this chance to dispatch to a more efficient and consistent implementation.
Doing so should help write a conversion rule for these so we can
remove them once and for all.
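The deprecate-then-dispatch pattern described above keeps the legacy entry point behaviorally identical to the new path while warning callers. A minimal sketch of that pattern in plain Python; the names `frobenius_norm_legacy` and `matrix_norm_impl` are hypothetical, not PyTorch's actual internals:

```python
import math
import warnings

def matrix_norm_impl(rows):
    # Stand-in for the efficient, consistent implementation
    # (e.g. a generic norm kernel).
    return math.sqrt(sum(x * x for row in rows for x in row))

def frobenius_norm_legacy(rows):
    # Legacy entry point: warn, then dispatch to the new path so both
    # routes produce identical results until the old one is removed.
    warnings.warn(
        "frobenius_norm is deprecated; use matrix_norm instead",
        DeprecationWarning,
        stacklevel=2,
    )
    return matrix_norm_impl(rows)

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    result = frobenius_norm_legacy([[3.0, 0.0], [4.0, 0.0]])

assert result == 5.0
assert any(issubclass(w.category, DeprecationWarning) for w in caught)
```

Because the legacy function is now a thin forwarding shim, a conversion rule only needs to rewrite the call site, not replicate any bespoke numerics.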
cc @jianyuh @nikitaved @pearu @mruberry @walterddr @IvanYashchuk @xwang233 @lezcano @ezyang @ngimel @peterbell10
Differential Revision: D42354776