[MTIA ATen Backend] Add dispatch key for div.out #156949
Conversation
Migrate div.out in tree

Differential Revision: [D77063371](https://our.internmc.facebook.com/intern/diff/D77063371/)

[ghstack-poisoned]
🔗 Helpful Links
🧪 See artifacts and rendered test results at hud.pytorch.org/pr/156949
Note: Links to docs will display an error until the docs builds have been completed.
✅ You can merge normally! (1 Unrelated Failure)
As of commit b90a9a7 with merge base 1e7e21e.
UNSTABLE - The following job is marked as unstable, possibly due to flakiness on trunk:
This comment was automatically generated by Dr. CI and updates every 15 minutes.
This pull request was exported from Phabricator. Differential Revision: D77063371
Attention! native_functions.yaml was changed. If you are adding a new function or defaulted argument to native_functions.yaml, you cannot use it from pre-existing Python frontend code until our FC window passes (two weeks). Split your PR into two PRs: one that adds the new C++ functionality and one that uses it from Python, and land them two weeks apart. See https://github.com/pytorch/pytorch/wiki/PyTorch's-Python-Frontend-Backward-and-Forward-Compatibility-Policy#forwards-compatibility-fc for more info. Caused by:
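As a loose illustration of that FC window (not part of this PR, and using a hypothetical operator name `my_new_op`), newer Python frontend code can guard on an operator's availability so binaries built before the native_functions.yaml change still work:

```python
import torch

# Hypothetical sketch of the FC guidance above. `my_new_op` is a made-up
# operator name, not something added by this PR. Guarding on its presence
# lets the same Python code run against binaries built before the
# native_functions.yaml change (i.e. inside the two-week FC window).
x = torch.ones(3)
if hasattr(torch.ops.aten, "my_new_op"):
    result = torch.ops.aten.my_new_op(x)
else:
    # Fallback for older binaries that do not yet ship the new operator.
    result = x.clone()
```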
@pytorchbot merge (Initiating merge automatically since Phabricator Diff has merged)
Merge started. Your change will be merged once all checks pass (ETA 0-4 Hours). Learn more about merging in the wiki. Questions? Feedback? Please reach out to the PyTorch DevX Team.
Starting merge as part of PR stack under #156950
Starting merge as part of PR stack under #156951
Starting merge as part of PR stack under #156952
…t.out (#156950)
Migrate fmod / abs.out / logical_not.out
Differential Revision: [D77220217](https://our.internmc.facebook.com/intern/diff/D77220217/)
Pull Request resolved: #156950
Approved by: https://github.com/malfet
ghstack dependencies: #156944, #156945, #156946, #156947, #156948, #156949

… sub.out (#156951)
Migrate rsub.Tensor / rsub.Scalar / sub.out
Differential Revision: [D77015033](https://our.internmc.facebook.com/intern/diff/D77015033/)
Pull Request resolved: #156951
Approved by: https://github.com/malfet
ghstack dependencies: #156944, #156945, #156946, #156947, #156948, #156949, #156950

Migrate add.out
Differential Revision: [D77352482](https://our.internmc.facebook.com/intern/diff/D77352482/)
Pull Request resolved: #156952
Approved by: https://github.com/malfet, https://github.com/huydhn
ghstack dependencies: #156944, #156945, #156946, #156947, #156948, #156949, #156950, #156951
Stack from ghstack (oldest at bottom):
Migrate div.out
Differential Revision: D77063371
cc @egienvalue
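For orientation only (not taken from this PR): `div.out` is the out-variant of division, i.e. the ATen overload that `torch.div` dispatches to when an `out=` tensor is supplied. A minimal sketch, run on CPU tensors so it works without MTIA hardware; with the new dispatch key, the same call on MTIA tensors routes to the MTIA kernel:

```python
import torch

# Illustrative sketch only: torch.div with an `out=` tensor dispatches to
# aten::div.out for the tensors' backend. CPU tensors are used here so the
# snippet runs anywhere; MTIA tensors would now hit the MTIA kernel.
a = torch.tensor([2.0, 4.0, 6.0])
b = torch.tensor([2.0, 2.0, 3.0])
out = torch.empty_like(a)
torch.div(a, b, out=out)
print(out)  # tensor([1., 2., 2.])
```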