Register Triton scaled dot production in aten (#82509) #82758
Conversation
Dr. CI: ✅ No failures (4 pending) as of commit 8f89d58.
This pull request was exported from Phabricator. Differential Revision: D37897092
Looks good! Let's get Triton in. We really should refactor most of the BetterTransformer MHA though so we don't have to copy all over the place. But that's something for another day.
This pull request was exported from Phabricator. Differential Revision: D37897092
Force-pushed from c708751 to 04f370d.
@pytorchbot merge -g
This pull request was exported from Phabricator. Differential Revision: D37897092
Force-pushed from 04f370d to e63f0dd.
@pytorchbot successfully started a merge job. Check the current status here.
Merge failed due to: Refusing to merge as mandatory check(s) pull failed for rule superuser.
@pytorchbot merge
@pytorchbot successfully started a merge job. Check the current status here.
Merge failed due to: Refusing to merge as mandatory check(s) pull failed for rule superuser.
@zrphercule has imported this pull request. If you are a Meta employee, you can view this diff on Phabricator.
Register Triton scaled dot production in aten (pytorch#82509)

Summary: This diff creates an aten Triton path along with a dummy aten scaled dot attention operator.

Pull Request resolved: pytorch#82509
Pull Request resolved: pytorch#82758

Test Plan: CI. Since the operator is a dummy (not to be registered in Python), there is no way to test its correctness in this diff.

Reviewed By: erichan1, ngimel
Pulled By: zrphercule
Differential Revision: D37897092
fbshipit-source-id: 037c98afdc8a67364cf077d983e23b159841c234
This pull request was exported from Phabricator. Differential Revision: D37897092
Force-pushed from e63f0dd to 8f89d58.
@pytorchbot merge
@pytorchbot successfully started a merge job. Check the current status here.
Summary: This diff creates an aten Triton path along with a dummy aten scaled dot attention operator.

Pull Request resolved: #82509, #82758
Approved by: https://github.com/erichan1

Test Plan: contbuild & OSS CI, see https://hud.pytorch.org/commit/pytorch/pytorch/dd838cee0df98836bc14a9d7976c097f12f8d438
Test plan from GitHub: CI. Since the operator is a dummy (not to be registered in Python), there is no way to test its correctness in this diff.

Reviewed By: kit1980, erichan1, ngimel
Differential Revision: D37897092
Pulled By: zrphercule
fbshipit-source-id: 7fb2d4f58079d4db4dc4dc2ef7a2e6feb1871d8e
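For context on what the registered operator is meant to compute: scaled dot-product attention is softmax(QK^T / sqrt(d)) V. The snippet below is a minimal PyTorch reference sketch of that computation for illustration only; the function name, signature, and tensor shapes are assumptions and do not correspond to the actual Triton kernel or the dummy aten operator added in this diff.

```python
# Reference scaled dot-product attention (illustrative sketch, not the aten/Triton op).
import math

import torch
import torch.nn.functional as F


def scaled_dot_attention_reference(q, k, v, dropout_p=0.0):
    # q, k, v: (batch, num_heads, seq_len, head_dim)
    scores = q @ k.transpose(-2, -1) / math.sqrt(q.size(-1))  # QK^T / sqrt(d)
    attn = F.softmax(scores, dim=-1)         # attention weights
    if dropout_p > 0.0:
        attn = F.dropout(attn, p=dropout_p)  # optional dropout on the weights
    return attn @ v                          # weighted sum of values


q = k = v = torch.randn(2, 8, 128, 64)
print(scaled_dot_attention_reference(q, k, v).shape)  # torch.Size([2, 8, 128, 64])
```

A Triton implementation fuses these steps into a single GPU kernel; per the summary above, this diff only registers the dummy aten entry point for that path without a Python binding, so correctness testing is left for a follow-up.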