mkldnn gelu #53615
Conversation
💊 CI failures summary and remediations
As of commit 2821f7d (more details on the Dr. CI page): 💚 Looks good so far! There are no failures yet. 💚
This comment was automatically generated by Dr. CI.
aten/src/ATen/native/mkldnn/Relu.cpp
Outdated

Tensor mkldnn_gelu(const Tensor& input) {
  if (input.scalar_type() == ScalarType::BFloat16) {
    TORCH_CHECK(mkldnn_bf16_device_check(),
        "mkldnn_relu: bf16 path needs the cpu support avx512bw, avx512vl and avx512dq");
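The op being added above computes the exact (erf-based) GELU. As a hedged sketch, here is a pure-Python reference of that formula, GELU(x) = x · Φ(x), where Φ is the standard normal CDF; the function name `gelu_reference` is illustrative, not part of the PR:

```python
import math

def gelu_reference(x: float) -> float:
    # Exact GELU: x * Phi(x), where Phi is the standard normal CDF,
    # written via erf: Phi(x) = 0.5 * (1 + erf(x / sqrt(2)))
    return 0.5 * x * (1.0 + math.erf(x / math.sqrt(2.0)))

# GELU(0) is exactly 0; GELU(1) equals Phi(1) ≈ 0.8413
print(gelu_reference(0.0), gelu_reference(1.0))
```

This is the same mathematical definition the MKL-DNN kernel implements; the kernel differs only in being vectorized and supporting bf16 on AVX-512 hardware.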
Is it OK that the function is named mkldnn_gelu while this message says mkldnn_relu? Typo?
[IGNORE] [Testing if I'm an authorized reviewer]
@Krovatkin has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.
Codecov Report
@@            Coverage Diff             @@
##           master   #53615      +/-   ##
==========================================
+ Coverage   77.70%   77.71%   +0.01%
==========================================
  Files        1959     1959
  Lines      195300   195324      +24
==========================================
+ Hits       151755   151805      +50
+ Misses      43545    43519      -26
Can you add an in-place gelu as well?
@Krovatkin merged this pull request in aeaa91b.
Summary: Fixes #{issue number}

Pull Request resolved: pytorch#53615
Reviewed By: anjali411
Differential Revision: D28154396
Pulled By: Krovatkin
fbshipit-source-id: 7a9d4d37dc06e54e3249c531a034667b5a2afc46
Summary: Enable Gelu bf16/fp32 in the CPU path using the Mkldnn implementation. Users no longer need to call to_mkldnn() explicitly. The new Gelu fp32 path performs better than the original one. Adds Gelu backward for #53615.

Pull Request resolved: #58525
Reviewed By: ejguan
Differential Revision: D29940369
Pulled By: ezyang
fbshipit-source-id: df9598262ec50e5d7f6e96490562aa1b116948bf
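The follow-up adds the GELU backward pass. For reference, the analytic gradient of GELU(x) = x · Φ(x) is Φ(x) + x · φ(x), with φ the standard normal PDF. The sketch below (function names are illustrative, not from the PR) checks that formula against a central-difference numerical gradient:

```python
import math

def gelu(x: float) -> float:
    # Exact GELU: x * Phi(x)
    return 0.5 * x * (1.0 + math.erf(x / math.sqrt(2.0)))

def gelu_grad(x: float) -> float:
    # d/dx [x * Phi(x)] = Phi(x) + x * phi(x),
    # where phi(x) = exp(-x^2 / 2) / sqrt(2 * pi) is the normal PDF
    cdf = 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))
    pdf = math.exp(-0.5 * x * x) / math.sqrt(2.0 * math.pi)
    return cdf + x * pdf

# Sanity check: compare against a central-difference approximation
h = 1e-6
for x in (-1.3, 0.0, 0.7, 2.5):
    numeric = (gelu(x + h) - gelu(x - h)) / (2.0 * h)
    print(x, abs(numeric - gelu_grad(x)) < 1e-5)
```

At x = 0 the gradient is exactly Φ(0) = 0.5, a quick way to spot sign errors in a backward kernel.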