
mkldnn gelu #53615

Closed · wants to merge 5 commits

Conversation

Krovatkin (Contributor):
Fixes #{issue number}

@facebook-github-bot facebook-github-bot added cla signed oncall: jit Add this issue/PR to JIT oncall triage queue labels Mar 9, 2021
facebook-github-bot (Contributor) commented Mar 9, 2021:

💊 CI failures summary and remediations

As of commit 2821f7d (more details on the Dr. CI page):


💚 💚 Looks good so far! There are no failures yet. 💚 💚


This comment was automatically generated by Dr. CI.

Tensor mkldnn_gelu(const Tensor& input) {
  if (input.scalar_type() == ScalarType::BFloat16) {
    TORCH_CHECK(mkldnn_bf16_device_check(),
        "mkldnn_relu: bf16 path needs the cpu support avx512bw, avx512vl and avx512dq");
Contributor:

Is it OK that the function is named mkldnn_gelu while this message says mkldnn_relu? Typo?

eellison (Contributor) left a comment:

[IGNORE] [Testing if im authorized reviewer]

@Krovatkin Krovatkin requested a review from ezyang as a code owner May 3, 2021 17:27
@Krovatkin Krovatkin changed the title [WIP] mkldnn gelu mkldnn gelu May 3, 2021
facebook-github-bot (Contributor):

@Krovatkin has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.


codecov bot commented May 4, 2021:

Codecov Report

Merging #53615 (2821f7d) into master (264d879) will increase coverage by 0.01%.
The diff coverage is 86.20%.

@@            Coverage Diff             @@
##           master   #53615      +/-   ##
==========================================
+ Coverage   77.70%   77.71%   +0.01%     
==========================================
  Files        1959     1959              
  Lines      195300   195324      +24     
==========================================
+ Hits       151755   151805      +50     
+ Misses      43545    43519      -26     

eellison (Contributor) left a comment:

Can you add inplace gelu as well?

facebook-github-bot (Contributor):

@Krovatkin merged this pull request in aeaa91b.

krshrimali pushed a commit to krshrimali/pytorch that referenced this pull request May 19, 2021
Summary:
Fixes #{issue number}

Pull Request resolved: pytorch#53615

Reviewed By: anjali411

Differential Revision: D28154396

Pulled By: Krovatkin

fbshipit-source-id: 7a9d4d37dc06e54e3249c531a034667b5a2afc46
facebook-github-bot pushed a commit that referenced this pull request Aug 3, 2021
Summary:
Enable GELU bf16/fp32 in the CPU path using the MKL-DNN implementation. The user doesn't need to call to_mkldnn() explicitly. The new GELU fp32 path performs better than the original one.

Add Gelu backward for #53615.

Pull Request resolved: #58525

Reviewed By: ejguan

Differential Revision: D29940369

Pulled By: ezyang

fbshipit-source-id: df9598262ec50e5d7f6e96490562aa1b116948bf
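For context, the operation these PRs accelerate is the exact (erf-based) GELU, GELU(x) = x · Φ(x), where Φ is the standard normal CDF. Below is a minimal pure-Python reference sketch of that formula; it is an illustration only, not the PR's code (the actual kernels dispatch to the MKL-DNN/oneDNN library):

```python
import math

def gelu(x: float) -> float:
    # Exact GELU: x * Phi(x), with Phi(x) = 0.5 * (1 + erf(x / sqrt(2))).
    return x * 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

# GELU passes large positive inputs through almost unchanged and
# squashes large negative inputs toward zero.
print(gelu(0.0))   # 0.0, since Phi(0) = 0.5 and 0 * 0.5 = 0
print(gelu(3.0))   # close to 3.0
print(gelu(-3.0))  # slightly negative, close to 0.0
```

This scalar form is what an elementwise GELU kernel applies to every tensor entry; vectorized bf16/fp32 variants (as in this PR) compute the same function with hardware-specific instructions.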
Labels
cla signed Merged oncall: jit Add this issue/PR to JIT oncall triage queue

4 participants