[INTEL MKL] Add MKL Conv + Bias + LeakyRelu Fusion #42856
Conversation
Thank you for the PR! I have a few comments.
Note to self: Don't pull this in until #42173 is merged.
@@ -1588,6 +1589,7 @@ REGISTER_TEST_ALL_TYPES(NodeRewrite_FusedConv2D_Positive1);
      "i:1, i:1} } }" \
      " attr { key: 'fused_ops' value { list: {s: 'Relu'} } }" \
      " attr { key: 'epsilon' value { f: 0.001 }}" \
+     " attr { key: 'leakyrelu_alpha' value { f: 0.2 }}" \
Nit: would you mind moving the \ at the end to align with the rest of the lines? Thank you!
The \ are aligned.
auto activate = s.WithOpName("activation");
auto fetch = s.WithOpName("fetch");
if (activation == "Relu") {
  ops::Identity(fetch, ops::Relu(activate, add));
} else if (activation == "Relu6") {
  ops::Identity(fetch, ops::Relu6(activate, add));
} else if (activation == "Elu") {
  ops::Identity(fetch, ops::Elu(activate, add));
} else if (activation == "LeakyRelu") {
  ops::Identity(fetch, ops::internal::LeakyRelu(activate, add));
} else {
  DCHECK(activation == "None");
  ops::Identity(fetch, add);
}
The logic is the same in all three cases (AddN, AddV2, Add). Can we refactor this to a function (or a lambda function)?
Thanks for the advice. I have rewritten this part of the code.
Thank you for the quick fixes!
#42173 is merged. Could you please resolve the conflicts? Thank you!
The conflicts are fixed.
Thank you again for the PR!
Add Conv + Bias + LeakyRelu fusion MKL implementation