GPU fusion code cleanup: Extract GpuInstructionFusion::IsFusible into gpu_fusible.cc #25842
Conversation
… gpu_fusible.cc Tested using //tensorflow/compiler/xla/service/gpu/tests:gpu_fusion_test
Thanks for your pull request. It looks like this may be your first contribution to a Google open source project (if not, look below for help). Before we can look at your pull request, you'll need to sign a Contributor License Agreement (CLA).
📝 Please visit https://cla.developers.google.com/ to sign. Once you've signed (or fixed any issues), please reply here (e.g. "I signed it!") and we'll verify it.
What to do if you already signed the CLA:
- Individual signers
- Corporate signers
ℹ️ Googlers can find more info about SignCLA and this PR by following this link. |
I have signed the CLA |
@sana-damani please sign CLA |
https://cla.developers.google.com/clas shows that I signed it on Feb 18.
Sana |
Tested using //tensorflow/compiler/xla/service/gpu/tests:gpu_fusion_test and //tensorflow/python/compiler
@rthadur Any idea what we're waiting on here? |
Looks like it needs another review before it can be merged. This is likely because I committed a new change before the merge was done. |
@rthadur I still see the "Need a CLA for one or more commit authors" line even though @sana-damani says they've signed the CLA. |
@sanjoy I believe there were multiple people who contributed to this PR. @sana-damani, do you know if there is one more contributor for this PR? |
I was the only contributor. |
Hi @sana-damani, did you use the same username and email address for your commits? I see "sdamani" as the username on the commits. |
Thanks for pointing that out! I'll try and fix the commit author. |
I have signed the CLA with the updated username |
A Googler has manually verified that the CLAs look good. (Googler, please make sure the reason for overriding the CLA status is clearly documented in these comments.) ℹ️ Googlers: Go here for more info. |
Is the Windows Bazel GPU build failure a real failure? If so, is there a way that I can reproduce and debug this failure? |
I don't think the windows failure is related. |
bool IsInputFusible(const HloInstruction& instr) {
  // Input fusion only handles non-elemental reduction and scatter operations.
  return IsInputFusibleReduction(instr) ||
         instr.opcode() == HloOpcode::kScatter;
}
The function works for unfused Scatter ops but not for existing Scatter input fusions. You want to add something like this:

(instr.opcode() == HloOpcode::kFusion &&
 instr.fusion_kind() == HloInstruction::FusionKind::kInput &&
 instr.fused_expression_root()->opcode() == HloOpcode::kScatter)
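To make the review point concrete, here is a standalone sketch of the corrected predicate. The `Instr` struct and the `Opcode`/`FusionKind` enums are hypothetical stand-ins for XLA's `HloInstruction` API (the real declarations live in `hlo_instruction.h`); only the predicate logic mirrors the suggested fix, which accepts existing scatter input fusions in addition to unfused scatter ops:

```cpp
#include <cassert>

// Hypothetical stand-ins for XLA's HloOpcode and FusionKind enums.
enum class Opcode { kReduce, kScatter, kFusion, kAdd };
enum class FusionKind { kLoop, kInput, kOutput, kCustom };

// Minimal model of an HLO instruction: its opcode, its fusion kind, and
// the opcode of the fused computation's root (meaningful only for kFusion).
struct Instr {
  Opcode opcode;
  FusionKind fusion_kind = FusionKind::kLoop;
  Opcode fused_root = Opcode::kAdd;
};

// Stand-in for IsInputFusibleReduction: true for an unfused reduce op or
// an input fusion rooted at a reduce.
bool IsInputFusibleReduction(const Instr& i) {
  return i.opcode == Opcode::kReduce ||
         (i.opcode == Opcode::kFusion &&
          i.fusion_kind == FusionKind::kInput &&
          i.fused_root == Opcode::kReduce);
}

// IsInputFusible with the reviewer's fix applied: besides reductions and
// unfused scatter ops, an existing scatter input fusion also qualifies.
bool IsInputFusible(const Instr& i) {
  return IsInputFusibleReduction(i) ||
         i.opcode == Opcode::kScatter ||
         (i.opcode == Opcode::kFusion &&
          i.fusion_kind == FusionKind::kInput &&
          i.fused_root == Opcode::kScatter);
}
```

Note that without the last clause, a scatter that has already been wrapped in an input fusion would wrongly report false, which is exactly the gap the review identifies.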
@@ -150,8 +150,7 @@ bool IsLoopFusible(const HloInstruction& instr) {
          instr.opcode() == HloOpcode::kConcatenate ||
          instr.opcode() == HloOpcode::kDynamicSlice ||
          instr.opcode() == HloOpcode::kDynamicUpdateSlice ||
-         (instr.opcode() == HloOpcode::kFusion &&
-          instr.fusion_kind() == HloInstruction::FusionKind::kLoop) ||
+         instr.opcode() == HloOpcode::kFusion ||
Existing input fusions are not loop-fusible, hence we should return false as we did before. Please update IsInputFusible as suggested in my other comment instead.
What about output and custom fusion nodes? Should they also be disallowed from loop fusion?
Yes, that's a good point. We want to check for FusionKind::kLoop here.
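The agreed outcome of this thread can be sketched in isolation. As before, `Instr`, `Opcode`, and `FusionKind` are hypothetical stand-ins for the real `HloInstruction` API; the point is only the guard: an existing fusion node is loop-fusible solely when its kind is `kLoop`, so input, output, and custom fusions are all rejected:

```cpp
#include <cassert>

// Hypothetical stand-ins for XLA's HloOpcode and FusionKind enums.
enum class Opcode { kFusion, kConcatenate, kDynamicSlice };
enum class FusionKind { kLoop, kInput, kOutput, kCustom };

// Minimal model of an HLO instruction for this check.
struct Instr {
  Opcode opcode;
  FusionKind fusion_kind = FusionKind::kLoop;
};

// The fusion clause of IsLoopFusible after the review: keeping the
// FusionKind::kLoop check means input, output, and custom fusion nodes
// all return false here, matching the pre-refactoring behavior.
bool FusionIsLoopFusible(const Instr& i) {
  return i.opcode == Opcode::kFusion &&
         i.fusion_kind == FusionKind::kLoop;
}
```

Dropping the kind check, as the intermediate diff did, would have let already-formed input fusions be re-fused as loop fusions, which is the regression the reviewers caught.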
So there's good news and bad news.
👍 The good news is that everyone that needs to sign a CLA (the pull request submitter and all commit authors) have done so. Everything is all good there.
😕 The bad news is that it appears that one or more commits were authored or co-authored by someone other than the pull request submitter. We need to confirm that all authors are ok with their commits being contributed to this project. Please have them confirm that here in the pull request.
Note to project maintainer: This is a terminal state, meaning the cla/google commit status will not change from this state.
ℹ️ Googlers: Go here for more info. |
@@ -46,6 +55,10 @@ bool IsReduceInputFusion(const HloInstruction& instr);
 // is either an unfused reduction-to-vector op or a reduce input fusion.
 bool IsInputFusibleReduction(const HloInstruction& instr);

+// Whether `instr` is fusible as root of a reduce input fusions, i.e. `instr`
Please replace "reduce" with "scatter".
PiperOrigin-RevId: 236829871