
Migrate away from CustomFuseGraph #3403

Closed

Conversation

zrphercule (Member) commented Aug 8, 2019

This is basically the glow version of pytorch/tvm#72.
It will no longer use PyTorch's customFuseNode.

I will add comments indicating the copied code and fix the lint once finished.
Please don't give a detailed review until WIP is removed, but feel free to leave any big-picture opinions.

zrphercule requested review from jackm321 and yinghai and removed the request for jackm321 on Aug 8, 2019

// 1) Both are in-place ops
// 2) Consumer is in-place, producer !hasInputWriters
// 3) Producer is in-place, consumer !hasOutputWriters
REQ(aliasDb.moveAfterTopologicallyValid(consumer, producer));

jackm321 (Contributor) commented Aug 8, 2019

It looks like this actually moves the consumer after the producer. Is there a reason we want to do that rather than just check whether it's possible with couldMoveAfterTopologically?

zrphercule (Author, Member) commented Aug 9, 2019

I guess there is not.

zrphercule force-pushed the zrphercule:custom_node branch from a4a29e4 to 3db0698 on Aug 12, 2019

zrphercule changed the title from "[WIP] Migrate away from CustomFuseGraph" to "Migrate away from CustomFuseGraph" on Aug 12, 2019

zrphercule (Author, Member) commented Aug 12, 2019

Migrating to a separate file now...

yinghai (Contributor) left a comment

LGTM

torch_glow/src/CMakeLists.txt: review comment (outdated, resolved)
facebook-github-bot left a comment

@zrphercule has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.

accept_all_ops = False
if (expected_fused_ops == None):
    expected_fused_ops = []
    accept_all_ops = True

jackm321 (Contributor) commented Aug 13, 2019

I don't think this is a good idea. The point of expected_fused_ops is to check that the things that should be getting fused are getting fused. Why do we need to have this wildcard?

zrphercule (Author, Member) commented Aug 13, 2019

It is because we want to use jitVsGlow not only here in the unit tests, but also for testing bigger models like xray (I actually have a script that tests the xray model locally using jitVsGlow), and we cannot easily list all the ops in a big model like xray.
My thinking is that this functionality is an extra: if we want to check the fused ops, we pass them in and they get checked; otherwise we just don't check.

jackm321 (Contributor) commented Aug 13, 2019

OK, I guess my two thoughts are that:

  1. I'd like to make sure all the operator tests use this, so having op checking on by default (as opposed to this, where it's basically off by default) would be preferred. Maybe we can pass an extra bool flag to disable it?
  2. Shouldn't we be checking the ops in the xray model anyway? We want to make sure we're running what we expect to be running.

zrphercule (Author, Member) commented Aug 13, 2019

I agree with you on both points. So our decision is:

  1. A separate parameter, default False, to control whether all ops should be accepted.
  2. Once we have official bigger-model unit tests in our code base, they should also have a list of expected ops (and of course future operator unit tests should have this as well).

Any comments?

jackm321 (Contributor) commented Aug 13, 2019

That sounds good to me
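
To illustrate the design agreed on above, here is a minimal sketch of how a jitVsGlow-style helper could keep op checking on by default while letting big-model tests opt out. The function name check_fused_ops and the accept_all_ops flag are illustrative assumptions, not the actual code added in this PR.

def check_fused_ops(fused_graph_ops, expected_fused_ops, accept_all_ops=False):
    # Hypothetical helper: verifies that every op the test expects to be
    # fused actually shows up in the glow fusion group. Checking is on by
    # default; accept_all_ops is the opt-out for big models (e.g. xray)
    # where listing every expected op is impractical.
    if accept_all_ops:
        return
    missing = set(expected_fused_ops) - set(fused_graph_ops)
    assert not missing, "ops expected to be fused were not: {}".format(missing)

# Operator unit tests list their ops explicitly:
check_fused_ops({"aten::add", "aten::relu"}, ["aten::add", "aten::relu"])

# A big-model test opts out of the per-op check:
check_fused_ops({"aten::_convolution", "aten::relu"}, [], accept_all_ops=True)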

@@ -18,10 +18,17 @@
#define GLOW_TORCH_GLOW_SRC_FUSINGOPTIMIZER_H

jackm321 (Contributor) commented Aug 13, 2019

Please change this too, since the file was moved.

jackm321 (Contributor) commented Aug 13, 2019

Looking great @zrphercule! Just a couple of comments

zrphercule force-pushed the zrphercule:custom_node branch from ac29220 to 80e2a0a on Aug 13, 2019

zrphercule added a commit to zrphercule/glow that referenced this pull request Aug 13, 2019

Migrate away from CustomFuseGraph (pytorch#3403)
Summary:
This is basically the glow version of pytorch/tvm#72.
It will no longer use PyTorch's customFuseNode.

Comments indicating the copied code and lint fixes will be added once finished.
Please don't give a detailed review until WIP is removed, but feel free to leave any big-picture opinions.
Pull Request resolved: pytorch#3403

Differential Revision: D16775646

fbshipit-source-id: 7b06e24e5a76e5dcf89a72945c5806df860912ac
jackm321 (Contributor) left a comment

LGTM!

zrphercule force-pushed the zrphercule:custom_node branch from 80e2a0a to 3e3658d on Aug 13, 2019

zrphercule added a commit to zrphercule/glow that referenced this pull request Aug 13, 2019

Migrate away from CustomFuseGraph (pytorch#3403)

zrphercule force-pushed the zrphercule:custom_node branch from 3e3658d to 8896fa8 on Aug 13, 2019

zrphercule added a commit to zrphercule/glow that referenced this pull request Aug 13, 2019

Migrate away from CustomFuseGraph (pytorch#3403)

zrphercule force-pushed the zrphercule:custom_node branch from 8896fa8 to 982e8db on Aug 13, 2019

zrphercule added a commit to zrphercule/glow that referenced this pull request Aug 13, 2019

Migrate away from CustomFuseGraph (pytorch#3403)

zrphercule force-pushed the zrphercule:custom_node branch from 982e8db to 9329d1f on Aug 13, 2019

zrphercule added a commit to zrphercule/glow that referenced this pull request Aug 13, 2019

Migrate away from CustomFuseGraph (pytorch#3403)

zrphercule force-pushed the zrphercule:custom_node branch from 9329d1f to 87f8b32 on Aug 13, 2019

zrphercule deleted the zrphercule:custom_node branch on Aug 13, 2019

facebook-github-bot commented Aug 13, 2019

@zrphercule merged this pull request in 3787ca3.
