Conversation

zhuhaozhe
Collaborator

@zhuhaozhe zhuhaozhe commented Mar 31, 2023

Stack from ghstack (oldest at bottom):

Enable data type propagation at the schedule node level.
Propagation policy:
(1) ops with a dtype arg [constant, load, rand, randn] -> use that dtype directly as the node dtype
(2) ops whose semantics decide the output dtype -> use that output dtype
(all `override_return_dtype` entries in https://github.com/pytorch/pytorch/blob/master/torch/_inductor/lowering.py)
(3) other ops: promote the input nodes' dtypes, e.g. ADD(BF16, FP32) -> FP32 output.
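The three rules above can be sketched as a small dispatch function. This is an illustrative simplification only: the helper names (`promote`, `propagate_dtype`), the promotion order, and the op tables are hypothetical, not the actual `torch._inductor` implementation.

```python
# Illustrative sketch of the three propagation rules above.
# The promotion lattice and op tables here are hypothetical
# simplifications, not the actual torch._inductor code.

# Partial dtype promotion order, narrowest to widest (rule 3).
_PROMOTION_ORDER = ["int32", "int64", "bf16", "fp32", "fp64"]

def promote(*dtypes):
    """Return the widest dtype among the inputs (rule 3)."""
    return max(dtypes, key=_PROMOTION_ORDER.index)

# Rule 1: these ops carry an explicit dtype argument.
OPS_WITH_DTYPE_ARG = {"constant", "load", "rand", "randn"}

# Rule 2: ops whose semantics fix the output dtype,
# cf. override_return_dtype in torch/_inductor/lowering.py.
SEMANTIC_OUTPUT_DTYPE = {"eq": "bool", "argmax": "int64"}

def propagate_dtype(op, dtype_arg=None, input_dtypes=()):
    if op in OPS_WITH_DTYPE_ARG:      # rule 1: dtype comes from the arg
        return dtype_arg
    if op in SEMANTIC_OUTPUT_DTYPE:   # rule 2: op semantics decide it
        return SEMANTIC_OUTPUT_DTYPE[op]
    return promote(*input_dtypes)     # rule 3: promote input dtypes

print(propagate_dtype("add", input_dtypes=("bf16", "fp32")))  # fp32
```

Note the real promotion rules are richer than this linear order (e.g. mixed bf16/fp16 promotion); the sketch only shows the dispatch shape of the policy.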

cc @soumith @voznesenskym @penguinwu @anijain2305 @EikanWang @jgong5 @Guobing-Chen @XiaobingSuper @blzheng @Xia-Weiwen @wenzhe-nrv @jiayisunx @peterbell10 @desertfire

@pytorch-bot

pytorch-bot bot commented Mar 31, 2023

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/98065

Note: Links to docs will display an error until the docs builds have been completed.

❌ 2 Failures

As of commit dc5731a:

BROKEN TRUNK - The following jobs failed but were present on the merge base 148d492:

👉 Rebase onto the `viable/strict` branch to avoid these failures

This comment was automatically generated by Dr. CI and updates every 15 minutes.

@zhuhaozhe zhuhaozhe marked this pull request as draft March 31, 2023 07:42
@zhuhaozhe zhuhaozhe changed the title enable data type propogation [WIP] enable data type propogation Mar 31, 2023
@zhuhaozhe zhuhaozhe changed the title [WIP] enable data type propogation [WIP] enable data type propagation Mar 31, 2023
@zhuhaozhe zhuhaozhe requested review from EikanWang and jgong5 March 31, 2023 07:47
@zhuhaozhe zhuhaozhe added the topic: not user facing topic category label Mar 31, 2023
zhuhaozhe added a commit that referenced this pull request Mar 31, 2023
ghstack-source-id: 6719487
Pull Request resolved: #98065
Attach data type information inside scheduling nodes.
For now, just leverage CI to check whether attaching this information impacts the unit tests (UT).
Plan to do:
(1) Understand and resolve data type propagation related to "masked".
(2) Add debug print to check the data types after data type propagation.
(3) Write UT based on (1).
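A minimal sketch of what item (2) might look like. The node structure (`name`/`op`/`dtype` fields) is an illustrative stub, not the actual scheduler node API.

```python
# Hypothetical debug helper for inspecting dtypes after propagation.
# SchedulerNodeStub is an illustrative stand-in, not the real
# torch._inductor scheduler node class.
from dataclasses import dataclass

@dataclass
class SchedulerNodeStub:
    name: str
    op: str
    dtype: str  # filled in by data type propagation

def format_dtype_report(nodes):
    """One human-readable line per node, for debug printing."""
    return [f"{n.name}: op={n.op} dtype={n.dtype}" for n in nodes]

nodes = [
    SchedulerNodeStub("buf0", "load", "bf16"),
    SchedulerNodeStub("buf1", "add", "fp32"),
]
print("\n".join(format_dtype_report(nodes)))
# buf0: op=load dtype=bf16
# buf1: op=add dtype=fp32
```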

cc soumith voznesenskym penguinwu anijain2305 EikanWang jgong5 Guobing-Chen XiaobingSuper blzheng Xia-Weiwen wenzhe-nrv jiayisunx peterbell10 desertfire

zhuhaozhe added a commit that referenced this pull request Apr 3, 2023
ghstack-source-id: 8fd5604
Pull Request resolved: #98065
@zhuhaozhe
Collaborator Author

@pytorchbot rebase

@pytorchmergebot
Collaborator

@pytorchbot successfully started a rebase job. Check the current status here

@pytorchmergebot
Collaborator

Successfully rebased gh/zhuhaozhe/10/orig onto refs/remotes/origin/viable/strict, please pull locally before adding more changes (for example, via ghstack checkout https://github.com/pytorch/pytorch/pull/98065)

pytorchmergebot pushed a commit that referenced this pull request Apr 4, 2023
ghstack-source-id: 20a51e7
Pull Request resolved: #98065
zhuhaozhe added a commit that referenced this pull request Apr 6, 2023
ghstack-source-id: 8902a28
Pull Request resolved: #98065
@zhuhaozhe zhuhaozhe changed the title [WIP] enable data type propagation enable data type propagation Apr 7, 2023
@zhuhaozhe zhuhaozhe marked this pull request as ready for review April 7, 2023 02:15
@zhuhaozhe zhuhaozhe requested a review from XiaobingSuper April 7, 2023 02:23
@EikanWang
Collaborator

Have you validated this PR with E2E models?

zhuhaozhe added a commit that referenced this pull request Apr 14, 2023
ghstack-source-id: 8429d3f
Pull Request resolved: #98065
@zhuhaozhe zhuhaozhe requested a review from jgong5 April 14, 2023 07:06
zhuhaozhe added a commit that referenced this pull request Apr 14, 2023
ghstack-source-id: ea0916a
Pull Request resolved: #98065
@zhuhaozhe
Collaborator Author

@pytorchbot merge

@pytorch-bot pytorch-bot bot added the ciflow/trunk Trigger trunk jobs on your pull request label Apr 14, 2023
@pytorchmergebot
Collaborator

Merge started

Your change will be merged once all checks pass (ETA 0-4 Hours).

Learn more about merging in the wiki.

Questions? Feedback? Please reach out to the PyTorch DevX Team

Advanced Debugging
Check the merge workflow status
here

@pytorchmergebot
Collaborator

Merge failed

Reason: Command git -C /home/runner/work/pytorch/pytorch cherry-pick -x 60a8a56f1a5a80600d22a55f4d824c9b0f26e23a returned non-zero exit code 1

Auto-merging test/inductor/test_torchinductor.py
CONFLICT (content): Merge conflict in test/inductor/test_torchinductor.py
error: could not apply 60a8a56f1a5... enable data type propogation
hint: After resolving the conflicts, mark them with
hint: "git add/rm <pathspec>", then run
hint: "git cherry-pick --continue".
hint: You can instead skip this commit with "git cherry-pick --skip".
hint: To abort and get back to the state before "git cherry-pick",
hint: run "git cherry-pick --abort".

zhuhaozhe added a commit that referenced this pull request Apr 17, 2023
ghstack-source-id: 42c0a7f
Pull Request resolved: #98065
@EikanWang
Collaborator

@zhuhaozhe , please check the failed cases.

@zhuhaozhe
Collaborator Author

@pytorchbot rebase

@pytorchmergebot
Collaborator

@pytorchbot successfully started a rebase job. Check the current status here

@pytorchmergebot
Collaborator

Successfully rebased gh/zhuhaozhe/10/orig onto refs/remotes/origin/viable/strict, please pull locally before adding more changes (for example, via ghstack checkout https://github.com/pytorch/pytorch/pull/98065)

pytorchmergebot pushed a commit that referenced this pull request Apr 17, 2023
ghstack-source-id: bdb4764
Pull Request resolved: #98065
zhuhaozhe added a commit that referenced this pull request Apr 17, 2023
ghstack-source-id: a4ce6a0
Pull Request resolved: #98065
@zhuhaozhe
Collaborator Author

@pytorchbot merge

@pytorchmergebot
Collaborator

Merge started

Your change will be merged once all checks pass (ETA 0-4 Hours).


@facebook-github-bot facebook-github-bot deleted the gh/zhuhaozhe/10/head branch June 8, 2023 19:28

Projects

Status: Done


6 participants