
[ONNX] Use optional op to keep None in results for ONNX internal tests #84789

Closed
wants to merge 13 commits

Conversation

Collaborator

@titaiwangms commented Sep 9, 2022

Stack from ghstack (oldest at bottom):

PyTorch and ONNX have long used different strategies for None values in model outputs. In the internal tests, we flatten the PyTorch outputs and check whether the remaining values match. However, this no longer works for scripting after the Optional node was introduced, since some of the Nones are now kept.

#83184 forces the script module to keep all Nones from PyTorch, but in ONNX the model only keeps the Nones generated with an Optional node and deletes the meaningless ones.

This PR uses the Optional node to keep those meaningless Nones in the output as well, so that when script-module results are compared, PyTorch and ONNX produce the same number of Nones.
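The counting problem can be illustrated with a small, self-contained sketch. The names below are hypothetical stand-ins (plain floats instead of tensors, a toy `flatten_drop_none` instead of the actual test utilities):

```python
def flatten_drop_none(outputs):
    # Old comparison strategy: drop every None before comparing values.
    return [o for o in outputs if o is not None]

# Scripted PyTorch keeps every None in its outputs (after #83184) ...
torch_outputs = (1.0, None, 2.0, None)
# ... but the exported ONNX model only keeps Nones backed by an Optional node.
onnx_outputs = (1.0, None, 2.0)

# Dropping Nones makes the two result sets look identical ...
assert flatten_drop_none(torch_outputs) == flatten_drop_none(onnx_outputs)
# ... yet a positional comparison fails: the output counts differ.
assert len(torch_outputs) != len(onnx_outputs)
```

With the change in this PR, the remaining None would also be exported through an Optional node, so both output tuples have the same length and a positional comparison can succeed.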

cc @EikanWang @jgong5 @wenzhe-nrv @sanchitintel

pytorch-bot bot commented Sep 9, 2022

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/84789

Note: Links to docs will display an error until the docs builds have been completed.

✅ No Failures

As of commit fe480ce:
💚 Looks good so far! There are no failures yet. 💚

This comment was automatically generated by Dr. CI and updates every 15 minutes.

@pytorch-bot pytorch-bot bot added the release notes: onnx torch.onnx related changes that should show up in the release notes label Sep 9, 2022
@facebook-github-bot facebook-github-bot added cla signed oncall: jit Add this issue/PR to JIT oncall triage queue labels Sep 9, 2022
@titaiwangms titaiwangms changed the title [ONNX] Use optional op to keep None in results for internal tests [WIP][ONNX] Use optional op to keep None in results for internal tests Sep 9, 2022
@titaiwangms titaiwangms added the module: onnx Related to torch.onnx label Sep 9, 2022
@titaiwangms titaiwangms linked an issue Sep 9, 2022 that may be closed by this pull request
titaiwangms added a commit that referenced this pull request Sep 9, 2022
ghstack-source-id: 0c21a08c9ffe1a389d743d7c5246254381fdb5b2
Pull Request resolved: #84789
@titaiwangms titaiwangms changed the title [WIP][ONNX] Use optional op to keep None in results for internal tests [ONNX] Use optional op to keep None in results for internal tests Sep 10, 2022
titaiwangms added a commit that referenced this pull request Sep 15, 2022
ghstack-source-id: 2c539bc8431413af56007110fb8139ca9619771e
Pull Request resolved: #84789
int opset_version);

// Replace None in output with an Optional node (opset >= 15) if it is a
// script model. This helps align the output format in internal tests.
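The pass this comment describes can be sketched in pure Python (hypothetical names; the real pass rewrites the JIT graph in C++):

```python
class EmptyOptional:
    """Stand-in for an ONNX Optional node constructed with no input,
    i.e. a placeholder that carries a None through the graph outputs."""
    def __repr__(self):
        return "Optional()"

def replace_none_with_optional(outputs):
    # Wrap each None output in an empty Optional so it survives export
    # instead of being dropped from the ONNX graph.
    return [EmptyOptional() if o is None else o for o in outputs]

outs = replace_none_with_optional([1.0, None, 2.0])
assert len(outs) == 3  # the None survives as an Optional placeholder
```

In the real export, the consumer unwraps the empty Optional back into a None, so the scripted module and the ONNX model report the same number of outputs.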
Member

Please clarify "internal tests". Which tests?

Collaborator Author

Done

@titaiwangms titaiwangms changed the title [ONNX] Use optional op to keep None in results for internal tests [ONNX] Use optional op to keep None in results for ONNX internal tests Sep 23, 2022
@justinchuby justinchuby mentioned this pull request Sep 25, 2022
titaiwangms added a commit that referenced this pull request Sep 27, 2022
ghstack-source-id: 2090c4a36b66b105314fd326a6c1a733d41af9ba
Pull Request resolved: #84789
@facebook-github-bot
Contributor

/easycla

As part of the transition to the PyTorch Foundation, this project now requires contributions be covered under the new CLA. See #85559 for additional details.

This comment will trigger a new check of this PR. If you are already covered, you will simply see a new "EasyCLA" check that passes. If you are not covered, a bot will leave a new comment with a link to sign.

linux-foundation-easycla bot commented Oct 4, 2022

CLA Signed

The committers listed above are authorized under a signed CLA.

titaiwangms added a commit that referenced this pull request Oct 5, 2022
ghstack-source-id: 6ba264eea7f048efe595c154a1ffc7ee474963d2
Pull Request resolved: #84789
@titaiwangms titaiwangms closed this Oct 6, 2022
@pytorch-bot pytorch-bot bot added the ciflow/trunk Trigger trunk jobs on your pull request label Oct 25, 2022
titaiwangms added a commit that referenced this pull request Oct 25, 2022
ghstack-source-id: f2bfa2003b57db97c1823794f9ad94a213b547ef
Pull Request resolved: #84789
titaiwangms added a commit that referenced this pull request Dec 5, 2022
ghstack-source-id: 320ca8ea5751bb8cb967f86beb67ccb1bf7dbd93
Pull Request resolved: #84789
titaiwangms added a commit that referenced this pull request Dec 5, 2022
ghstack-source-id: d7980fbee2b1f42973b32374cdfdd0007767c50c
Pull Request resolved: #84789
titaiwangms added a commit that referenced this pull request Dec 5, 2022
ghstack-source-id: 224c02ec64790b4fe8451c1487f8b7d2910b3d19
Pull Request resolved: #84789
titaiwangms added a commit that referenced this pull request Dec 6, 2022
ghstack-source-id: eb70de774a659b9d7e3f1ae7fa5fae71c8ae60f4
Pull Request resolved: #84789
@titaiwangms titaiwangms mentioned this pull request Dec 6, 2022
titaiwangms added a commit that referenced this pull request Dec 6, 2022
ghstack-source-id: eb70de774a659b9d7e3f1ae7fa5fae71c8ae60f4
Pull Request resolved: #84789
github-actions bot commented Feb 5, 2023

Looks like this PR hasn't been updated in a while so we're going to go ahead and mark this as Stale.
Feel free to remove the Stale label if you feel this was a mistake.
If you are unable to remove the Stale label please contact a maintainer in order to do so.
If you want the bot to never mark this PR stale again, add the no-stale label.
Stale pull requests will automatically be closed after 30 days of inactivity.

@github-actions github-actions bot added the Stale label Feb 5, 2023
titaiwangms added a commit that referenced this pull request Feb 5, 2023
ghstack-source-id: 0a2e05bc716a76e9f5edb25e686a3548f9ecc300
Pull Request resolved: #84789
@titaiwangms titaiwangms added no-stale and removed Stale labels Feb 5, 2023
titaiwangms added a commit that referenced this pull request Feb 8, 2023
ghstack-source-id: 4272457a55ebdb618e4028406295fb5c8bb540db
Pull Request resolved: #84789
@titaiwangms
Collaborator Author

@pytorchbot merge -g

@pytorchmergebot
Collaborator

Merge started

Your change will be merged once all checks pass (ETA 0-4 Hours).

Learn more about merging in the wiki.

Questions? Feedback? Please reach out to the PyTorch DevX Team

Advanced Debugging
Check the merge workflow status here.

@titaiwangms titaiwangms added the topic: improvements topic category label Feb 21, 2023
@facebook-github-bot facebook-github-bot deleted the gh/AllenTiTaiWang/8/head branch June 8, 2023 14:23
Labels
ciflow/trunk (Trigger trunk jobs on your pull request)
cla signed
Merged
module: onnx (Related to torch.onnx)
no-stale
oncall: jit (Add this issue/PR to JIT oncall triage queue)
open source
release notes: onnx (torch.onnx related changes that should show up in the release notes)
topic: improvements (topic category)
Projects
None yet
Development

Successfully merging this pull request may close these issues.

[ONNX] ONNX output ignores None while Pytorch doesn't.
6 participants