Fail fast when dynamo attempts to add unspecialized int/float as additional graph inputs #96786
Conversation
🔗 Helpful Links: 🧪 See artifacts and rendered test results at hud.pytorch.org/pr/96786
Note: Links to docs will display an error until the docs builds have been completed.
✅ No Failures as of commit d9ebc99.
This comment was automatically generated by Dr. CI and updates every 15 minutes.
This pull request was exported from Phabricator. Differential Revision: D44075910
Fail fast when dynamo attempts to add unspecialized int/float as additional graph inputs (pytorch#96786)

Summary: Pull Request resolved: pytorch#96786

Verified that the changes catch unspecialized ints/floats being added as additional graph inputs in D44037548, prior to PR pytorch#95621. However, with pytorch#95621 the issue originally being solved is no longer valid, because ints and floats in `forward` will always be specialized in export. This PR adds the assertion anyway *(though it should not be hit unless there is a regression)* to immediately catch any attempt to add an unspecialized int/float to the additional graphargs.

Test Plan: An example of the error message:
```
Dynamo attempts to add additional input: value=9.999999747378752e-06, source=NNModuleSource(inner=AttrSource(base=NNModuleSource(inner=AttrSource(base=LocalInputSource(local_name='self', pos=0), member='torch_module')), member='eps'))
```
Passed all export tests:
```
Buck UI: https://www.internalfb.com/buck2/fea72653-5549-47e7-a9bf-740eb86a8e26
Test UI: https://www.internalfb.com/intern/testinfra/testrun/8725724422167257
RE: reSessionID-7b3470b1-c293-4c4a-9671-dd0b7a2839b8
Up: 6.0 KiB  Down: 0 B
Jobs completed: 101. Time elapsed: 115.7s.
Tests finished: Pass 98. Fail 0. Fatal 0. Skip 0. 0 builds failed
```
Differential Revision: D44075910
fbshipit-source-id: abe0295fc1f2e448961ce7a0040dcd94a7744bc2
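The fail-fast behavior described above can be sketched roughly like this. This is a minimal, hypothetical sketch, not the actual `torch._dynamo` code: the function name `add_graph_input` and its signature are illustrative assumptions, and the real check lives inside dynamo's variable-building machinery. Only the shape of the idea is shown, raising immediately with the offending value and its source when an unspecialized Python int/float would become an extra graph input.

```python
# Hypothetical sketch of the fail-fast assertion (not the real torch._dynamo API).

def add_graph_input(value, source_name):
    """Register `value` as an additional graph input.

    Per the PR's reasoning: after pytorch#95621, plain Python ints/floats
    are always specialized in export, so one reaching this point would mean
    a regression. Fail fast instead of silently adding a graph input.
    (bool is excluded because isinstance(True, int) is True in Python.)
    """
    if isinstance(value, (int, float)) and not isinstance(value, bool):
        raise AssertionError(
            f"Dynamo attempts to add additional input: "
            f"value={value}, source={source_name}"
        )
    return {"value": value, "source": source_name}
```

Anything that is not a bare int/float (e.g. a tensor placeholder) passes through unchanged; an unspecialized scalar produces an error message in the shape shown in the Test Plan.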
Force-pushed from a8e99a1 to c31cc97
Force-pushed from c31cc97 to 98c98d6
Force-pushed from 98c98d6 to 1bd01f4
LGTM
I think this does not do the right thing for constant source. You only want to error IF we would have added a GraphArg. So push the assert lower down to the point where we actually add a grapharg. In particular, if you have a constant source you will not actually add a graph arg.
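The reviewer's point, that the assertion should fire only where a `GraphArg` is actually created, so values backed by a constant source never trip it, can be sketched as follows. This is a hypothetical illustration, not actual dynamo code: the `Source`/`ConstantSource` classes and `wrap_value` are simplified stand-ins for dynamo's source hierarchy and variable builder.

```python
# Hypothetical sketch of the review suggestion (not the real torch._dynamo code):
# push the unspecialized int/float assertion down to the point where a GraphArg
# is actually appended, so constant sources are exempt.

class Source:
    def is_constant(self):
        return False

class ConstantSource(Source):
    def is_constant(self):
        return True

def wrap_value(value, source, graphargs):
    # Constant sources are inlined into the graph; no GraphArg is ever
    # added for them, so no error should be raised on this path.
    if source.is_constant():
        return ("constant", value)
    # Only here are we actually about to add a GraphArg -- this is where
    # the fail-fast assertion belongs.
    assert not isinstance(value, (int, float)) or isinstance(value, bool), (
        f"Dynamo attempts to add additional input: value={value}"
    )
    graphargs.append((value, source))
    return ("grapharg", value)
```

Placing the check at the append site rather than earlier means the same float (e.g. an `eps` attribute) is fine when it arrives via a constant source, and only errors when it would genuinely become an extra graph input.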
Force-pushed from 1bd01f4 to e96b7d3
Force-pushed from e96b7d3 to c1e9d8b
Force-pushed from c1e9d8b to d402494
Force-pushed from d402494 to 03b06cc
Force-pushed from 03b06cc to d9ebc99
@pytorchbot merge (Initiating merge automatically since Phabricator Diff has merged)
Merge started. Your change will be merged once all checks pass (ETA 0-4 hours). Learn more about merging in the wiki. Questions? Feedback? Please reach out to the PyTorch DevX Team.
Fail fast when dynamo attempts to add unspecialized int/float as additional graph inputs (#96786)

Summary: Verified that the changes catch unspecialized ints/floats being added as additional graph inputs in D44037548, prior to PR pytorch/pytorch#95621. However, with #95621 the issue originally being solved is no longer valid, because ints and floats in `forward` will always be specialized in export. This PR adds the assertion anyway *(though it should not be hit unless there is a regression)* to immediately catch any attempt to add an unspecialized int/float to the additional graphargs.

Test Plan: as in the exported commit message above; passed all export tests.

Differential Revision: D44075910
Pull Request resolved: pytorch/pytorch#96786
Approved by: https://github.com/tugsbayasgalan, https://github.com/ezyang
Summary:
Verified that the changes catch unspecialized ints/floats being added as additional graph inputs in D44037548, prior to PR #95621. However, with #95621 the issue originally being solved is no longer valid, because ints and floats in `forward` will always be specialized in export. This PR adds the assertion anyway (though it should not be hit unless there is a regression) to immediately catch any attempt to add an unspecialized int/float to the additional graphargs.

Test Plan:
An example of the error message appears in the commit message above. Passed all export tests.

Differential Revision: D44075910
cc @voznesenskym @penguinwu @anijain2305 @EikanWang @jgong5 @Guobing-Chen @XiaobingSuper @zhuhaozhe @blzheng @Xia-Weiwen @wenzhe-nrv @jiayisunx @chenyang78 @aakhundov @soumith @desertfire