
Fail fast when dynamo attempts to add unspecialized int/float as additional graph inputs #96786

Closed
wants to merge 1 commit

Conversation


@guangy10 guangy10 commented Mar 15, 2023

Summary:
Verified in D44037548 that the changes catch unspecialized ints/floats being added as additional graph inputs, prior to PR #95621.

However, with #95621 the issue originally being solved is no longer present, because ints and floats in `forward` will always be specialized in export. This PR adds the assertion anyway (it should not be hit unless there is a regression) so that any attempt to add an unspecialized int/float to the additional graphargs is caught immediately.
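
For illustration, a minimal sketch of the kind of fail-fast check described above; the helper name, the `is_export` flag, and the call site are assumptions made for this example, not the actual torch/_dynamo code:

```python
from typing import Any


def assert_no_unspecialized_scalar_grapharg(is_export: bool, value: Any, source: Any) -> None:
    """Hypothetical helper: fail fast if an unspecialized int/float is about
    to become an additional graph input while tracing for export."""
    # bool is a subclass of int but is always treated as a constant, so skip it.
    if is_export and isinstance(value, (int, float)) and not isinstance(value, bool):
        raise AssertionError(
            f"Dynamo attempts to add additional input: value={value}, source={source}"
        )
```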

Test Plan:
An example of the error message:

Dynamo attempts to add additional input: value=9.999999747378752e-06, source=NNModuleSource(inner=AttrSource(base=NNModuleSource(inner=AttrSource(base=LocalInputSource(local_name='self', pos=0), member='torch_module')), member='eps'))
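
The `source` in the message above points at a plain Python float attribute (`eps`) read in `forward`. For context, a repro-style sketch of a module of that shape (made up for this example, not taken from the PR's tests); under export the float is expected to be specialized into the graph rather than lifted as an additional input:

```python
import torch
import torch._dynamo as dynamo


class Norm(torch.nn.Module):
    def __init__(self) -> None:
        super().__init__()
        self.eps = 1e-5  # plain Python float attribute, analogous to `eps` above

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # With #95621 this float is burned into the exported graph as a constant;
        # the assertion added here fails fast if it were instead lifted as an
        # additional graph input.
        return x / (x.norm() + self.eps)


gm, guards = dynamo.export(Norm(), torch.randn(4))
print(gm.graph)
```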

Passed all export tests

Buck UI: https://www.internalfb.com/buck2/fea72653-5549-47e7-a9bf-740eb86a8e26
Test UI: https://www.internalfb.com/intern/testinfra/testrun/8725724422167257
RE: reSessionID-7b3470b1-c293-4c4a-9671-dd0b7a2839b8  Up: 6.0 KiB  Down: 0 B
Jobs completed: 101. Time elapsed: 115.7s.
Tests finished: Pass 98. Fail 0. Fatal 0. Skip 0. 0 builds failed

Differential Revision: D44075910

cc @voznesenskym @penguinwu @anijain2305 @EikanWang @jgong5 @Guobing-Chen @XiaobingSuper @zhuhaozhe @blzheng @Xia-Weiwen @wenzhe-nrv @jiayisunx @chenyang78 @aakhundov @soumith @desertfire


linux-foundation-easycla bot commented Mar 15, 2023

CLA Signed

The committers listed above are authorized under a signed CLA.

  • ✅ login: guangy10 / name: Guang Yang (a8e99a1)


pytorch-bot bot commented Mar 15, 2023

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/96786

Note: Links to docs will display an error until the docs builds have been completed.

✅ No Failures

As of commit d9ebc99:
💚 Looks good so far! There are no failures yet. 💚

This comment was automatically generated by Dr. CI and updates every 15 minutes.

@facebook-github-bot

This pull request was exported from Phabricator. Differential Revision: D44075910


guangy10 added a commit to guangy10/pytorch that referenced this pull request Mar 15, 2023
Fail fast when dynamo attempts to add unspecialized int/float as additional graph inputs (pytorch#96786)
fbshipit-source-id: abe0295fc1f2e448961ce7a0040dcd94a7744bc2
guangy10 added a commit to guangy10/pytorch that referenced this pull request Mar 15, 2023
Fail fast when dynamo attempts to add unspecialized int/float as additional graph inputs (pytorch#96786)
fbshipit-source-id: c3166f4517c02aaeba057fe5d452829c127ee251
@facebook-github-bot

This pull request was exported from Phabricator. Differential Revision: D44075910

1 similar comment

guangy10 added a commit to guangy10/pytorch that referenced this pull request Mar 15, 2023
Fail fast when dynamo attempts to add unspecialized int/float as additional graph inputs (pytorch#96786)
fbshipit-source-id: 1e79caf59c74c6559e524e5a49bba057b2e2eb8b

@tugsbayasgalan tugsbayasgalan left a comment

LGTM


@ezyang ezyang left a comment


I think this does not do the right thing for constant source. You only want to error IF we would have added a GraphArg. So push the assert lower down to the point where we actually add a grapharg. In particular, if you have a constant source you will not actually add a graph arg.
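
A sketch of what the suggested placement could look like; `add_grapharg`, `is_constant_source`, and the surrounding container are illustrative stand-ins rather than the real torch/_dynamo symbols. The point is that the export-only check runs exactly where a graph input is appended, so a constant source, which never produces a graph input, cannot trip it:

```python
from dataclasses import dataclass, field
from typing import Any, List, Tuple


@dataclass
class GraphInputs:
    is_export: bool
    args: List[Tuple[Any, Any]] = field(default_factory=list)

    def add_grapharg(self, value: Any, source: Any, is_constant_source: bool) -> None:
        if is_constant_source:
            # Constant sources are inlined into the graph; no graph input is
            # created, so the export-only assertion must not fire for them.
            return
        if self.is_export and isinstance(value, (int, float)) and not isinstance(value, bool):
            # Fail fast only when an unspecialized scalar would actually become
            # an additional graph input while tracing for export.
            raise AssertionError(
                f"Dynamo attempts to add additional input: value={value}, source={source}"
            )
        self.args.append((source, value))
```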

@facebook-github-bot

This pull request was exported from Phabricator. Differential Revision: D44075910

guangy10 added a commit to guangy10/pytorch that referenced this pull request Mar 16, 2023
Fail fast when dynamo attempts to add unspecialized int/float as additional graph inputs (pytorch#96786)
Reviewed By: tugsbayasgalan
fbshipit-source-id: 968562938c8ea1fd3e065e2ee162687cbb3112fc
@facebook-github-bot

This pull request was exported from Phabricator. Differential Revision: D44075910

guangy10 added a commit to guangy10/pytorch that referenced this pull request Mar 16, 2023
Fail fast when dynamo attempts to add unspecialized int/float as additional graph inputs (pytorch#96786)
Reviewed By: tugsbayasgalan
fbshipit-source-id: a69d272f6dfcf6fa46662369b856eb6848decd5a
guangy10 pushed a commit that referenced this pull request Mar 16, 2023
Fail fast when dynamo attempts to add unspecialized int/float as additional graph inputs
Differential Revision: [D44075910](https://our.internmc.facebook.com/intern/diff/D44075910/)
[ghstack-poisoned]
guangy10 added a commit to guangy10/pytorch that referenced this pull request Mar 16, 2023
Fail fast when dynamo attempts to add unspecialized int/float as additional graph inputs (pytorch#96786)
Reviewed By: tugsbayasgalan
fbshipit-source-id: 6ed54ee91a210a9f8a5ca1548d6fc90d51c0d20f
@facebook-github-bot

This pull request was exported from Phabricator. Differential Revision: D44075910

guangy10 added a commit to guangy10/pytorch that referenced this pull request Mar 17, 2023
Fail fast when dynamo attempts to add unspecialized int/float as additional graph inputs (pytorch#96786)
Reviewed By: tugsbayasgalan
fbshipit-source-id: fc20f634cb70aa24aca0caaaeca180b1b56d0889
@facebook-github-bot

This pull request was exported from Phabricator. Differential Revision: D44075910

Fail fast when dynamo attempts to add unspecialized int/float as additional graph inputs (pytorch#96786)
Reviewed By: tugsbayasgalan
fbshipit-source-id: b7a0920eca34f25e5a5183242278219dbe5761eb
@facebook-github-bot

This pull request was exported from Phabricator. Differential Revision: D44075910

@facebook-github-bot

@pytorchbot merge

(Initiating merge automatically since Phabricator Diff has merged)

pytorch-bot bot added the `ciflow/trunk` label (Trigger trunk jobs on your pull request) Mar 17, 2023
@pytorchmergebot

Merge started

Your change will be merged once all checks pass (ETA 0-4 Hours).

Learn more about merging in the wiki.

Questions? Feedback? Please reach out to the PyTorch DevX Team

Advanced Debugging
Check the merge workflow status here

cyyever pushed a commit to cyyever/pytorch_private that referenced this pull request Mar 23, 2023
Fail fast when dynamo attempts to add unspecialized int/float as additional graph inputs (#96786)
Pull Request resolved: pytorch/pytorch#96786
Approved by: https://github.com/tugsbayasgalan, https://github.com/ezyang
cyyever pushed a commit to cyyever/pytorch_private that referenced this pull request Mar 27, 2023
Fail fast when dynamo attempts to add unspecialized int/float as additional graph inputs (#96786)
Pull Request resolved: pytorch/pytorch#96786
Approved by: https://github.com/tugsbayasgalan, https://github.com/ezyang