Conversation

@andrewor14 (Contributor) commented Aug 11, 2022

Stack from ghstack (oldest at bottom):

Summary: Previously `torch.nn.quantizable.LSTM` was not scriptable
due to (1) the use of an asterisk (`*`) to unpack arguments, and (2) some
arguments being `Optional`, which the TorchScript compiler's `setitem` did
not understand. This commit resolves both issues, enabling LSTMs quantized
through custom modules to work with TorchScript.
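
For illustration, here is a minimal sketch of the two pitfalls and their scriptable rewrites (a hypothetical module, not the actual diff in this PR):

```python
from typing import List, Optional, Tuple

import torch


class ScriptableSketch(torch.nn.Module):
    """Hypothetical example of the rewrites; not the code changed in this PR."""

    def forward(
        self,
        x: torch.Tensor,
        hidden: Tuple[torch.Tensor, torch.Tensor],
    ) -> torch.Tensor:
        # (1) `y = torch.add(*hidden)` fails to compile because TorchScript
        #     does not support starred argument unpacking; index explicitly:
        y = torch.add(hidden[0], hidden[1])

        # (2) Assigning an Optional[Tensor] into a List[Tensor] slot fails
        #     the compiler's setitem type check; narrow the type with an
        #     explicit `is not None` refinement before the assignment:
        states: List[torch.Tensor] = [torch.zeros_like(x)]
        maybe_state: Optional[torch.Tensor] = None
        if maybe_state is not None:
            states[0] = maybe_state  # refined to Tensor in this branch
        return y + states[0]


# Both rewrites compile cleanly:
scripted = torch.jit.script(ScriptableSketch())
```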

Test Plan:
python test/test_quantization.py TestQuantizedOps.test_custom_module_lstm
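
As a hedged sketch of what the test exercises (hypothetical sizes; the authoritative check is the test above), the quantizable LSTM should now compile under TorchScript:

```python
import torch

lstm = torch.nn.quantizable.LSTM(input_size=4, hidden_size=8)
scripted = torch.jit.script(lstm)  # raised a compile error before this fix

x = torch.randn(5, 1, 4)  # (seq_len, batch, input_size)
out, (h, c) = scripted(x)
```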

Reviewers: jerryzh168, z-a-f

Subscribers: jerryzh168, z-a-f, supriyar

Tasks:
#83211
#75042

@facebook-github-bot (Contributor) commented Aug 11, 2022

🔗 Helpful links

✅ No Failures (0 Pending)

As of commit 7f5745b (more details on the Dr. CI page):

💚 💚 Looks good so far! There are no failures yet. 💚 💚


This comment was automatically generated by Dr. CI.

Please report bugs/suggestions to the (internal) Dr. CI Users group.

@andrewor14 andrewor14 requested review from z-a-f and jerryzh168 August 11, 2022 22:23
andrewor14 added a commit that referenced this pull request Aug 11, 2022
ghstack-source-id: ee4e281
Pull Request resolved: #83304
@albanD albanD removed their request for review August 12, 2022 13:24
@andrewor14 (Contributor, Author) commented:

@pytorchbot merge

@pytorchmergebot (Collaborator) commented:

@pytorchbot successfully started a merge job. Check the current status here.
The merge job was triggered without a flag. This means that your change will be merged once all checks on your PR have passed (ETA: 0-4 Hours). If this is not the intended behavior, feel free to use some of the other merge options in the wiki.
Please reach out to the PyTorch DevX Team with feedback or questions!

@github-actions (Contributor) commented:

Hey @andrewor14.
You've committed this PR, but it does not have both a 'release notes: ...' and a 'topics: ...' label. Please add one of each to the PR. The 'release notes: ...' label should represent the part of PyTorch that this PR changes (fx, autograd, distributed, etc.), and the 'topics: ...' label should represent the kind of PR it is (not user facing, new feature, bug fix, perf improvement, etc.). The list of valid labels can be found here for 'release notes: ...' and here for 'topics: ...'.
For changes that are 'topic: not user facing' there is no need for a release notes label.

@andrewor14 andrewor14 added release notes: quantization release notes category topic: improvements topic category labels Aug 12, 2022
facebook-github-bot pushed a commit that referenced this pull request Aug 12, 2022
Summary:
Previously `torch.nn.quantizable.LSTM` was not scriptable
due to (1) the use of an asterisk (`*`) to unpack arguments, and (2) some
arguments being `Optional`, which the TorchScript compiler's `setitem` did
not understand. This commit resolves both issues, enabling LSTMs quantized
through custom modules to work with TorchScript.

Pull Request resolved: #83304
Approved by: https://github.com/jerryzh168

Test Plan:
contbuild & OSS CI, see https://hud.pytorch.org/commit/pytorch/pytorch/f81b4ae55cf4d9b44641178f31fd713f65d5af2e

Test plan from GitHub:
python test/test_quantization.py TestQuantizedOps.test_custom_module_lstm

Reviewed By: atalman

Differential Revision: D38658671

Pulled By: andrewor14

fbshipit-source-id: 5091e6e1edc91f82bb26bc91b020b9997b17e07e
@facebook-github-bot facebook-github-bot deleted the gh/andrewor14/27/head branch August 16, 2022 14:19