Refactor LSTM tests #38851
Conversation
Differential Revision: [D21628596](https://our.internmc.facebook.com/intern/diff/D21628596/) [ghstack-poisoned]
💊 CI failures summary: as of commit 7f159ae, 1 ci.pytorch.org job failed (details on the Dr. CI page).
# Check that module matches the numerics of the op and ensure that module can be
# instantiated for all engines and dtypes

for qengine in supported_qengines:
You can use the @override_qengines decorator instead of these two lines.
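A minimal sketch of this suggestion, assuming the `override_qengines` decorator exported by `torch.testing._internal.common_quantized`; the test class and body here are illustrative:

```python
import torch
from torch.testing._internal.common_quantized import override_qengines
from torch.testing._internal.common_utils import TestCase

class TestDynamicQuantizedModule(TestCase):
    @override_qengines
    def test_lstm_api(self):
        # The decorator runs this body once per supported engine and sets
        # torch.backends.quantized.engine for each run, replacing the
        # explicit `for qengine in supported_qengines:` loop.
        self.assertIn(torch.backends.quantized.engine,
                      ('fbgemm', 'qnnpack', 'none'))
```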
Should we also add an op-level test for the quantized LSTM, one that covers different input shapes and numbers of layers? I don't see one in test_quantized_op.py.
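A hedged skeleton of what such an op-level test could look like, in the hypothesis-driven style used elsewhere in test_quantized_op.py; `_lstm_ref_and_quantized` is a hypothetical helper (it would run the fp32 LSTM and the quantized LSTM op on the same inputs), not an existing function:

```python
import torch
from hypothesis import given, strategies as st
from torch.testing._internal.common_utils import TestCase

class TestDynamicQuantizedRNNOp(TestCase):
    @given(seq_len=st.integers(1, 8),
           num_batches=st.integers(1, 4),
           input_size=st.integers(1, 16),
           hidden_size=st.integers(1, 16),
           num_layers=st.integers(1, 2))
    def test_qlstm(self, seq_len, num_batches, input_size,
                   hidden_size, num_layers):
        x = torch.randn(seq_len, num_batches, input_size)
        # Hypothetical helper: returns the fp32 reference output and the
        # quantized op's output for the same inputs.
        ref_out, q_out = self._lstm_ref_and_quantized(x, hidden_size, num_layers)
        self.assertTrue(torch.allclose(ref_out, q_out, atol=1e-2))
```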
Current LSTM tests do not check for scriptability. We also do not test that the module is consistent with the LSTM op. Modified the tests and refactored them to fit into the new testing structure. Differential Revision: [D21628596](https://our.internmc.facebook.com/intern/diff/D21628596/) [ghstack-poisoned]
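For context, a minimal sketch of the scriptability and serialization checks being added, assuming a dynamically quantized LSTM wrapper module (all names here are illustrative):

```python
import io
import torch
import torch.nn as nn

class LSTMModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.lstm = nn.LSTM(input_size=4, hidden_size=4)

    def forward(self, x):
        return self.lstm(x)

# Dynamically quantize the LSTM submodule.
model = torch.quantization.quantize_dynamic(LSTMModel(), {nn.LSTM}, dtype=torch.qint8)

scripted = torch.jit.script(model)   # scriptability check
buf = io.BytesIO()
torch.jit.save(scripted, buf)        # serialization round trip
buf.seek(0)
loaded = torch.jit.load(buf)
out, hidden = loaded(torch.randn(3, 2, 4))
```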
Create three tests for LSTMs:
1. test_qlstm: checks the numerics of the quantized LSTM operator.
2. test_lstm_api: checks the LSTM module and compares it with the quantized LSTM op.
3. test_quantized_rnn: checks the dynamic quantization workflow, scriptability, and serialization of the quantized LSTM.

Pull Request resolved: #38851
ghstack-source-id: 105864340
Differential Revision: [D21628596](https://our.internmc.facebook.com/intern/diff/D21628596/)
lgtm
if dtype == torch.qint8:
    packed_ih = torch.ops.quantized.linear_prepack(Wq1, b1)
    packed_hh = torch.ops.quantized.linear_prepack(Wq2, b2)
    cell_params = torch.ops.quantized.make_quantized_cell_params_dynamic(packed_ih, packed_hh, b1, b2)
We should set reduce_range to True here.
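As a sketch against the hunk above (packed_ih, packed_hh, b1, b2 come from the quoted lines), and assuming the op accepts a trailing reduce_range flag; reduce_range halves the effective activation range to avoid intermediate overflow in fbgemm's int8 kernels:

```python
# Pass reduce_range=True when building the dynamic cell params
# (assumes the op accepts this trailing boolean flag).
cell_params = torch.ops.quantized.make_quantized_cell_params_dynamic(
    packed_ih, packed_hh, b1, b2, True)  # reduce_range=True
```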
test/quantization/test_quantize.py (Outdated)

ref_out, ref_hid = ref(x, hiddens)

for qengine in supported_qengines:
Can we use the @override_qengines decorator here?
def test_quantized_rnn(self):
-    r"""Test execution and serialization for dynamic quantized lstm modules on int8 and fp16
+    r"""Test dynamic quantization, scriptability and serialization for dynamic quantized lstm modules on int8 and fp16
There is a test below, test_per_channel_lstm_quantize, for per-channel quantization. Should we combine the two here and go over both qconfigs?
Yes, I will combine the tests
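A hedged sketch of the combined test: sweep both dynamic qconfigs over the same checks. Both qconfig names exist in torch.quantization; the wrapper model and the shared checks are placeholders:

```python
import torch
import torch.nn as nn
from torch.quantization import (default_dynamic_qconfig,
                                per_channel_dynamic_qconfig,
                                quantize_dynamic)

class LSTMModel(nn.Module):  # same illustrative wrapper as in the sketch above
    def __init__(self):
        super().__init__()
        self.lstm = nn.LSTM(input_size=4, hidden_size=4)

    def forward(self, x):
        return self.lstm(x)

x = torch.randn(3, 2, 4)
for qconfig in (default_dynamic_qconfig, per_channel_dynamic_qconfig):
    # Map the LSTM type to the qconfig under test.
    qmodel = quantize_dynamic(LSTMModel(), qconfig_spec={nn.LSTM: qconfig})
    out, hidden = qmodel(x)
    # ...shared numerics, scriptability, and serialization checks go here...
```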
def _get_rnn_inputs(seq_len, num_batches, input_size, hidden_size, num_directions):
    # For Input (seq_len, batch, input_size)
    X = torch.randn(seq_len, num_batches, input_size)
    # TODO: Change to reduce_range=True once support is enabled
This is supported now
I have it enabled in the next PR in the stack, will move the change to this PR itself.
    H = torch.zeros(num_directions, num_batches, hidden_size)
    C = torch.zeros(num_directions, num_batches, hidden_size)

    s, z = _calculate_dynamic_qparams(H, torch.quint8, reduce_range=False)
Change reduce_range to True here.
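Concretely, the suggested edit to the quoted line:

```python
s, z = _calculate_dynamic_qparams(H, torch.quint8, reduce_range=True)
```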
    s, z = _calculate_dynamic_qparams(H, torch.quint8, reduce_range=False)
    Hq = torch.quantize_per_tensor(H, s, z, torch.quint8)
    s, z = _calculate_dynamic_qparams(C, torch.quint8, reduce_range=False)
Same here
Thanks, have made all the changes requested.
Left a few comments that I think should be addressed before submitting.
Testing the differential approval
Testing phabricator
Testing the phabricator
Testing phabricator
This pull request has been merged in 655f1ea.
Summary:
Create three tests for LSTMs:
1. test_qlstm: checks the numerics of the quantized LSTM operator.
2. test_lstm_api: checks the LSTM module and compares it with the quantized LSTM op.
3. test_quantized_rnn: checks the dynamic quantization workflow, scriptability, and serialization of the quantized LSTM.

Pull Request resolved: pytorch#38851
ghstack-source-id: 105945574
(Note: this ignores all push blocking failures!)

Test Plan:
buck test caffe2/test:quantization -- 'test_lstm_api \(quantization\.test_quantized_module\.TestDynamicQuantizedModule\)' --print-passing-details
buck test caffe2/test:quantization -- 'test_quantized_rnn \(quantization\.test_quantize\.TestPostTrainingDynamic\)'
buck test caffe2/test:quantization -- 'test_qlstm \(quantization\.test_quantized_op\.TestDynamicQuantizedRNNOp\)' --print-passing-details

Differential Revision: D21628596
fbshipit-source-id: 4aeda899f2e5f14bfbe3d82096cb4ce89c725fa1
Stack from ghstack:
Current LSTM tests do not check for scriptability. We also do not test that the module is consistent with the LSTM op. Modified the tests and refactored them to fit into the new testing structure.
Differential Revision: D21628596