Conversation
@raghuramank100 (Contributor) commented May 21, 2020

Stack from ghstack:

Differential Revision: [D21628596](https://our.internmc.facebook.com/intern/diff/D21628596/)

[ghstack-poisoned]
raghuramank100 pushed a commit that referenced this pull request May 21, 2020
Differential Revision: [D21628596](https://our.internmc.facebook.com/intern/diff/D21628596/)

ghstack-source-id: 104494648
Pull Request resolved: #38851
@dr-ci (bot) commented May 21, 2020

💊 CI failures summary and remediations

As of commit 7f159ae (more details on the Dr. CI page):


  • 1/1 failures possibly* introduced in this PR
    • 1/1 non-CircleCI failure(s)

ci.pytorch.org: 1 failed



raghuramank100 pushed a commit that referenced this pull request May 22, 2020
Pull Request resolved: #38851


ghstack-source-id: 104570469

Differential Revision: [D21628596](https://our.internmc.facebook.com/intern/diff/D21628596/)
# Check that module matches the numerics of the op and ensure that module can be
# instantiated for all engines and dtypes

for qengine in supported_qengines:
Contributor:
You can use the @override_qengines decorator instead of these two lines.

@supriyar (Contributor) left a comment:

Should we also add an op-level test for quantized LSTM that covers different input shapes and numbers of layers? I don't see one in test_quantized_op.py.

Current LSTM tests do not check for scriptability, and we do not test that the module is consistent with the LSTM op. Modified the tests and refactored them to fit into the new testing structure.
Differential Revision: [D21628596](https://our.internmc.facebook.com/intern/diff/D21628596/)

[ghstack-poisoned]
raghuramank100 pushed a commit that referenced this pull request May 29, 2020
Pull Request resolved: #38851


ghstack-source-id: 104924757

Differential Revision: [D21628596](https://our.internmc.facebook.com/intern/diff/D21628596/)
raghuramank100 pushed a commit that referenced this pull request Jun 10, 2020
Pull Request resolved: #38851


ghstack-source-id: 105587461

Differential Revision: [D21628596](https://our.internmc.facebook.com/intern/diff/D21628596/)
@raghuramank100 raghuramank100 requested a review from apaszke as a code owner June 15, 2020 01:43
raghuramank100 pushed a commit that referenced this pull request Jun 15, 2020
Create three tests for LSTMs:
1. test_qlstm: checks the numerics of the quantized LSTM operator.
2. test_lstm_api: checks the LSTM module and compares it with the quantized LSTM op.
3. test_quantized_rnn: checks the dynamic quantization workflow, scriptability and serialization of quantized LSTM.
Pull Request resolved: #38851


ghstack-source-id: 105864340

Differential Revision: [D21628596](https://our.internmc.facebook.com/intern/diff/D21628596/)
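The kind of numerics check test_qlstm performs can be sketched without torch: per-tensor affine quantization followed by dequantization should reproduce the input to within half a quantization step. The helper names and the quint8 constants below are illustrative, not PyTorch API:

```python
# Minimal per-tensor affine quantize/dequantize round trip (quint8-style
# range [0, 255]) and a bound on the round-trip error of half a step.

def qparams(xs, qmin=0, qmax=255):
    lo, hi = min(min(xs), 0.0), max(max(xs), 0.0)  # range must include zero
    scale = (hi - lo) / (qmax - qmin) or 1.0
    zero_point = int(round(qmin - lo / scale))
    return scale, zero_point

def quantize(xs, scale, zp, qmin=0, qmax=255):
    return [min(qmax, max(qmin, int(round(x / scale)) + zp)) for x in xs]

def dequantize(qs, scale, zp):
    return [(q - zp) * scale for q in qs]

xs = [-1.0, -0.25, 0.0, 0.5, 1.0]
s, z = qparams(xs)
roundtrip = dequantize(quantize(xs, s, z), s, z)
max_err = max(abs(a - b) for a, b in zip(xs, roundtrip))
assert max_err <= s / 2 + 1e-9  # error bounded by half a quantization step
```

The real test applies the same idea at the operator level, comparing the quantized LSTM kernel's output against a float reference within a tolerance derived from the quantization scale.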
@z-a-f left a comment:

lgtm

if dtype == torch.qint8:
packed_ih = torch.ops.quantized.linear_prepack(Wq1, b1)
packed_hh = torch.ops.quantized.linear_prepack(Wq2, b2)
cell_params = torch.ops.quantized.make_quantized_cell_params_dynamic(packed_ih, packed_hh, b1, b2)
Contributor:
We should set reduce_range=True here.


ref_out, ref_hid = ref(x, hiddens)

for qengine in supported_qengines:
Contributor:
Can we use @override_qengines decorator here?


 def test_quantized_rnn(self):
-    r"""Test execution and serialization for dynamic quantized lstm modules on int8 and fp16
+    r"""Test dynamic quantization, scriptability and serialization for dynamic quantized lstm modules on int8 and fp16
Contributor:
There is a test below test_per_channel_lstm_quantize for per-channel quantization. Should we combine the two here and go over both qconfigs?

Contributor (author):
Yes, I will combine the tests

def _get_rnn_inputs(seq_len, num_batches, input_size, hidden_size, num_directions):
# For Input (seq_len, batch, input_size)
X = torch.randn(seq_len, num_batches, input_size)
# TODO: Change to reduce_range=True once support is enabled
Contributor:
This is supported now

Contributor (author):
I have it enabled in the next PR in the stack, will move the change to this PR itself.

H = torch.zeros(num_directions, num_batches, hidden_size)
C = torch.zeros(num_directions, num_batches, hidden_size)

s, z = _calculate_dynamic_qparams(H, torch.quint8, reduce_range=False)
Contributor:
Change reduce_range to True here.
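What reduce_range changes can be illustrated without torch: with reduce_range=True the quint8 range shrinks from [0, 255] to [0, 127], trading one bit of precision for headroom that avoids potential accumulator overflow in the int8 kernels of some backends. The helper below is a plain-Python sketch, not the actual `_calculate_dynamic_qparams`:

```python
# Illustrative dynamic qparams computation showing the effect of reduce_range:
# the quantized range halves, so the scale (step size) roughly doubles.

def dynamic_qparams(xs, reduce_range=False):
    qmin, qmax = (0, 127) if reduce_range else (0, 255)
    lo, hi = min(min(xs), 0.0), max(max(xs), 0.0)  # range must include zero
    scale = (hi - lo) / (qmax - qmin) or 1.0
    zero_point = int(round(qmin - lo / scale))
    return scale, zero_point

xs = [-2.0, 0.0, 2.0]
s_full, z_full = dynamic_qparams(xs, reduce_range=False)  # step = 4/255
s_red, z_red = dynamic_qparams(xs, reduce_range=True)     # step = 4/127
```

Coarser steps mean slightly larger round-trip error, which is why tests that flip this flag usually also loosen their numerical tolerances.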


s, z = _calculate_dynamic_qparams(H, torch.quint8, reduce_range=False)
Hq = torch.quantize_per_tensor(H, s, z, torch.quint8)
s, z = _calculate_dynamic_qparams(C, torch.quint8, reduce_range=False)
Contributor:
Same here

Contributor (author):
Thanks, have made all the changes requested.

@supriyar (Contributor) left a comment:
Left a few comments that I think should be addressed before submitting.

@z-a-f left a comment:
Testing the differential approval

@facebook-github-bot (Contributor) commented:
This pull request has been merged in 655f1ea.

xwang233 pushed a commit to xwang233/pytorch that referenced this pull request Jun 20, 2020
Summary:
Create three tests for LSTMs:
1. test_qlstm: checks the numerics of the quantized LSTM operator.
2. test_lstm_api: checks the LSTM module and compares it with the quantized LSTM op.
3. test_quantized_rnn: checks the dynamic quantization workflow, scriptability and serialization of quantized LSTM.
Pull Request resolved: pytorch#38851

ghstack-source-id: 105945574

(Note: this ignores all push blocking failures!)

Test Plan:
buck test caffe2/test:quantization -- 'test_lstm_api \(quantization\.test_quantized_module\.TestDynamicQuantizedModule\)' --print-passing-details

buck test caffe2/test:quantization -- 'test_quantized_rnn \(quantization\.test_quantize\.TestPostTrainingDynamic\)'

buck test caffe2/test:quantization -- 'test_qlstm \(quantization\.test_quantized_op\.TestDynamicQuantizedRNNOp\)' --print-passing-details

Differential Revision: D21628596

fbshipit-source-id: 4aeda899f2e5f14bfbe3d82096cb4ce89c725fa1