[quantization] Fix BC for quantized linear #30481
Conversation
[ghstack-poisoned]
ghstack-source-id: edbefa8a5bb27e7e79da5c56e04c7ca7d233e441 Pull Request resolved: #30481
LG
@@ -136,6 +137,14 @@ def _load_from_state_dict(self, state_dict, prefix, local_metadata, strict,
         self.zero_point = int(state_dict[prefix + 'zero_point'])
         state_dict.pop(prefix + 'zero_point')

+        version = local_metadata.get('version', None)
+        if version is None or version == 1:
For my understanding: do we need to check for version == 1? We don't have this field currently.
I think it's better to be explicit, for example in the forward-compatibility case. It would be an error to enter this branch if someone is loading a newer model, and that would lead to some pretty weird errors.
Newer models will have version >= 2 right?
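To illustrate the pattern under discussion, here is a minimal, self-contained sketch of version-gated `_load_from_state_dict` loading. It is not the actual PyTorch implementation: the class name `Linearish` and the key `_packed_params` are illustrative placeholders, and the real PyTorch hook takes more arguments. The point is that checkpoints with no `version` in `local_metadata` (or `version == 1`) take the legacy deserialization path, while `version >= 2` takes the new one:

```python
class Linearish:
    """Toy stand-in for a quantized module with a versioned state_dict format."""

    _version = 2  # bumped when the serialized format changed

    def __init__(self):
        self.scale = 1.0
        self.zero_point = 0
        self.weight = []

    def _load_from_state_dict(self, state_dict, prefix, local_metadata):
        # Shared fields, present in every format version.
        self.scale = float(state_dict.pop(prefix + 'scale'))
        self.zero_point = int(state_dict.pop(prefix + 'zero_point'))

        version = local_metadata.get('version', None)
        if version is None or version == 1:
            # Legacy checkpoints (pre-versioning, or explicitly v1)
            # stored the weight under its plain name.
            self.weight = list(state_dict.pop(prefix + 'weight'))
        else:
            # version >= 2: weight lives under a new key.
            self.weight = list(state_dict.pop(prefix + '_packed_params'))


# A checkpoint saved before versioning existed: no 'version' key at all,
# so the legacy branch is taken.
m = Linearish()
m._load_from_state_dict(
    {'fc.scale': 0.5, 'fc.zero_point': 128, 'fc.weight': [1.0, 2.0]},
    prefix='fc.', local_metadata={})
```

The explicit `version is None or version == 1` check (rather than a bare `else`) makes the legacy branch self-documenting and keeps a `version >= 2` checkpoint from silently falling into old-format handling.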
Summary: Pull Request resolved: pytorch#30481
Test Plan: Imported from OSS
Differential Revision: D18714602
Pulled By: jamesr66a
fbshipit-source-id: d51206c22cf2446e98053446789c6324c0481321
Stack from ghstack:
Differential Revision: D18714602