Can a quantized model trained with PyTorch QAT (quantization-aware training) be converted to an ONNX model?
cc @jerryzh168 @jianyuh @raghuramank100 @jamesr66a @vkuzo
Hi Xinhua, I believe #42835 adds support for this. See the PR description for an example of how to do it. You may also find this thread useful: https://discuss.pytorch.org/t/onnx-export-of-quantized-model/76884
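For anyone landing here later, a minimal sketch of the eager-mode QAT → convert → ONNX pipeline is below. The `SmallNet` module, shapes, and opset version are illustrative assumptions, not from the PR; whether `torch.onnx.export` succeeds on the converted (quantized) model depends on your PyTorch and opset versions, which is why the export step is guarded.

```python
import torch

class SmallNet(torch.nn.Module):
    """Tiny example model with quant/dequant stubs for eager-mode QAT."""
    def __init__(self):
        super().__init__()
        self.quant = torch.quantization.QuantStub()
        self.conv = torch.nn.Conv2d(3, 8, 3)
        self.relu = torch.nn.ReLU()
        self.dequant = torch.quantization.DeQuantStub()

    def forward(self, x):
        x = self.quant(x)
        x = self.relu(self.conv(x))
        return self.dequant(x)

model = SmallNet().train()
model.qconfig = torch.quantization.get_default_qat_qconfig("fbgemm")
torch.quantization.prepare_qat(model, inplace=True)

# Stand-in for the real QAT training loop: a few forward passes so the
# fake-quant observers record activation ranges.
for _ in range(3):
    model(torch.randn(1, 3, 32, 32))

# Fold observers/fake-quant into real quantized ops.
model.eval()
quantized = torch.quantization.convert(model, inplace=False)

dummy = torch.randn(1, 3, 32, 32)
out = quantized(dummy)

# Attempt the ONNX export; support for quantized ops varies by
# PyTorch/opset version (see #42835 and the linked forum thread).
try:
    torch.onnx.export(quantized, dummy, "quantized_model.onnx", opset_version=13)
except Exception as e:
    print(f"ONNX export not supported in this configuration: {e}")
```

The key ordering is that `convert` must run before `torch.onnx.export`, so the graph being traced contains actual quantized operators rather than fake-quant observers.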
Closing due to inactivity.