[quant] Add quantized::leaky_relu that takes scale/zero_point as input #45702
Conversation
Summary: #45593 Previously, quantized leaky_relu did not require observation and just inherited the quantization parameters from its input, but that does not work very well in QAT. This PR adds a quantized::leaky_relu that has observation for the output, and it will become the default leaky_relu that our quantization tools produce (eager/graph mode). Test Plan: Reviewers: Subscribers: Tasks: Tags: [ghstack-poisoned]
@@ -113,7 +113,7 @@ Tensor& leaky_relu_out_quantized_cpu(Tensor& result, const Tensor& self,
   return result;
 }

-Tensor heaky_relu_quantized_cpu(const Tensor& self, Scalar negval) {
+Tensor leaky_relu_quantized_cpu(const Tensor& self, Scalar negval) {
For my understanding: when is this called, compared to QLeakyRelu::run?
This is called by the original leaky_relu op, which works on both float and quantized tensors and does not need quantization parameters. QLeakyRelu::run is called by quantized::leaky_relu, which only works on quantized tensors and needs output quantization parameters.
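For illustration, here is a pure-Python sketch of the contrast described in this answer; it is not PyTorch internals, and the helper names are hypothetical. The original op computes leaky_relu and re-quantizes with the input's own scale/zero_point, while the new quantized::leaky_relu re-quantizes with separately supplied output quantization parameters:

```python
# Affine quantization convention: real = (q - zero_point) * scale.
# Hypothetical helpers for illustration, not the PyTorch implementation.

def quantize(x, scale, zero_point, qmin=0, qmax=255):
    """Quantize a float to an 8-bit unsigned value (like quint8)."""
    q = round(x / scale) + zero_point
    return max(qmin, min(qmax, q))

def dequantize(q, scale, zero_point):
    return (q - zero_point) * scale

def leaky_relu_inherit_qparams(qx, scale, zero_point, negative_slope=0.01):
    """Old path: the output reuses the *input* scale/zero_point."""
    x = dequantize(qx, scale, zero_point)
    y = x if x >= 0 else x * negative_slope
    return quantize(y, scale, zero_point), scale, zero_point

def leaky_relu_with_output_qparams(qx, in_scale, in_zp,
                                   out_scale, out_zp, negative_slope=0.01):
    """New path (quantized::leaky_relu): re-quantize with observed
    output qparams passed in explicitly."""
    x = dequantize(qx, in_scale, in_zp)
    y = x if x >= 0 else x * negative_slope
    return quantize(y, out_scale, out_zp), out_scale, out_zp
```

With an input quantized as scale=0.05, zero_point=128, the value -4.0 maps to q=48 and leaky_relu gives -0.04; the inherited-qparams path can only represent that as -0.05, while an observed (finer) output scale such as 0.001 represents it exactly.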
Codecov Report

@@           Coverage Diff            @@
##  gh/jerryzh168/448/base  #45702  +/-  ##
==========================================
  Coverage     68.32%    68.32%
==========================================
  Files           410       410
  Lines         52981     52981
==========================================
  Hits          36200     36200
  Misses        16781     16781

Continue to review full report at Codecov.
…nt as input Summary: Same changes as the stack for leaky_relu: #45702 Test Plan: Reviewers: Subscribers: Tasks: Tags: [ghstack-poisoned]
This pull request has been merged in d1fc155.
…ut_zero_point as input" Summary: Same changes as the stack for leaky_relu: #45702 Test Plan: Reviewers: Subscribers: Tasks: Tags: Differential Revision: [D24129113](https://our.internmc.facebook.com/intern/diff/D24129113) [ghstack-poisoned]
Stack from ghstack:
Summary:
#45593
Previously, quantized leaky_relu did not require observation and just inherited
the quantization parameters from its input, but that does not work very well in QAT.
This PR adds a quantized::leaky_relu that has observation for the output, and it will
become the default leaky_relu that our quantization tools produce (eager/graph mode).
Test Plan:
Reviewers:
Subscribers:
Tasks:
Tags:
Differential Revision: D24067681
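As a worked illustration of why inheriting the input's quantization parameters is lossy for leaky_relu (a sketch with simple min/max observer math; the names are illustrative, not the PyTorch API): with a small negative slope, the output's negative range shrinks by roughly 1/negative_slope, so observing the output yields a noticeably finer scale than reusing the input's:

```python
# Why output observation helps: compare the scale an 8-bit min/max
# observer would pick on the input vs. on the leaky_relu output.

def minmax_qparams(lo, hi, qmin=0, qmax=255):
    """Affine scale/zero_point covering [lo, hi] with an 8-bit range.
    The representable range must include zero for affine quantization."""
    lo, hi = min(lo, 0.0), max(hi, 0.0)
    scale = (hi - lo) / (qmax - qmin)
    zero_point = qmin - round(lo / scale)
    return scale, zero_point

negative_slope = 0.01
in_lo, in_hi = -10.0, 10.0                       # observed input range
out_lo, out_hi = in_lo * negative_slope, in_hi   # leaky_relu output range

in_scale, _ = minmax_qparams(in_lo, in_hi)       # 20.0 / 255, coarse
out_scale, _ = minmax_qparams(out_lo, out_hi)    # 10.1 / 255, ~2x finer
```

Inheriting the input qparams wastes nearly half the 8-bit range on negative values the output can no longer produce, which is the motivation for observing the output during QAT.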