
[quant] Add quantized::leaky_relu that takes scale/zero_point as input #45702

Closed · wants to merge 6 commits

Conversation

jerryzh168
Contributor

@jerryzh168 jerryzh168 commented Oct 1, 2020

Stack from ghstack:

Summary:
#45593

Previously, quantized leaky_relu did not require observation and simply inherited
the quantization parameters from its input, but that does not work well in QAT.
This PR adds a quantized::leaky_relu that observes the output; it will become
the default leaky_relu produced by our quantization tools (eager and graph mode).

Test Plan:

Reviewers:

Subscribers:

Tasks:

Tags:

Differential Revision: D24067681
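To see why observing the output matters, here is a minimal standalone sketch (plain Python, not the actual PyTorch kernels) of affine uint8 quantization. Leaky ReLU compresses the negative range, so reusing the input's scale/zero_point for the output quantizes small negative outputs coarsely; qparams derived from an observer that saw the actual output range do better. All numbers here are illustrative assumptions.

```python
# Affine quantization: q = round(x / scale) + zero_point, clamped to uint8.
def quantize(x, scale, zero_point):
    q = round(x / scale) + zero_point
    return max(0, min(255, q))  # clamp to the uint8 range

def dequantize(q, scale, zero_point):
    return (q - zero_point) * scale

def leaky_relu(x, negative_slope=0.01):
    return x if x >= 0 else negative_slope * x

# Input observed over roughly [-4, 4]: one scale covers the whole range.
in_scale, in_zp = 8.0 / 255, 128

x = -2.0
y = leaky_relu(x)  # -0.02: the negative outputs occupy a much narrower range

# Path 1: reuse the input qparams for the output (old behavior).
q_reuse = quantize(y, in_scale, in_zp)
err_reuse = abs(dequantize(q_reuse, in_scale, in_zp) - y)

# Path 2: qparams from a (hypothetical) observer that saw outputs in [-0.04, 4].
out_scale, out_zp = 4.04 / 255, 3
q_obs = quantize(y, out_scale, out_zp)
err_obs = abs(dequantize(q_obs, out_scale, out_zp) - y)

assert err_obs < err_reuse  # observed output qparams give a smaller round-trip error
```

The gap grows as negative_slope shrinks, since the negative outputs get squeezed into ever fewer quantization bins of the input's scale.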

@@ -113,7 +113,7 @@ Tensor& leaky_relu_out_quantized_cpu(Tensor& result, const Tensor& self,
   return result;
 }

-Tensor heaky_relu_quantized_cpu(const Tensor& self, Scalar negval) {
+Tensor leaky_relu_quantized_cpu(const Tensor& self, Scalar negval) {
Contributor
For my understanding: When is this called compared to QLeakyRelu::run?

Contributor Author

This is called by the original leaky_relu op, which works on both float and quantized tensors and does not need quantization parameters. QLeakyRelu::run is called by quantized::leaky_relu, which works only on quantized tensors and needs output quantization parameters.
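The split described above can be sketched with two plain-Python stand-ins. The function names and the (data, scale, zero_point) tuple are hypothetical stand-ins for illustration, not the real PyTorch kernels: one path keeps the input qparams, the other takes output qparams from the caller.

```python
def dequantize(q, scale, zp):
    return (q - zp) * scale

def quantize(x, scale, zp):
    return max(0, min(255, round(x / scale) + zp))

def _leaky(x, negval):
    # max(x, negval*x) equals leaky_relu for 0 < negval < 1
    return max(x, negval * x)

def leaky_relu_quantized_cpu(qtensor, negval=0.01):
    """Sketch of the aten::leaky_relu path: output REUSES the input qparams."""
    data, scale, zp = qtensor
    y = [_leaky(dequantize(q, scale, zp), negval) for q in data]
    return ([quantize(v, scale, zp) for v in y], scale, zp)

def qleaky_relu_run(qtensor, negval, out_scale, out_zp):
    """Sketch of the quantized::leaky_relu path: the caller supplies OUTPUT
    qparams (e.g. produced by an observer on the output during QAT)."""
    data, scale, zp = qtensor
    y = [_leaky(dequantize(q, scale, zp), negval) for q in data]
    return ([quantize(v, out_scale, out_zp) for v in y], out_scale, out_zp)

qx = ([0, 64, 192], 0.1, 128)  # represents [-12.8, -6.4, 6.4]
same_qparams = leaky_relu_quantized_cpu(qx)
new_qparams = qleaky_relu_run(qx, 0.01, 0.03, 5)
assert same_qparams[1:] == (0.1, 128)  # input qparams carried through
assert new_qparams[1:] == (0.03, 5)    # caller-provided output qparams
```

In the real kernels the arithmetic is done directly on the integer representation for speed; the dequantize/requantize round trip here only mirrors the numerics.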

@codecov
codecov bot commented Oct 5, 2020

Codecov Report

Merging #45702 into gh/jerryzh168/448/base will not change coverage.
The diff coverage is n/a.


@@                   Coverage Diff                   @@
##           gh/jerryzh168/448/base   #45702   +/-   ##
=======================================================
  Coverage                   68.32%   68.32%           
=======================================================
  Files                         410      410           
  Lines                       52981    52981           
=======================================================
  Hits                        36200    36200           
  Misses                      16781    16781           


jerryzh168 added a commit that referenced this pull request Oct 6, 2020
…nt as input

Summary:
Same changes as the stack for leaky_relu: #45702
@facebook-github-bot
Contributor

This pull request has been merged in d1fc155.

jerryzh168 added a commit that referenced this pull request Oct 6, 2020
…ut_zero_point as input"

Summary:
Same changes as the stack for leaky_relu: #45702

Differential Revision: [D24129113](https://our.internmc.facebook.com/intern/diff/D24129113)

jerryzh168 added a commit that referenced this pull request Oct 7, 2020
…utput_scale/output_zero_point as input"

jerryzh168 added a commit that referenced this pull request Oct 7, 2020
…ut_zero_point as input"
facebook-github-bot pushed a commit that referenced this pull request Oct 7, 2020
…nt as input (#45882)

Summary:
Pull Request resolved: #45882

Same changes as the stack for leaky_relu: #45702

Test Plan: Imported from OSS

Reviewed By: z-a-f

Differential Revision: D24129113

fbshipit-source-id: a26da33f877d3bdeea1976b69b2bd9369c2bf196
@facebook-github-bot facebook-github-bot deleted the gh/jerryzh168/448/head branch October 10, 2020 14:16