
[quant] Add quantized Sigmoid module #45883

Closed
wants to merge 3 commits into gh/jerryzh168/454/base from gh/jerryzh168/454/head

Conversation

jerryzh168 (Contributor) commented Oct 6, 2020

Stack from ghstack:

Summary:

Test Plan:
python test/test_quantization.py TestStaticQuantizedModule.test_sigmoid

Reviewers:

Subscribers:

Tasks:

Tags:

Differential Revision: [D24129116](https://our.internmc.facebook.com/intern/diff/D24129116)
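For context, a minimal sketch of what the added quantized Sigmoid module could look like in torch/nn/quantized/modules/activation.py, based on the diff excerpts reviewed below. The `forward` body and the `torch.ops.quantized.sigmoid` call are assumptions for illustration; only the class declaration, the `__init__` signature, and the `from_float` pattern appear in the excerpts in this conversation.

```python
import torch

class Sigmoid(torch.nn.Sigmoid):
    r"""This is the quantized equivalent of :class:`~torch.nn.Sigmoid`.

    Args:
        output_scale: quantization scale of the output tensor
        output_zero_point: quantization zero point of the output tensor
    """

    def __init__(self, output_scale: float, output_zero_point: int):
        super().__init__()
        self.output_scale = output_scale
        self.output_zero_point = output_zero_point

    def forward(self, input):
        # Assumed kernel: a quantized sigmoid op that takes explicit output
        # qparams; the exact op used by the merged PR may differ.
        return torch.ops.quantized.sigmoid(
            input, self.output_scale, self.output_zero_point)

    @classmethod
    def from_float(cls, mod):
        # Read the output qparams collected by the observer attached during
        # prepare(), mirroring the LeakyReLU.from_float shown in the diff.
        output_scale, output_zero_point = \
            mod.activation_post_process.calculate_qparams()
        return cls(float(output_scale), int(output_zero_point))
```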

codecov bot commented Oct 6, 2020

Codecov Report

Merging #45883 into gh/jerryzh168/454/base will increase coverage by 0.00%.
The diff coverage is 83.33%.


@@                   Coverage Diff                   @@
##           gh/jerryzh168/454/base   #45883   +/-   ##
=======================================================
  Coverage                   68.33%   68.33%           
=======================================================
  Files                         410      410           
  Lines                       52997    53008   +11     
=======================================================
+ Hits                        36213    36222    +9     
- Misses                      16784    16786    +2     
Impacted Files                              Coverage Δ
torch/nn/quantized/modules/activation.py    90.54% <81.81%> (-1.53%) ⬇️
torch/nn/quantized/modules/__init__.py      97.22% <100.00%> (ø)

Continue to review full report at Codecov.

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update c0b9d44...f60b542.

zero_point: quantization zero point of the output tensor
"""

def __init__(self, output_scale: float, output_zero_point: int):
Reviewer comment (Contributor):
Can we have default args? It is safe to assume them for this function.

jerryzh168 (Contributor Author):
Not sure; it's unlikely that a user is going to construct a quantized model from quantized modules from scratch anyway, so what is the benefit of having default args?
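If defaults were added, one plausible choice (purely an illustration of the suggestion, not what this PR does) would be the fixed qparams commonly used for a quint8 sigmoid output, whose range is (0, 1):

```python
# Hypothetical signature with defaults; the PR as merged keeps both
# arguments required. scale = 1/256 with zero_point = 0 covers the
# (0, 1) output range of sigmoid for a quint8 tensor.
def __init__(self, output_scale: float = 1.0 / 256, output_zero_point: int = 0):
    super().__init__()
    self.output_scale = output_scale
    self.output_zero_point = output_zero_point
```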

@facebook-github-bot
Contributor

This pull request has been merged in 83d2c9a.

@@ -149,3 +149,24 @@ def _get_name(self):
def from_float(cls, mod):
scale, zero_point = mod.activation_post_process.calculate_qparams()
return cls(float(scale), int(zero_point), mod.negative_slope, mod.inplace)

class Sigmoid(torch.nn.Sigmoid):
r"""This is the quantized equivalent of :class:`~torch.nn.LeakyReLU`.
Reviewer comment (Contributor):
The docs are off here; can we refer to Sigmoid instead of LeakyReLU?

jerryzh168 (Contributor Author):
Oh sorry, will fix in the next PR.
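For reference, a small usage sketch of the new module once it is exported from torch.nn.quantized (the diff touches torch/nn/quantized/modules/__init__.py). The input scale and zero point below are arbitrary illustrative values.

```python
import torch
import torch.nn.quantized as nnq

# Quantize an example float input, then run it through the quantized module.
x = torch.randn(4)
qx = torch.quantize_per_tensor(x, scale=0.05, zero_point=64, dtype=torch.quint8)

m = nnq.Sigmoid(output_scale=1.0 / 256, output_zero_point=0)
qy = m(qx)            # quantized output with the requested qparams
y = qy.dequantize()   # approximately torch.sigmoid(x)
```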

@facebook-github-bot deleted the gh/jerryzh168/454/head branch on October 11, 2020 at 14:18.