
[quant][graphmode][fx][eagermode] Add leaky relu support in quantization workflows #45712

Closed
wants to merge 5 commits into gh/jerryzh168/450/base from gh/jerryzh168/450/head

Conversation


jerryzh168 (Contributor) commented Oct 2, 2020

Stack from ghstack:

Summary:
Eager mode will still be able to use the functional leaky relu (torch.nn.functional.leaky_relu), but it will be less accurate than the nn.LeakyReLU module, since in eager mode only the module form is observed and swapped for a quantized module during convert.
FX graph mode will support both the functional and the module forms of leaky relu.

Test Plan:

Reviewers:

Subscribers:

Tasks:

Tags:

Differential Revision: D24069961
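
To make the difference between the two workflows concrete, here is a minimal sketch of post-training static quantization of leaky relu in both modes. The model definitions, the "fbgemm" qconfig, and the calibration input are illustrative only, and the FX graph mode calls (prepare_fx/convert_fx with a qconfig_dict) are assumed to match the API available around the time of this PR.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F
from torch.quantization import (
    QuantStub, DeQuantStub, get_default_qconfig, prepare, convert,
)
from torch.quantization.quantize_fx import prepare_fx, convert_fx

# Eager mode: use the nn.LeakyReLU module so it can be swapped for a
# quantized LeakyReLU during convert (the functional form is not swapped).
class EagerModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.quant = QuantStub()
        self.leaky_relu = nn.LeakyReLU(0.01)
        self.dequant = DeQuantStub()

    def forward(self, x):
        x = self.quant(x)
        x = self.leaky_relu(x)
        return self.dequant(x)

m = EagerModel().eval()
m.qconfig = get_default_qconfig("fbgemm")
prepare(m, inplace=True)
m(torch.randn(1, 3, 4, 4))   # calibration pass
convert(m, inplace=True)     # leaky_relu is now a quantized module

# FX graph mode: both the module and the functional form are handled,
# so F.leaky_relu can be quantized directly.
class FxModel(nn.Module):
    def forward(self, x):
        return F.leaky_relu(x, 0.01)

fx_model = FxModel().eval()
qconfig_dict = {"": get_default_qconfig("fbgemm")}
prepared = prepare_fx(fx_model, qconfig_dict)
prepared(torch.randn(1, 3, 4, 4))  # calibration pass
quantized = convert_fx(prepared)
```

The practical takeaway from the summary above: in eager mode only module instances participate in the module swap, so nn.LeakyReLU is preferred there, while FX graph mode can also pattern-match the functional call.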


codecov bot commented Oct 2, 2020

Codecov Report

Merging #45712 into gh/jerryzh168/450/base will increase coverage by 0.00%.
The diff coverage is 100.00%.


@@                   Coverage Diff                   @@
##           gh/jerryzh168/450/base   #45712   +/-   ##
=======================================================
  Coverage                   68.32%   68.33%           
=======================================================
  Files                         410      410           
  Lines                       52994    52992    -2     
=======================================================
  Hits                        36210    36210           
+ Misses                      16784    16782    -2     
Impacted Files | Coverage Δ
--- | ---
torch/quantization/quantization_mappings.py | 85.10% <ø> (ø)
torch/quantization/fx/quantization_patterns.py | 89.30% <100.00%> (-0.06%) ⬇️
torch/utils/benchmark/utils/common.py | 96.40% <0.00%> (-0.72%) ⬇️
torch/testing/_internal/expecttest.py | 78.57% <0.00%> (+1.02%) ⬆️
torch/nn/quantized/modules/activation.py | 92.06% <0.00%> (+3.17%) ⬆️

Continue to review full report at Codecov.

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update f959fbd...37541fd.

facebook-github-bot (Contributor)

This pull request has been merged in 0da6730.

facebook-github-bot deleted the gh/jerryzh168/450/head branch on October 10, 2020, 14:16