
[quant][graphmode][fx][fix] Fix observer insert logic for ops that inherit quantization parameters from input #45473

Closed
jerryzh168 wants to merge 9 commits

Conversation

jerryzh168 (Contributor) commented Sep 29, 2020

Stack from ghstack:

Summary:
Currently we don't insert an observer for the output of these ops, but we actually need to reuse the input's observer for the output to be consistent with the transformation we are doing.

Differential Revision: D23981312
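
A minimal sketch of the intent, using standard torch.quantization APIs (the names here are illustrative, not this PR's actual implementation): an op such as average pooling inherits its quantization parameters from its input, so its output should be run through the same observer instance as the input, rather than through a freshly created one.

import torch
import torch.nn.functional as F
from torch.quantization import MinMaxObserver

# One observer attached to the op's input.
obs = MinMaxObserver()

x = torch.randn(1, 3, 8, 8)
obs(x)                                  # observe the input
y = F.avg_pool2d(x, kernel_size=2)
obs(y)                                  # reuse the *same* observer on the output

# Input and output now share one (scale, zero_point), so quantizing y is
# consistent with quantizing x, matching the transformation described above.
print(obs.calculate_qparams())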

dr-ci bot commented Sep 29, 2020

💊 CI failures summary and remediations

As of commit 133bcad (more details on the Dr. CI page):


  • 10/10 failures possibly introduced in this PR
  • 3/10 non-CircleCI failure(s)

7 failures not recognized by patterns (all CircleCI):

Job                                                        Step
pytorch_linux_xenial_cuda10_2_cudnn7_py3_gcc7_test         Run tests
pytorch_macos_10_13_py3_test                               Update Homebrew
pytorch_linux_bionic_py3_6_clang9_test                     Run tests
pytorch_linux_bionic_py3_8_gcc9_coverage_test              Run tests
pytorch_linux_xenial_py3_clang5_asan_test2                 Run tests
pytorch_linux_xenial_py3_6_gcc5_4_ge_config_simple_test    Run tests
pytorch_linux_xenial_py3_6_gcc5_4_test                     Run tests

🚧 6 fixed upstream failures:

These were probably caused by upstream breakages that were already fixed.

Please rebase on the viable/strict branch. Since your merge base is older than viable/strict, run:

git fetch https://github.com/pytorch/pytorch viable/strict
git rebase FETCH_HEAD

Check the recency history of this "viable master" tracking branch.


Extra GitHub checks: 1 failed


ci.pytorch.org: 2 failed


This comment was automatically generated by Dr. CI and has been revised 37 times.

jerryzh168 added a commit that referenced this pull request Sep 29, 2020

[quant][graphmode][fx][fix] Fix observer insert logic for ops that inherit quantization parameters from input
ghstack-source-id: d4033c47e1fd672ada0d7b4018c157e4677dc6e9
Pull Request resolved: #45473
jerryzh168 added a commit that referenced this pull request Sep 29, 2020

[quant][graphmode][fx][fix] Fix observer insert logic for ops that inherit quantization parameters from input
ghstack-source-id: eab591ce82e2fea5e6f54bb078d5a1bac0dfa8ab
Pull Request resolved: #45473
env[node.name] = observed_graph.create_node('call_module', observer_name, (load_arg(node),), {})
activation_post_process_name_map[node.name] = observer_name
Contributor:
Shouldn't this be self.activation_post_process?

Contributor Author (jerryzh168):
Oh, this is storing the names of the activation_post_process modules; we need this information when we insert observer calls, and it's not stored anywhere else.

Contributor Author (jerryzh168):
This can be a local variable, since it's only needed while we process the graph.
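
A hedged sketch of the pattern under discussion, with hypothetical helper names (needs_new_observer, new_observer_name, and inherits_qparams are illustrations, not the real pass):

from torch.fx import GraphModule

def insert_observers(model: GraphModule):
    # Local map, per the discussion above: node name -> name of the
    # activation_post_process module observing that node's output.
    activation_post_process_name_map = {}

    for node in model.graph.nodes:
        if needs_new_observer(node):                  # hypothetical predicate
            observer_name = new_observer_name(model)  # hypothetical helper
            activation_post_process_name_map[node.name] = observer_name
        elif inherits_qparams(node):                  # e.g. avg_pool2d
            # Reuse the input's observer for this node's output so the
            # output inherits the input's quantization parameters.
            input_node = node.args[0]
            input_observer = activation_post_process_name_map.get(input_node.name)
            if input_observer is not None:
                activation_post_process_name_map[node.name] = input_observer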

# use the same observer as input
observed_input = node.args[0]
if observed_input.name in activation_post_process_name_map:
    print('inserting observer of input', observed_input.target)
Contributor:
Extra print statement?

supriyar (Contributor) left a comment:
Left a couple of comments that need to be addressed.

jerryzh168 (Contributor Author):
This is not needed anymore; we plan to observe the output for all ops that inherit quantization parameters (e.g. average pool) and ops that have fixed quantization parameters (e.g. sigmoid).
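
For the "fixed quantization parameters" case mentioned above, a minimal sketch (the qparams here are the conventional choice for a [0, 1] range, shown for illustration rather than taken from this PR): sigmoid outputs lie in [0, 1], so a quint8 output can be quantized with fixed parameters instead of observed ones.

import torch

x = torch.randn(4)
y = torch.sigmoid(x)                 # values in [0, 1]

scale, zero_point = 1.0 / 256.0, 0   # fixed qparams covering [0, 1)
qy = torch.quantize_per_tensor(y, scale, zero_point, torch.quint8)
print(qy.int_repr(), qy.dequantize())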

jerryzh168 closed this Oct 5, 2020
jerryzh168 reopened this Oct 15, 2020
jerryzh168 closed this Oct 20, 2020
facebook-github-bot deleted the gh/jerryzh168/443/head branch November 20, 2020 15:18