
Add type annotations to torch.onnx.* modules #45258

Closed
wants to merge 8 commits into from

Conversation

guilhermeleobas
Collaborator

Fixes #45215

Still need to resolve a few mypy issues before a review. In particular, there is an error that I don't know how to solve:

torch/onnx/utils.py:437: error: Name 'is_originally_training' is not defined  [name-defined]
        if training is None or training == TrainingMode.EVAL or (training == TrainingMode.PRESERVE and not is_originally_training):

is_originally_training is used but never defined or imported in torch/onnx/utils.py.
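For context, annotations of roughly this shape are what a PR like this adds (an illustrative sketch with placeholder types, not the actual diff; the parameter list follows the export signature visible in the hunks below, and the TrainingMode import assumes the torch.onnx public API):

from typing import Any, Optional, Tuple, Union
import io

from torch.onnx import TrainingMode  # assumed public location of TrainingMode

def export(model: Any,
           args: Union[Tuple[Any, ...], Any],
           f: Union[str, io.BytesIO],
           export_params: bool = True,
           verbose: bool = False,
           training: Optional[TrainingMode] = None) -> None:
    # Placeholder body; the real function delegates to torch.onnx.utils.
    ...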

guilhermeleobas added the module: typing label Sep 24, 2020
guilhermeleobas self-assigned this Sep 24, 2020
@dr-ci

dr-ci bot commented Sep 24, 2020

💊 CI failures summary and remediations

As of commit 1b52305 (more details on the Dr. CI page):


  • 2/2 failures possibly* introduced in this PR
    • 2/2 non-CircleCI failure(s)

Extra GitHub checks: 1 failed


codecov.io: 1 failed



@@ -76,7 +77,7 @@ def export(model, args, f, export_params=True, verbose=False, training=None,
     if aten or export_raw_ir:
         assert operator_export_type is None
         assert aten ^ export_raw_ir
-        operator_export_type = OperatorExportTypes.ATEN if aten else OperatorExportTypes.RAW
+        operator_export_type = OperatorExportTypes.ONNX_ATEN if aten else OperatorExportTypes.RAW
Collaborator Author

There is no mention of OperatorExportTypes.ATEN in the source code, so I think this value was supposed to be OperatorExportTypes.ONNX_ATEN.
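A quick way to double-check which members the enum actually exposes (a sketch; the exact member set depends on the installed PyTorch version):

import torch

# Print the public members of OperatorExportTypes. Around the time of this PR
# the set included ONNX, ONNX_ATEN, ONNX_ATEN_FALLBACK and RAW -- but no plain ATEN.
members = [name for name in dir(torch.onnx.OperatorExportTypes) if not name.startswith("_")]
print(members)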

@@ -466,7 +468,7 @@ def export_to_pretty_string(model, args, f, export_params=True, verbose=False, t
     if aten or export_raw_ir:
         assert operator_export_type is None
         assert aten ^ export_raw_ir
-        operator_export_type = OperatorExportTypes.ATEN if aten else OperatorExportTypes.RAW
+        operator_export_type = OperatorExportTypes.ONNX_ATEN if aten else OperatorExportTypes.RAW
Collaborator Author

Same here

@@ -432,7 +434,7 @@ def _model_to_graph(model, args, verbose=False,
param_names = input_and_param_names[len(input_and_param_names) - len(params):]
params_dict = dict(zip(param_names, params))

if training is None or training == TrainingMode.EVAL or (training == TrainingMode.PRESERVE and not is_originally_training):
Collaborator Author

is_originally_training is not defined anywhere, so I guess this part of the conditional is never executed.
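For context, Python's short-circuit evaluation explains why the undefined name rarely bites at runtime: 'or' stops at the first truthy operand, and the right-hand side of 'and' is only evaluated when the PRESERVE comparison is true. A minimal sketch with hypothetical stand-ins (not the torch.onnx code):

from enum import Enum

class TrainingMode(Enum):
    EVAL = 0
    PRESERVE = 1

def needs_eval_pass(training):
    # 'is_originally_training' mirrors the undefined name in utils.py; it is
    # only looked up when training == TrainingMode.PRESERVE, so the earlier
    # operands short-circuit before Python ever reaches it.
    return training is None or training == TrainingMode.EVAL or (
        training == TrainingMode.PRESERVE and not is_originally_training  # noqa: F821
    )

print(needs_eval_pass(None))               # True  -- undefined name never evaluated
print(needs_eval_pass(TrainingMode.EVAL))  # True  -- undefined name never evaluated
# needs_eval_pass(TrainingMode.PRESERVE) would raise NameError at this point.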

guilhermeleobas marked this pull request as ready for review September 29, 2020 19:27
zhangguanheng66 added the module: onnx and triaged labels Sep 29, 2020
rgommers self-requested a review October 2, 2020 17:41
Collaborator

rgommers left a comment

LGTM, thanks @guilhermeleobas

@rgommers
Collaborator

rgommers commented Oct 2, 2020

This has a merge conflict; can you rebase, @guilhermeleobas?

@guilhermeleobas
Collaborator Author

guilhermeleobas commented Oct 7, 2020

Failures are real:

torch/onnx/utils.py:203: error: Module has no attribute "_jit_pass_onnx_set_dynamic_input_shape"  [attr-defined]
torch/onnx/utils.py:225: error: Module has no attribute "_jit_pass_onnx_graph_shape_type_inference"; maybe "_jit_pass_onnx_node_shape_type_inference"?  [attr-defined]
torch/onnx/utils.py:429: error: Module has no attribute "_jit_pass_onnx_assign_output_shape"  [attr-defined]
torch/onnx/utils.py:435: error: Module has no attribute "_jit_pass_onnx_assign_output_shape"  [attr-defined]
torch/onnx/symbolic_opset11.py:806: error: Module has no attribute "_jit_pass_fixup_onnx_loop_node_inputs"; maybe "_jit_pass_fixup_onnx_controlflow_node"?  [attr-defined]

Edit: Fixed! CI should be all green now
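For readers hitting similar [attr-defined] errors: one common way to resolve them is to declare the missing torch._C bindings in the stub file so mypy can see them. A hypothetical sketch with placeholder signatures (the actual fix applied in this PR may differ):

# torch/_C/__init__.pyi -- placeholder signatures, for illustration only
from typing import Any

def _jit_pass_onnx_set_dynamic_input_shape(*args: Any, **kwargs: Any) -> Any: ...
def _jit_pass_onnx_graph_shape_type_inference(*args: Any, **kwargs: Any) -> Any: ...
def _jit_pass_onnx_assign_output_shape(*args: Any, **kwargs: Any) -> Any: ...

Alternatively, an individual call site can be silenced with a targeted "# type: ignore[attr-defined]" comment.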

@rgommers
Collaborator

@guilhermeleobas there are still CI failures here that are real:

Traceback (most recent call last):
  File "test_type_hints.py", line 217, in test_run_mypy
    self.fail(f"mypy failed: {stdout} {stderr}")
AssertionError: mypy failed: torch/_C/__init__.pyi:375: error: Name 'Value' already defined on line 368  [no-redef]
torch/_C/__init__.pyi:379: error: Name 'Block' already defined on line 371  [no-redef]
torch/_C/__init__.pyi:383: error: Name 'Node' already defined on line 365  [no-redef]
Found 3 errors in 1 file (checked 1099 source files)

Can you update the PR?
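For context, mypy's [no-redef] error fires when the same name is bound twice in one scope. In a stub the usual fix is to drop the duplicate declaration, or to use typing.overload when several signatures are genuinely needed. A minimal illustration with hypothetical stub content:

from typing import overload

class Value: ...
class Value: ...   # error: Name 'Value' already defined  [no-redef]

# When multiple signatures are intended, overload avoids redefining the name:
@overload
def graph_input(index: int) -> Value: ...
@overload
def graph_input(name: str) -> Value: ...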

@guilhermeleobas
Collaborator Author

@rgommers those issues should now be fixed.

The current mypy failures are not related to any changes introduced in this PR, and they are fixed (ignored) in a different PR:

~/git/pytorch:torch-onnx (pytorch-cuda-dev) guilhermel@pytorch-dev $ mypy
torch/distributions/kl.py:107: error: Value of type variable "_LT" of "min" cannot be "_Match"  [type-var]
torch/distributions/kl.py:108: error: Value of type variable "_LT" of "min" cannot be "_Match"  [type-var]
torch/utils/cpp_extension.py:1560: error: Argument 2 to "load_module" has incompatible type "IO[Any]"; expected "Optional[_FileLike]"  [arg-type]
torch/testing/_internal/common_utils.py:1040: error: Value of type variable "_LT" of "max" cannot be "Optional[float]"  [type-var]

@rgommers
Collaborator

The current mypy failures are not related to any changes introduced in this PR, and they are fixed (ignored) in a different PR

Yes. It would be nice to get those fixes landed first and then re-run the CI on this PR, because it would be good to see the onnx_ort_test1/2 jobs green.

@guilhermeleobas
Collaborator Author

guilhermeleobas commented Oct 23, 2020

The failure is not related to this PR:

...
Oct 22 20:00:18 + cleanup
Oct 22 20:00:18 + retcode=0
Oct 22 20:00:18 + set +x
Oct 22 20:00:18 Error response from daemon: No such exec instance: 9d9da493a69d31cab48a6c46379347093d2a2fcf77e3a9a486be69535b1dac3d

Exited with code exit status 1

@rgommers
Collaborator

LGTM now, thanks @guilhermeleobas.

@malfet would you be able to land this PR?

Contributor

facebook-github-bot left a comment

@ezyang has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.

@facebook-github-bot
Contributor

@ezyang merged this pull request in 40a2dd7.

@mruberry
Collaborator

mruberry commented Dec 2, 2020

Unlanding. This appears to have broken multiple builds (see the GitHub CI, too). Sample snippet:

test_run_mypy - TestTypeHints
test_type_hints.py
Traceback (most recent call last):
  File "test_type_hints.py", line 217, in test_run_mypy
    self.fail(f"mypy failed: {stdout} {stderr}")
AssertionError: mypy failed: torch/_C/__init__.pyi:281: error: Name '_jit_pass_constant_propagation' already defined on line 206  [no-redef]
torch/onnx/symbolic_helper.py:368: error: Name 'unbind' already defined (possibly by an import)  [no-redef]
Found 2 errors in 2 files (checked 1157 source files)
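The symbolic_helper.py failure is the import-shadowing flavor of the same check: a name is first bound by an import and then redefined in the module. A minimal, self-contained illustration (using a standard-library import in place of the real 'unbind'):

from operator import add          # first binding of the name 'add'

def add(x, y):                    # error: Name 'add' already defined (possibly by an import)  [no-redef]
    return x + y

The usual fixes are to drop the import, rename one of the two, or alias the import (for example "from operator import add as _add").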

mruberry reopened this Dec 2, 2020
@guilhermeleobas
Collaborator Author

@rgommers done!

@codecov

codecov bot commented Dec 2, 2020

Codecov Report

Merging #45258 (1b52305) into master (9c6979a) will increase coverage by 0.00%.
The diff coverage is 72.41%.

@@           Coverage Diff           @@
##           master   #45258   +/-   ##
=======================================
  Coverage   80.83%   80.83%           
=======================================
  Files        1859     1859           
  Lines      200633   200639    +6     
=======================================
+ Hits       162177   162182    +5     
- Misses      38456    38457    +1     

@ezyang
Contributor

ezyang commented Dec 2, 2020

@guilhermeleobas could you please open a new PR? I can't import this one anymore.

@guilhermeleobas
Collaborator Author

@ezyang, sure!

facebook-github-bot pushed a commit that referenced this pull request Dec 7, 2020
Summary:
Fixes #45215

This is a follow up PR of #45258

Pull Request resolved: #48782

Reviewed By: heitorschueroff

Differential Revision: D25304229

Pulled By: ezyang

fbshipit-source-id: b01b21ddbf86f908ca08173e68b81fb25851bc81
guilhermeleobas mentioned this pull request Dec 8, 2020
facebook-github-bot pushed a commit that referenced this pull request Dec 9, 2020
Summary:
Fixes #45215

This is a follow up PR of #45258 and #48782

Pull Request resolved: #48980

Reviewed By: zhangguanheng66

Differential Revision: D25399823

Pulled By: ezyang

fbshipit-source-id: 798055f4abbbffecdfab0325884193c81addecec
Labels
cla signed, Merged, module: onnx, module: typing, open source, triaged
Development

Successfully merging this pull request may close these issues.

Enable torch.onnx.* typechecks during CI
8 participants