
[ONNX] Enable onnx shape inference in export by default #46629

Closed

wants to merge 22 commits into master from onnx_inf_from_output

Conversation

BowenBao (Collaborator) commented Oct 21, 2020

  • Enable ONNX shape inference in export by default (a minimal export sketch follows this list).
  • ONNX may set the inferred shape on the graph output instead of in value_infos; check both to be sure.
  • Small fix in symbol_map to avoid overlooking duplicate symbols.
  • Fix scalar_type_analysis to be consistent with PyTorch's scalar type promotion logic.
  • Correctly handle a None dim_param from an ONNX inferred shape.
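As an illustration only (not part of this PR's diff), a minimal sketch of a user-side export after this change; the model, output path, and dynamic_axes below are hypothetical. With this PR, ONNX shape inference runs during export by default, so the caller does not need to pass any extra flag:

```python
import torch

class TinyModel(torch.nn.Module):
    def forward(self, x):
        return x.relu() + 1

model = TinyModel()
dummy_input = torch.randn(2, 3)

# With this change, ONNX shape inference runs during export by default,
# so the exported graph carries inferred shape/type information for its values.
torch.onnx.export(
    model,
    dummy_input,
    "tiny_model.onnx",                    # hypothetical output path
    input_names=["x"],
    output_names=["y"],
    dynamic_axes={"x": {0: "batch"}},     # dim 0 is exported as a symbolic dim_param
    opset_version=12,
)
```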

@facebook-github-bot added the oncall: jit label (Add this issue/PR to JIT oncall triage queue) on Oct 21, 2020
dr-ci bot commented Oct 21, 2020

💊 CI failures summary and remediations

As of commit f7eae4c (more details on the Dr. CI page):


💚 💚 Looks good so far! There are no failures yet. 💚 💚



@zou3519 added the triaged label (This issue has been looked at by a team member, and triaged and prioritized into an appropriate module) on Oct 21, 2020
@BowenBao force-pushed the onnx_inf_from_output branch 2 times, most recently from 1f490b3 to b586368 on October 28, 2020
@BowenBao changed the title from "[ONNX] shape inference to check graph output for inferred shape as well" to "[ONNX] Enable onnx shape inference in export by default" on Oct 28, 2020
codecov bot commented Oct 29, 2020

Codecov Report

Merging #46629 (f7eae4c) into master (eb8331e) will increase coverage by 0.04%.
The diff coverage is 72.85%.

@@            Coverage Diff             @@
##           master   #46629      +/-   ##
==========================================
+ Coverage   81.25%   81.30%   +0.04%     
==========================================
  Files        1838     1838              
  Lines      198270   198318      +48     
==========================================
+ Hits       161114   161241     +127     
+ Misses      37156    37077      -79     

facebook-github-bot (Contributor) commented:

Hi @BowenBao!

Thank you for your pull request. We require contributors to sign our Contributor License Agreement, and yours needs attention.

You currently have a record in our system, but we do not have a signature on file.

In order for us to review and merge your code, please sign at https://code.facebook.com/cla. If you are contributing on behalf of someone else (eg your employer), the individual CLA may not be sufficient and your employer may need to sign the corporate CLA.

If you have received this in error or have any questions, please contact us at cla@fb.com. Thanks!

neginraoof (Contributor) commented:

I'm not clear about this item:
"ONNX may set the inferred shape on the graph output instead of in value_infos; check both to be sure."

Is there any mechanism other than value_infos for specifying shape and type? What does it mean to check both?

BowenBao (Collaborator, Author) commented Nov 5, 2020

> I'm not clear about this item:
> "ONNX may set the inferred shape on the graph output instead of in value_infos; check both to be sure."
>
> Is there any mechanism other than value_infos for specifying shape and type? What does it mean to check both?

Yes. Starting with ONNX 1.8, shape inference writes the inferred shapes of graph outputs directly to the graph outputs; inferred shapes for the outputs of intermediate nodes are still kept in value_infos.
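To make the "check both" point concrete, here is a small illustrative sketch using the onnx Python package (the model path is a placeholder, not from this PR). After shape inference in ONNX 1.8+, graph outputs carry their inferred shapes on graph.output, while intermediate values stay in graph.value_info, so a consumer has to read both lists:

```python
import onnx
from onnx import shape_inference

model = onnx.load("model.onnx")                # placeholder path
inferred = shape_inference.infer_shapes(model)

# ONNX 1.8+ writes the inferred shapes of graph outputs onto graph.output;
# shapes for intermediate node outputs are still stored in graph.value_info,
# so a consumer of the inferred model has to look at both lists.
for vi in list(inferred.graph.output) + list(inferred.graph.value_info):
    dims = []
    for dim in vi.type.tensor_type.shape.dim:
        if dim.HasField("dim_value"):
            dims.append(dim.dim_value)         # static dimension
        elif dim.dim_param:
            dims.append(dim.dim_param)         # symbolic (dynamic) dimension
        else:
            dims.append(None)                  # unknown dim: the "None dim_param" case
    print(vi.name, dims)
```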

Review comment on torch/onnx/utils.py (outdated, resolved)
neginraoof (Contributor) left a comment:

LGTM. Thanks!

facebook-github-bot (Contributor) left a comment:

@bzinodev has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.


facebook-github-bot (Contributor) commented:

@bzinodev merged this pull request in 6a4d55f.

Labels: cla signed, Merged, oncall: jit, open source, triaged

6 participants