
[ONNX] Update squeeze test for opset 9 #45369

Closed
wants to merge 3 commits

Conversation

BowenBao
Collaborator

Opset 9 supports a no-op squeeze on a dim that is not 1 only under static axes.
This updates the test case, which was setting dynamic axes.
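
A minimal sketch (not part of this PR) of the distinction, using a hypothetical module and output file names; it exports the same squeeze once with static shapes and once with dynamic_axes:

```python
import torch

class SqueezeModel(torch.nn.Module):
    def forward(self, x):
        # dim 1 has size 2 below, so torch.squeeze is a no-op at runtime.
        return torch.squeeze(x, 1)

x = torch.randn(3, 2, 4)

# With static shapes the exporter can see that dim 1 != 1 and export a no-op.
torch.onnx.export(SqueezeModel(), x, "squeeze_static.onnx", opset_version=9)

# With dynamic_axes the size of dim 1 is unknown at export time; opset 9 has no
# way to defer the check to runtime, which is the situation the updated test
# no longer exercises.
torch.onnx.export(
    SqueezeModel(), x, "squeeze_dynamic.onnx", opset_version=9,
    input_names=["x"], dynamic_axes={"x": {0: "batch", 1: "dim1"}},
)
```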

@BowenBao BowenBao added the module: onnx Related to torch.onnx label Sep 26, 2020
@dr-ci

dr-ci bot commented Sep 26, 2020

💊 CI failures summary and remediations

As of commit bc02b94 (more details on the Dr. CI page):


  • 2/2 failures possibly* introduced in this PR
    • 2/2 non-CircleCI failure(s)

Extra GitHub checks: 1 failed


codecov.io: 1 failed



@gchanan gchanan added the triaged This issue has been looked at by a team member, and triaged and prioritized into an appropriate module label Sep 28, 2020
Contributor

@neginraoof neginraoof left a comment


There's a related test failure.
How was this test passing before this update?

@BowenBao
Collaborator Author

BowenBao commented Oct 1, 2020

There's a related test failure.
How was this test passing before this update?

After looking into it, I think it's a similar issue to runtime shape inference within an If subblock. The change in the test makes the input sizes static instead of dynamic, so shape inference is able to check the dim and raises an error because dim != 1. That error shouldn't happen, since in reality the squeeze won't run: it's inside the If block, which conditions on dim != 1.

However, I think for this one I should update the symbolic. Going forward with shape inference, we should be safe to treat any static shape as static at runtime as well. Dynamic shapes should be inferred as dynamic, or marked as such using the dynamic_axes argument. Thus, for this particular case, it should be exported as a no-op.
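
A rough sketch of that direction (not the actual patch), with hypothetical helper functions standing in for the exporter's constant/shape utilities:

```python
import warnings

def _constant_int(dim_value):
    """Hypothetical: read the constant integer dim from the graph value."""
    ...

def _static_dim_size(tensor_value, dim):
    """Hypothetical: return the statically known size of `dim`, or None."""
    ...

def squeeze(g, self, dim):
    # Sketch of an opset 9 squeeze symbolic that trusts static shapes.
    squeeze_dim = _constant_int(dim)
    dim_size = _static_dim_size(self, squeeze_dim)

    if dim_size is None:
        # Dynamic size: opset 9 cannot branch on it at runtime, so keep the
        # plain Squeeze and warn that it may fail if the dim is not 1.
        warnings.warn("Squeeze over a dynamic dim with opset 9 may fail at "
                      "runtime if that dim is not 1.")
        return g.op("Squeeze", self, axes_i=[squeeze_dim])

    if dim_size != 1:
        # Statically known not to be 1: torch.squeeze is a no-op, so export
        # the input unchanged.
        return self

    return g.op("Squeeze", self, axes_i=[squeeze_dim])
```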

@codecov

codecov bot commented Oct 3, 2020

Codecov Report

Merging #45369 into master will decrease coverage by 0.01%.
The diff coverage is 0.00%.


@@            Coverage Diff             @@
##           master   #45369      +/-   ##
==========================================
- Coverage   68.50%   68.48%   -0.02%     
==========================================
  Files         408      408              
  Lines       52484    52493       +9     
==========================================
- Hits        35952    35951       -1     
- Misses      16532    16542      +10     
Impacted Files Coverage Δ
torch/onnx/symbolic_opset11.py 21.03% <0.00%> (-0.40%) ⬇️
torch/testing/_internal/expecttest.py 77.55% <0.00%> (-1.03%) ⬇️

Continue to review full report at Codecov.

Legend
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 77cd8e0...bc02b94.

Contributor

@neginraoof neginraoof left a comment


LGTM. Thanks!

Contributor

@facebook-github-bot facebook-github-bot left a comment


@bzinodev has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.

@facebook-github-bot
Contributor

@bzinodev merged this pull request in b28b5d3.

facebook-github-bot pushed a commit that referenced this pull request Nov 9, 2020
Summary:
- rand/randn: the type signature of int[] is different in scripting, thus failing the check.
- where: scripting produces dynamic cases which are supported by `unbind` export of higher opsets.
- test_list_pass: this test fails when using the new scripting API; it should be fixed by #45369

Pull Request resolved: #45793

Reviewed By: mrshenli

Differential Revision: D24566096

Pulled By: bzinodev

fbshipit-source-id: 6fe0925c66dee342106d71c9cbc3c95cabe639f7
Labels
Merged · module: onnx (Related to torch.onnx) · open source · triaged (This issue has been looked at by a team member, and triaged and prioritized into an appropriate module)
Projects
None yet
Development

Successfully merging this pull request may close these issues.

None yet

5 participants