
[ONNX] Enable tests for opset 12 #37846

Closed · wants to merge 12 commits

Conversation

neginraoof (Collaborator) commented May 5, 2020

Update ORT nightly version and enable opset 12 tests.
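For reference, the opset 12 tests exercise the usual export-and-verify flow against ONNX Runtime. A minimal standalone sketch of that flow follows; the model, input shapes, and tolerances here are illustrative assumptions, not the actual test harness in test/onnx:

```python
import io
import numpy as np
import torch
import onnxruntime as ort

model = torch.nn.Linear(4, 2).eval()
x = torch.randn(1, 4)

# Export the model with the newly enabled opset.
buf = io.BytesIO()
torch.onnx.export(model, (x,), buf, opset_version=12)

# Run the exported graph with ONNX Runtime and compare against PyTorch.
sess = ort.InferenceSession(buf.getvalue())
ort_out = sess.run(None, {sess.get_inputs()[0].name: x.numpy()})[0]
np.testing.assert_allclose(model(x).detach().numpy(), ort_out, rtol=1e-4, atol=1e-6)
```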

neginraoof added 2 commits May 1, 2020
dr-ci bot commented May 5, 2020

💊 CI failures summary and remediations

As of commit b792f48 (more details on the Dr. CI page):


💚 💚 Looks good so far! There are no failures yet. 💚 💚



neginraoof added 2 commits May 5, 2020
neginraoof changed the title from "[WIP ONNX] Test opset 12" to "[WIP ONNX] Enable tests for opset 12" on May 5, 2020
neginraoof added 3 commits May 5, 2020
neginraoof changed the title from "[WIP ONNX] Enable tests for opset 12" to "[ONNX] Enable tests for opset 12" on May 6, 2020
neginraoof added 2 commits May 6, 2020
@@ -1498,7 +1482,11 @@ def forward(self, x):
    def test_topk_smallest_unsorted(self):
        class MyModule(torch.nn.Module):
            def forward(self, x, k):
                return torch.topk(x, k, largest=False, sorted=False)
        # When sorted=False, order of elements in the output tensors

neginraoof (Author, Collaborator) commented May 6, 2020
This is a change in behavior in the newest ORT build, but it is expected (not a bug).

lara-hdr (Member) commented May 7, 2020
are we planning on supporting this case?

neginraoof (Author, Collaborator) commented May 7, 2020
We do support this case, but since the output tensor is not sorted (sorted=False), the order of elements in the ORT output tensor can differ from the PyTorch output. This is expected.
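To illustrate why the test needs special handling, here is a minimal sketch of how unsorted topk outputs could be compared; the helper name and comparison strategy are illustrative assumptions, not the actual utilities used in test/onnx:

```python
import numpy as np
import torch

def compare_unsorted_topk(pt_values, pt_indices, ort_values, ort_indices):
    # With sorted=False the element order is unspecified, so compare the
    # two results order-independently: sort the values and compare the
    # index sets instead of doing an elementwise check.
    np.testing.assert_allclose(np.sort(pt_values.numpy()), np.sort(ort_values), rtol=1e-5)
    assert set(pt_indices.tolist()) == set(ort_indices.tolist())

x = torch.randn(10)
values, indices = torch.topk(x, 3, largest=False, sorted=False)
# ort_values / ort_indices would normally come from running the exported
# ONNX model in onnxruntime; the PyTorch result is reused here for brevity.
compare_unsorted_topk(values, indices, values.numpy(), indices.numpy())
```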

lara-hdr (Member) left a comment

There is a merge conflict, but LGTM

neginraoof added 2 commits May 7, 2020
BowenBao (Contributor) left a comment

LGTM

neginraoof (Author, Collaborator) commented May 7, 2020

@houseroad Can you please review this?
Thanks

houseroad (Member) left a comment

thanks

facebook-github-bot (Contributor) left a comment

@houseroad has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.

facebook-github-bot (Contributor) commented May 8, 2020

@houseroad merged this pull request in 29f19bf.

malfet added a commit to malfet/pytorch that referenced this pull request Jun 11, 2020
seemethere pushed a commit that referenced this pull request Jun 11, 2020
* [ONNX] Fix pow op export (#38065)

Summary:
Fix pow type cast for opset 9 and update opset 12
Pull Request resolved: #38065

Differential Revision: D21485353

Pulled By: malfet

fbshipit-source-id: 3993e835ffad07b2e6585eb5cf1cb7c8474de2ec

* Update ort-nightly version as suggested in #39685 (comment)

* Apply changes from #37846 to `test_topk_smallest_unsorted`

Co-authored-by: neginraoof <neginmr@utexas.edu>
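For context on the pow export fix referenced above ("Fix pow type cast for opset 9 and update opset 12"), the sketch below shows the kind of mixed-dtype case such a fix targets; the module, exponent, and tolerances are illustrative assumptions, not the actual repro or test from #38065:

```python
import io
import numpy as np
import torch
import onnxruntime as ort

class PowModule(torch.nn.Module):
    def forward(self, x):
        # An integer base raised to a float exponent exercises the
        # type-cast path: PyTorch promotes the result to floating point,
        # and the exported graph must do the same.
        return torch.pow(x, 2.5)

x = torch.arange(1, 5, dtype=torch.int32)
model = PowModule()

buf = io.BytesIO()
torch.onnx.export(model, (x,), buf, opset_version=12)

sess = ort.InferenceSession(buf.getvalue())
input_name = sess.get_inputs()[0].name
ort_out = sess.run(None, {input_name: x.numpy()})[0]

np.testing.assert_allclose(model(x).numpy(), ort_out, rtol=1e-4)
```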