ONNX Export All Cases of Softmax #18482
Conversation
@houseroad, @zrphercule, could you please take a look at this?
@houseroad has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.
Yeah, feel free to ping me :-)
Can we also add test cases for dtype argument?
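A minimal sketch of what such a test case might look like, modeled on the run_model_test harness visible in the diff below; the helper name, the BATCH_SIZE constant, and the fact that the method lives inside the export test class are assumptions based on this review, not the exact code that landed.

```python
import torch
import torch.nn.functional as F


class SoftmaxDtypeModel(torch.nn.Module):
    def forward(self, x):
        # dtype casts the input before softmax is computed; this is the
        # argument the reviewer asks to cover in the export tests.
        return F.softmax(x, dim=1, dtype=torch.float64)


# Meant as a method body inside the ONNX export test class (hence `self`).
def test_softmax_dtype(self):
    model = SoftmaxDtypeModel()
    input = torch.ones(3, 4, 5, requires_grad=True)
    self.run_model_test(model, train=False, batch_size=BATCH_SIZE, input=input)
```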
@houseroad has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.
Thanks for updating the PR. Please address the inline comments. Thanks!
    input = torch.ones(*dims, requires_grad=True)
    self.run_model_test(model, train=False, batch_size=BATCH_SIZE, input=input)

def test_softmax(self):
Can we merge this test with the previous test (i.e., test_softmax_dim), or at least follow the style of the previous test? That way we can more thoroughly test the cases where dim and axis have different semantics.
I merged test_softmax_dim and test_softmax, but kept test_softmax_dtype separate, as we don't need to re-test each case with dtype.
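A rough sketch of what the merged test could look like: loop over every valid dim of a fixed-rank input so that the non-last-dim cases, where PyTorch's `dim` and ONNX's `axis` semantics diverge, are all exported and checked by the harness. The helper names mirror the snippet above and are assumptions, not the exact merged test.

```python
import torch


# Meant as a method inside the ONNX export test class.
def test_softmax(self):
    # Cover negative and positive dims of a 4-D input; dims other than the
    # last one are the cases where the exporter has to do extra work to match
    # ONNX Softmax's axis semantics.
    for dim in range(-4, 4):
        model = torch.nn.Softmax(dim=dim)
        input = torch.randn(3, 4, 5, 6, requires_grad=True)
        self.run_model_test(model, train=False, batch_size=BATCH_SIZE, input=input)
```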
@houseroad has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.
@houseroad merged this pull request in 1ec1db4.
Summary: Update the softmax entry in the ONNX supported operators list from `softmax (only dim=-1 supported)` to `softmax`, as all dim options are now supported by [https://github.com/pytorch/pytorch/pull/18482](https://github.com/pytorch/pytorch/pull/18482): ONNX Export All Cases of Softmax. Pull Request resolved: #24832 Differential Revision: D16896538 Pulled By: bddppq fbshipit-source-id: 284039ffa42f09b0043e95cfe9f17e1afde53814
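A small usage example of the capability this doc update describes: exporting a softmax over a non-last dimension, which previously required dim=-1. The shapes and output file name here are illustrative only.

```python
import torch

# Softmax over dim=1 (the channel dim), not the last dim.
model = torch.nn.Softmax(dim=1)
dummy_input = torch.randn(2, 3, 8, 8)
torch.onnx.export(model, dummy_input, "softmax_dim1.onnx")
```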
No description provided.