
Conversation

@lara-hdr
Contributor

No description provided.

@lara-hdr
Contributor Author

@houseroad, @zrphercule, could you please take a look at this?

Contributor

@facebook-github-bot left a comment


@houseroad has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.

@houseroad
Member

Yeah, feel free to ping me :-)

Member

@houseroad left a comment


Can we also add test cases for the dtype argument?
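
A minimal sketch of what such a dtype test could look like, assuming the run_model_test helper and BATCH_SIZE constant used elsewhere in this test suite; the test name, model, and shapes are illustrative, not the PR's actual code:

def test_softmax_dtype(self):
    class SoftmaxModel(torch.nn.Module):
        def forward(self, x):
            # Exercise the dtype argument by upcasting the result to float64.
            return torch.nn.functional.softmax(x, dim=1, dtype=torch.float64)

    x = torch.randn(BATCH_SIZE, 2, 3, requires_grad=True)
    self.run_model_test(SoftmaxModel(), train=False, batch_size=BATCH_SIZE, input=x)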

Contributor

@facebook-github-bot left a comment


@houseroad has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.

Member

@houseroad left a comment


Thanks for updating the PR. Please address the inline comments. Thanks!

input = torch.ones(*dims, requires_grad=True)
self.run_model_test(model, train=False, batch_size=BATCH_SIZE, input=input)

def test_softmax(self):
Member


Can we merge this test with the previous one (i.e., test_softmax_dim), or at least follow the style of the previous test? That way we can test the cases where dim and axis have different semantics more thoroughly.

Contributor Author


I merged test_softmax_dim and test_softmax, but kept test_softmax_dtype separate, since we don't need to re-test every case with dtype.
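
A rough sketch of what the merged test could look like, sweeping input ranks and every valid dim so that cases where the ONNX axis and PyTorch dim semantics differ are covered; the shapes and loop bounds here are assumptions, not the code that actually landed:

def test_softmax(self):
    for n_dims in range(2, 6):
        dims = [2] * (n_dims - 1) + [3]
        # Cover negative dims as well, since they map to different ONNX axes.
        for dim in range(-len(dims), len(dims)):
            model = torch.nn.Softmax(dim=dim)
            input = torch.ones(*dims, requires_grad=True)
            self.run_model_test(model, train=False, batch_size=BATCH_SIZE, input=input)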

@houseroad dismissed their stale review on April 2, 2019 at 23:56

Problem addressed

Contributor

@facebook-github-bot left a comment


@houseroad has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.

@facebook-github-bot
Contributor

@houseroad merged this pull request in 1ec1db4.

facebook-github-bot referenced this pull request on Aug 19, 2019
Summary:
Update the softmax entry in the ONNX supported operators list from `softmax (only dim=-1 supported)` to `softmax`, since all dim options are now supported by
[https://github.com/pytorch/pytorch/pull/18482](https://github.com/pytorch/pytorch/pull/18482): ONNX Export All Cases of Softmax
Pull Request resolved: #24832

Differential Revision: D16896538

Pulled By: bddppq

fbshipit-source-id: 284039ffa42f09b0043e95cfe9f17e1afde53814
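
To illustrate the doc change, exporting a softmax over a non-last dimension should now go through directly; a minimal sketch, assuming a throwaway model and output file name:

import torch

model = torch.nn.Softmax(dim=0)     # softmax over a non-last dimension
dummy_input = torch.randn(3, 4, 5)
# With the linked PR, export is no longer restricted to dim=-1.
torch.onnx.export(model, dummy_input, "softmax_dim0.onnx")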
