
[ONNX] Support clamp_min and clamp_max #37872

Closed
Wants to merge 1 commit.

Conversation

@BowenBao (Contributor) commented May 5, 2020

clamp_min is used in torch.nn.functional.normalize. Update symbolic_opset11 to support clamp_min and clamp_max using the updated Clip operator in ONNX opset 11.
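The mapping this change relies on can be sketched in plain Python. In ONNX opset 11, Clip takes its bounds as optional inputs rather than required attributes, so clamp_min and clamp_max can each be exported as a Clip node with the unused bound omitted. This is an illustration of the operator semantics, not the actual symbolic_opset11 code; the helper names below are hypothetical:

```python
def clip(x, lo=None, hi=None):
    """Reference semantics of ONNX opset-11 Clip for a scalar:
    both bounds are optional inputs."""
    if lo is not None:
        x = max(x, lo)
    if hi is not None:
        x = min(x, hi)
    return x

def clamp_min(x, lo):
    # torch.clamp_min(x, lo) maps to Clip(x, min=lo) with no max input
    return clip(x, lo=lo)

def clamp_max(x, hi):
    # torch.clamp_max(x, hi) maps to Clip(x, max=hi) with no min input
    return clip(x, hi=hi)
```

This is why opset 11 is the first version where the mapping is a single node: earlier Clip versions required both bounds as attributes.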

@@ -1515,6 +1515,15 @@ def forward(self, x, k):
k = torch.tensor(3)
self.run_test(MyModuleDynamic(), [x, k])

@skipIfUnsupportedOpsetVersion([7, 12])

@neginraoof (Collaborator) commented May 6, 2020
Why is opset 12 skipped?

@neginraoof (Collaborator) commented May 6, 2020

I guess we can enable this for opset 12 once the ORT version is updated?

@houseroad (Member) left a comment

Thanks

@facebook-github-bot (Contributor) left a comment

@houseroad has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.

@facebook-github-bot (Contributor) commented May 7, 2020

@houseroad merged this pull request in 7be9796.


6 participants