Fixing ONNX export of logical ops to have correct output datatype #15185
Conversation
The failed check seems to be unrelated (the test run timed out).
Seeing the same issue again. There's a timeout reported. Also, an assert in blob_test is coming up, but I'm not sure that's the failing test, and it seems unrelated. Could someone please take a look and advise?
@spandantiwari no worries, I triggered the rebuild manually.
@houseroad - thanks a lot!
@houseroad - this one failed quite quickly. Looks like some network issue.
Yeah, seems like some problem with the docker image. No worries, I think we can ignore it.
@houseroad has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.
Hi @houseroad, is this PR OK to merge?
@ezyang it may break some internal pipeline, let me double-check. Will send you the feedback soon.
Looks good, thanks.
Looks like this broke a test:
@ezyang interesting, the CI didn't catch it. @spandantiwari do you mind updating the expect file and resubmitting the PR? Thanks.
@houseroad - yes, I can do that. I think the CI did not catch it because torch.ne support went in via another PR very recently, so this branch might be stale. I can update it.
No worries, feel free to open a new PR. Thanks.
OK, will do.
@houseroad - I have a new PR #15677 that updates the export of torch.ne and fixes the above test failure. Please take a look when you get a chance.
…utput datatype (#15677)
Summary: This is an updated version of the earlier PR #15185, since that one was closed. Currently the PyTorch ONNX exporter exports the logical ops (lt, gt, le, ge, eq, ne) with the output type in the corresponding ONNX ops as tensor(uint8). But the ONNX spec allows only tensor(bool), which is why models that have these ops fail to load properly. This issue is captured in #11339. Part of this issue, relating to the allowed input types, has been fixed in the ONNX spec by houseroad. This PR fixes the other part, pertaining to the output type.
Pull Request resolved: #15677
Reviewed By: dzhulgakov
Differential Revision: D13568450
Pulled By: houseroad
fbshipit-source-id: a6afbea1afdb4edad8f8b1bc492f50b14e5f2fce
Currently the PyTorch ONNX exporter exports the logical ops (lt, gt, le, ge, eq) with the output type in the corresponding ONNX ops as tensor(uint8). But the ONNX spec allows only tensor(bool), which is why models that have these ops fail to load properly. This issue is captured in #11339. Part of this issue, relating to the allowed input types, has been fixed in the ONNX spec by @houseroad. This PR fixes the other part, pertaining to the output type.
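To make the dtype mismatch concrete, here is a minimal sketch (not the actual exporter code, and using NumPy rather than PyTorch/ONNX purely for illustration): the hypothetical helper mimics a backend that produces uint8 from a comparison, then casts the result to bool, which is the output type the ONNX spec requires for comparison ops.

```python
import numpy as np

# Hedged sketch: export_style_lt is a hypothetical helper, not part of
# the PyTorch exporter. It mimics the bug this PR fixes: a comparison
# whose raw result is uint8 (as the old exporter emitted) being cast
# to bool, the only output type the ONNX spec allows for these ops.
def export_style_lt(a, b):
    raw = (a < b).astype(np.uint8)  # uint8 result, like the old behavior
    return raw.astype(np.bool_)     # cast to bool, matching the ONNX spec

out = export_style_lt(np.array([1, 3]), np.array([2, 2]))
print(out.dtype)  # bool
```

A model consumer that validates ONNX graphs strictly would reject the uint8 form, which is why loading failed before the fix.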