
ConvTranspose auto_pad and explicit pads combination should be invalid #2787

Closed
vuzelac-cadence opened this issue May 24, 2020 · 5 comments

vuzelac-cadence commented May 24, 2020

from onnx import helper, TensorProto

X = helper.make_tensor_value_info('X', TensorProto.FLOAT, [1, 1, 2, 2])
W = helper.make_tensor_value_info('W', TensorProto.FLOAT, [1, 1, 3, 3])
B = helper.make_tensor_value_info('B', TensorProto.FLOAT, [1])
Y = helper.make_tensor_value_info('Y', TensorProto.FLOAT, [1, 1, 3, 3])
# auto_pad and explicit pads are set on the same node here
node_def = helper.make_node("ConvTranspose", ["X", "W", "B"], ["Y"], auto_pad="SAME_UPPER", strides=[1, 1], pads=[0, 1, 1, 0])

The above excerpt doesn't generate an error, although the spec for ConvTranspose says it should:

pads : list of ints
...This attribute cannot be used simultaneously with auto_pad attribute...

Even if they could be used together, the auto_pad description hints that pads is only valid when auto_pad=NOTSET. However, I see the pads attribute being applied even when auto_pad=SAME_UPPER/SAME_LOWER.
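
For completeness, a rough sketch of how the excerpt would be wrapped into a model and fed to the checker; the graph/model construction and the checker call are my assumption of the code omitted from the excerpt (the graph name is made up):

import onnx

graph = helper.make_graph([node_def], "convtranspose_pads_repro", [X, W, B], [Y])
model = helper.make_model(graph)

# At the time of this report, the default checker accepted this model even
# though auto_pad and pads are set on the same ConvTranspose node.
onnx.checker.check_model(model)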

TMVector (Contributor) commented

Hi @vuzelac-cadence, thanks for the report. The onnx.helper.make_node function has no operator-specific checks. It is the role of the ONNX checker to catch problems like this, but it is not exhaustive by any means. The specification has the final say on whether a node is a valid instantiation of an operator, and ONNX backends/compilers/tools should work with such nodes.
If you would like to add this check to the ONNX checker (if it isn't already there), then please submit a PR :)

vuzelac-cadence (Author) commented

@TMVector, I didn't paste the whole code, but the checker is invoked and it passes; that, not the helper code, was the reason for submitting this issue. Sorry about that. So, the check is not there.
I hope you can find time to explain in more detail what "it is not exhaustive by any means" means.

IMO, it should check everything in the spec; otherwise it's incomplete (or we can simply say it has bugs). Did you really intend it to be non-exhaustive?

TMVector (Contributor) commented

@vuzelac-cadence thanks for clarifying. Yes, the ONNX project depends on people and companies dedicating time to work on it, and unfortunately the checker is not complete; even with lots of people working on it full time, I'm sure it wouldn't be perfect :)
I believe it tries to cover the majority of issues by checking machine-readable properties of operators, such as accepted input types and attributes.


faxu commented Sep 3, 2020

CC @jcwchen

jcwchen (Member) commented Sep 15, 2020

Solved by #3000. This error can be caught by onnx.checker.check_model(model, full_check=True). @vuzelac-cadence Thank you for reporting this.
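
For reference, a minimal sketch of the stricter check applied to the model built above (full_check additionally runs shape inference, which is where this error is reported; the exact exception type is not pinned down here):

import onnx

try:
    onnx.checker.check_model(model, full_check=True)
except Exception as e:
    # With the fix from #3000, shape inference rejects the conflicting
    # auto_pad + pads combination and the error surfaces here.
    print(e)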

jcwchen closed this as completed Sep 15, 2020