Supporting negative axes for all existing onnx ops #2281
Conversation
…gativeAxes
# Conflicts:
#	docs/Changelog.md
#	onnx/defs/operator_sets.h
#	onnx/defs/tensor/defs.cc
What's the relation between #2207 and this PR?
@wschin I just saw this PR. It looks like it does not increase the version number, and it's missing tests for negative axes. This PR also includes the most recent ops. I can merge that PR and make sure those fixes are included here.
Please do. Can you also rev the Split opset for consistency? Negative axis support was added there recently without that happening (#2177).
cc @BowenBao @spandantiwari to review
LGTM.
@@ -620,6 +644,30 @@ class OpSet_Onnx_ver11 {
  fn(GetOpSchema<ONNX_OPERATOR_SET_SCHEMA_CLASS_NAME(Onnx, 11, Det)>());
  fn(GetOpSchema<ONNX_OPERATOR_SET_SCHEMA_CLASS_NAME(Onnx, 11, ScatterND)>());
  fn(GetOpSchema<ONNX_OPERATOR_SET_SCHEMA_CLASS_NAME(Onnx, 11, GatherND)>());
  fn(GetOpSchema<ONNX_OPERATOR_SET_SCHEMA_CLASS_NAME(Onnx, 11, Slice)>());
An FYI comment for all: it was discussed during the SIG/Operators WG at the ONNX workshop that it is the safer option to bump up the opset version of these ops now that the spec is clear that negative axes are allowed.
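For readers unfamiliar with the convention: negative axes count from the end of the shape, as in NumPy. A minimal sketch of the normalization rule (the helper name `normalize_axis` is illustrative, not an ONNX API):

```python
def normalize_axis(axis: int, rank: int) -> int:
    # Negative axes count from the end: -1 is the last dimension.
    # Valid range per the spec convention is [-rank, rank - 1].
    if not -rank <= axis < rank:
        raise ValueError(f"axis {axis} out of range for rank {rank}")
    return axis + rank if axis < 0 else axis

# For a rank-3 tensor, axis -1 refers to dimension 2.
assert normalize_axis(-1, 3) == 2
assert normalize_axis(0, 3) == 0
assert normalize_axis(-3, 3) == 0
```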
Overall LGTM. It's really great work. You even noticed things I was not aware of. Here are some requested changes (although it's not harmful even if you don't make them):
- There are some typos. In many places, the input tensor's name is `data` instead of `input`.
- We need some tests (https://github.com/onnx/onnx/blob/master/onnx/test/shape_inference_test.py) for shape inference with negative axes.

Because this PR significantly improves ONNX, I want to just approve it.
const TensorShapeProto& input_shape =
    ctx.getInputType(0)->tensor_type().shape();
int r = input_shape.dim_size();
if (r != 2) {
I think this check does not follow the definition of `Softmax`? The documentation of `Softmax` says "Input does not need to explicitly be a 2D vector; rather, it will be coerced into one".
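For context, the coercion the `Softmax` documentation describes (in the pre-opset-13 semantics) flattens the input into a 2D matrix around `axis`, applies softmax along the second dimension, and reshapes back. A NumPy sketch of that behavior (the function name is illustrative):

```python
import numpy as np

def softmax_coerced(x: np.ndarray, axis: int = 1) -> np.ndarray:
    # Coerce to 2D: dims before `axis` become rows, the rest become columns.
    axis = axis % x.ndim  # also accepts negative axes
    rows = int(np.prod(x.shape[:axis]))
    cols = int(np.prod(x.shape[axis:]))
    flat = x.reshape(rows, cols)
    # Numerically stable softmax along the flattened trailing dims.
    e = np.exp(flat - flat.max(axis=1, keepdims=True))
    out = e / e.sum(axis=1, keepdims=True)
    return out.reshape(x.shape)

x = np.random.rand(2, 3, 4).astype(np.float32)
y = softmax_coerced(x, axis=-2)   # axis -2 normalizes to axis 1
assert y.shape == x.shape
# Each coerced row (3*4 = 12 elements) sums to 1.
assert np.allclose(y.reshape(2, 12).sum(axis=1), 1.0)
```

So a rank check like `if (r != 2)` would reject inputs that the spec says should be accepted and coerced.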
* Added negative axes for slice and squeeze opset 11
* added negative axes support for squeeze, unsqueeze, flatten
* added support for negative axes to all the existing ops
* fixed minor if condition missed for axis attr in flatten
* fixed test name for flatten with negative axes
* updated unsqueeze and softmax tests with fix for failures
* fixed typo
* Updating Split op documentations and version
* fixed typo in unsqueeze model
* fixed dim check for unsqueeze
* fixed type cast
* test fix for build failure
* updating onnx model for unsqueeze test
* fixed minor error in type casting
I modified the op documentation as well as the tests to add support for negative axes.