Describe the bug
Can't concat ragged tensors with more than one dimension.
I want to decode the results of an object detector. I am using tf.ragged.boolean_mask(bboxes, pos_inds) to gather the results above a score threshold, but my detector has multiple scales (like the YOLO family), so I need to concatenate them into one tensor to run non-max suppression. There is no problem with the scores, because they form a 1-D tensor, but the conversion fails for the bounding boxes.
Supporting this would avoid boilerplate code for decoding object detection results, because the decoding step would live inside the ONNX model.
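For context, here is a minimal sketch of the decoding step I mean (the helper keep_positive_boxes, the 0.5 threshold, and the two-scale shapes are made up just to show where the ragged concat appears):

import tensorflow as tf

def keep_positive_boxes(bboxes, scores, score_threshold=0.5):
    # bboxes: [batch, num_anchors, 4], scores: [batch, num_anchors] for one scale.
    pos_inds = scores > score_threshold
    # The result is ragged: [batch, None, 4], an unknown number of boxes per image.
    return tf.ragged.boolean_mask(bboxes, pos_inds)

# Two hypothetical scales with different anchor counts, batch size 1.
boxes_s1 = keep_positive_boxes(tf.random.uniform([1, 100, 4]), tf.random.uniform([1, 100]))
boxes_s2 = keep_positive_boxes(tf.random.uniform([1, 400, 4]), tf.random.uniform([1, 400]))

# Concatenating the per-scale boxes along the ragged dimension works in TensorFlow,
# but this is the op that fails to convert because the ragged tensor has rank > 1.
all_boxes = tf.concat([boxes_s1, boxes_s2], axis=1)  # [1, None, 4], ready for NMS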
System information
OS Platform and Distribution: Ubuntu 21.04
Tensorflow Version: 2.7.0
Python version: 3.8.8
To Reproduce
import tensorflow as tf
from tensorflow.keras import Input, Model

i1 = Input((None, 4), ragged=True, batch_size=1)
i2 = Input((None, 4), ragged=True, batch_size=1)
# None in the shape means an unknown number of detections per scale; 4 is for the bounding box.
o = tf.concat([i1, i2], axis=1)
m = Model([i1, i2], o)
m.save("model")
python -m tf2onnx.convert --saved-model model --output model.onnx --opset 15
ONNX itself doesn't support ragged tensors. The converter can convert rank-1 ragged tensors, but currently can't handle anything larger than rank-1.
We could potentially support higher ranks with a loop, but that would probably not be very efficient now.
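If the batch size is fixed at 1 (as in the repro), one possible workaround is to avoid ragged tensors entirely and express the decoding with dense tensors and tf.boolean_mask, which yields dense tensors with a dynamic first dimension. A rough sketch, untested against the converter (the Decoder module, the 0.5 threshold, and max_output_size are placeholders):

import tensorflow as tf

class Decoder(tf.Module):
    @tf.function(input_signature=[
        tf.TensorSpec([None, 4], tf.float32),  # boxes, scale 1
        tf.TensorSpec([None], tf.float32),     # scores, scale 1
        tf.TensorSpec([None, 4], tf.float32),  # boxes, scale 2
        tf.TensorSpec([None], tf.float32),     # scores, scale 2
    ])
    def __call__(self, boxes1, scores1, boxes2, scores2):
        keep1 = scores1 > 0.5
        keep2 = scores2 > 0.5
        # Dense boolean_mask produces tensors with a dynamic first dimension,
        # so no ragged tensors are involved and tf.concat stays on dense tensors.
        boxes = tf.concat([tf.boolean_mask(boxes1, keep1),
                           tf.boolean_mask(boxes2, keep2)], axis=0)
        scores = tf.concat([tf.boolean_mask(scores1, keep1),
                            tf.boolean_mask(scores2, keep2)], axis=0)
        selected = tf.image.non_max_suppression(boxes, scores, max_output_size=100)
        return tf.gather(boxes, selected), tf.gather(scores, selected)

decoder = Decoder()
tf.saved_model.save(decoder, "model",
                    signatures=decoder.__call__.get_concrete_function())
# then: python -m tf2onnx.convert --saved-model model --output model.onnx --opset 15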