Hello,
I am trying to add a BatchedNMS_TRT node to an exported ONNX model and build an engine from it with trtexec, but trtexec fails with this error:
[12/09/2020-15:14:48] [I] [TRT] ModelImporter.cpp:135: No importer registered for op: BatchedNMS_TRT. Attempting to import as plugin.
[12/09/2020-15:14:48] [I] [TRT] builtin_op_importers.cpp:3770: Searching for plugin: BatchedNMS_TRT, plugin_version: 1, plugin_namespace:
[12/09/2020-15:14:48] [I] [TRT] builtin_op_importers.cpp:3787: Successfully created plugin: BatchedNMS_TRT
[12/09/2020-15:14:49] [E] [TRT] BatchedNMS_TRT: could not find any supported formats consistent with input/output data types
[12/09/2020-15:14:49] [E] [TRT] ../builder/cudnnBuilderGraphNodes.cpp (872) - Misc Error in reportPluginError: 0 (could not find any supported formats consistent with input/output data types)
[12/09/2020-15:14:49] [E] [TRT] ../builder/cudnnBuilderGraphNodes.cpp (872) - Misc Error in reportPluginError: 0 (could not find any supported formats consistent with input/output data types)
[12/09/2020-15:14:49] [E] Engine creation failed
[12/09/2020-15:14:49] [E] Engine set up failed
&&&& FAILED TensorRT.trtexec # trtexec --onnx=retinanet_r50_fpn_1x_coco_nonmns_inonnx.onnx --saveEngine=retinanet_r50_fpn_1x_coco_nonmns_inonnx.trt --maxBatch=1 --workspace=4000 --shapes=1x3x400x600
I don't understand why it "could not find any supported formats consistent with input/output data types", since the inputs of BatchedNMS_TRT are boxes and scores, and both are FLOAT32 as required.
Here is how I add the node to the ONNX model:
import onnx
from onnx import helper
from onnx import AttributeProto, TensorProto, GraphProto

def add_batchednms(input_file, output_file):
    attrs = {'shareLocation': True,
             'numClasses': 81,
             'backgroundLabelId': 80,
             'topK': 1000,
             'keepTopK': 100,
             'scoreThreshold': 0.05,
             'iouThreshold': 0.5,
             'isNormalized': False,
             'clipBoxes': False
             }
    model = onnx.load(input_file)
    # create the BatchedNMS node
    batched_nms_node = onnx.helper.make_node(
        op_type='BatchedNMS_TRT',
        name='BatchedNMS_TRT',
        inputs=['boxes', 'scores'],
        outputs=['num_detections', 'nmsed_boxes', 'nmsed_scores', 'nmsed_classes'],
        **attrs
    )
    # add to the list of graph nodes
    model.graph.node.append(batched_nms_node)
    # not sure how to determine the shape of these value infos
    num_detections = onnx.helper.make_tensor_value_info('num_detections', TensorProto.INT32, [])
    nmsed_boxes = onnx.helper.make_tensor_value_info('nmsed_boxes', TensorProto.FLOAT, [])
    nmsed_scores = onnx.helper.make_tensor_value_info('nmsed_scores', TensorProto.FLOAT, [])
    nmsed_classes = onnx.helper.make_tensor_value_info('nmsed_classes', TensorProto.INT32, [])
    # remove old outputs of the graph
    output_nodes = [n for n in model.graph.output]
    for old_node in output_nodes:
        model.graph.output.remove(old_node)
    # add new outputs
    model.graph.output.append(num_detections)
    model.graph.output.append(nmsed_boxes)
    model.graph.output.append(nmsed_scores)
    model.graph.output.append(nmsed_classes)
    # check that it works and re-save
    # onnx.checker.check_model(model)
    onnx.save(model, output_file)
The modified ONNX model itself seems fine when I inspect it.

Could anyone help me with this?
Thanks a lot.