
onnx to tensorrt NonMaxSuppression #523

Closed
zhouyingchaoAI opened this issue Apr 30, 2020 · 8 comments
Labels
ONNX triaged Issue has been triaged by maintainers

Comments


Description

ERROR: /opt/tensorrt/TensorRT/parsers/onnx/ModelImporter.cpp:134 In function parseGraph:
No importer registered for op: NonMaxSuppression

Environment

TensorRT Version: 7.0
GPU Type: RTX 2080 Ti
Nvidia Driver Version:
CUDA Version: 10.2
CUDNN Version: 7.6
Operating System + Version: Ubuntu 18.04
Python Version (if applicable): 3.6.9
TensorFlow Version (if applicable): 1.15
PyTorch Version (if applicable): 1.3
Baremetal or Container (if container which image + tag):

Relevant Files

Steps To Reproduce


rgc183 commented Apr 30, 2020

I tried writing a layer in ModelImporter:

DEFINE_BUILTIN_OP_IMPORTER(NonMaxSuppression)
{
    // NonMaxSuppression is not supported for opsets below 10.
    ASSERT(ctx->getOpsetVersion() >= 10, ErrorCode::kUNSUPPORTED_NODE);

    nvinfer1::ITensor* boxes_tensor = &convertToTensor(inputs.at(0), ctx);
    nvinfer1::ITensor* scores_tensor = &convertToTensor(inputs.at(1), ctx);
    const int numInputs = inputs.size();
    LOG_ERROR("no of inputs are " << numInputs);
    LOG_ERROR("node outsize and op type are " << node.output().size() << " type " << node.op_type());

    const auto scores_dims = scores_tensor->getDimensions();
    const auto boxes_dims = boxes_tensor->getDimensions();
    LOG_ERROR("boxes dims " << boxes_dims.nbDims << " dim3 has size " << boxes_dims.d[2]);

    const std::string pluginName = "BatchedNMS_TRT";
    const std::string pluginVersion = "1";
    std::vector<nvinfer1::PluginField> f;

    const bool share_location = true;
    const bool is_normalized = true;
    const bool clip_boxes = true;
    int backgroundLabelId = 0;

    // Initialize the plugin fields.
    f.emplace_back("shareLocation", &share_location, nvinfer1::PluginFieldType::kINT8, 1);
    f.emplace_back("isNormalized", &is_normalized, nvinfer1::PluginFieldType::kINT8, 1);
    f.emplace_back("clipBoxes", &clip_boxes, nvinfer1::PluginFieldType::kINT8, 1);
    f.emplace_back("backgroundLabelId", &backgroundLabelId, nvinfer1::PluginFieldType::kINT32, 1);

    // Create the plugin from the registry.
    nvinfer1::IPluginV2* plugin = importPluginFromRegistry(ctx, pluginName, pluginVersion, node.name(), f);

    ASSERT(plugin != nullptr && "NonMaxSuppression plugin was not found in the plugin registry!",
        ErrorCode::kUNSUPPORTED_NODE);

    std::vector<nvinfer1::ITensor*> nms_inputs = {boxes_tensor, scores_tensor};
    RETURN_FIRST_OUTPUT(ctx->network()->addPluginV2(nms_inputs.data(), nms_inputs.size(), *plugin));
}

There is no existing layer/plugin in TensorRT that implements NonMaxSuppression; the closest one is BatchedNMS_TRT. However, when I try to run the code above, it crashes in:

nvinfer1::plugin::BatchedNMSPlugin::getOutputDimensions(), where ASSERT(inputs[0].nbDims == 3) fails. Yet inside DEFINE_BUILTIN_OP_IMPORTER(NonMaxSuppression) above, the log prints inputs[0].nbDims = 3. Can someone help me out here? Why does the assertion fail in getOutputDimensions()?
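One possible explanation (my reading, not confirmed by the plugin authors): the stock BatchedNMS_TRT is an IPluginV2-style plugin, and implicit-batch plugins are handed per-sample dimensions with the batch dimension stripped, so a tensor the importer logs as 3-D can arrive at getOutputDimensions() as 2-D. A numpy sketch of that shape bookkeeping (all shapes hypothetical):

```python
import numpy as np

# What the ONNX graph and the importer's log see: [batch, num_boxes, 4].
onnx_boxes = np.zeros((1, 100, 4), dtype=np.float32)
importer_ndims = onnx_boxes.ndim        # 3, matching the printed "boxes dims"

# What an implicit-batch IPluginV2 is handed: the same tensor with the
# batch dimension stripped, i.e. per-sample dims [num_boxes, 4].
plugin_dims = onnx_boxes.shape[1:]
plugin_ndims = len(plugin_dims)         # 2, so ASSERT(inputs[0].nbDims == 3) fires
```

If that is what is happening, the importer's own log and the plugin's assertion are simply counting dimensions under two different conventions.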


gabrielibagon commented Jun 24, 2020

Hi @rgc183, I'm working on a similar problem - this code seems like a good start.

You should reference the documentation for the BatchedNMS_TRT Plugin, which describes the inputs/outputs of the plugin.

Your code may need to reshape the inputs and outputs to match what the plugin expects. I'm not sure why batchedNMSPlugin.cpp asserts inputs[0].nbDims == 3 when the documentation includes the batch dimension. Maybe the plugin automatically adds the batch dimension.

For what it's worth, I've been using this PR version of the plugin, which converts BatchedNMS to IPluginV2DynamicExt and asserts inputs[0].nbDims == 4, which matches the documentation.

I hope this helps - let me know if you make progress on this issue.
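To make the reshaping concrete, here is a hedged numpy sketch, assuming the layouts described in the BatchedNMS_TRT documentation: ONNX NonMaxSuppression provides boxes as [num_batches, num_boxes, 4] and scores as [num_batches, num_classes, num_boxes], while the plugin wants boxes [N, num_boxes, q, 4] (q == 1 when shareLocation is true) and scores [N, num_boxes, num_classes]. All concrete sizes below are made up for illustration:

```python
import numpy as np

# ONNX NonMaxSuppression input layouts (hypothetical sizes).
boxes = np.zeros((1, 100, 4), dtype=np.float32)   # [batch, num_boxes, 4]
scores = np.zeros((1, 3, 100), dtype=np.float32)  # [batch, num_classes, num_boxes]

# BatchedNMS_TRT layouts: insert the q == 1 class dimension into boxes
# (shareLocation = true) and swap scores to [batch, num_boxes, num_classes].
trt_boxes = boxes.reshape(boxes.shape[0], boxes.shape[1], 1, 4)
trt_scores = scores.transpose(0, 2, 1)
```

In the importer these would correspond to an IShuffleLayer on each input before the plugin is added, rather than numpy calls.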

@bnascimento

Hi everyone, I've been following this thread because I have the same issue: I need to register a NonMaxSuppression operation in onnx-tensorrt.
Following the suggestion above, I wrote the following in builtin_op_importers.cpp:

DEFINE_BUILTIN_OP_IMPORTER(NonMaxSuppression)
{
        // NonMaxSuppression is not supported for opsets below 10.
        ASSERT(ctx->getOpsetVersion() >= 10, ErrorCode::kUNSUPPORTED_NODE);

        nvinfer1::ITensor* boxes_tensor = &convertToTensor(inputs.at(0), ctx);
        nvinfer1::ITensor* scores_tensor = &convertToTensor(inputs.at(1), ctx);
        const int numInputs = inputs.size();
        LOG_ERROR("no of inputs are "<<numInputs);
        LOG_ERROR("node outsize and op type are "<<node.output().size()<< " type " << node.op_type());

        const auto scores_dims = scores_tensor->getDimensions();
        const auto boxes_dims = boxes_tensor->getDimensions();
        LOG_ERROR("boxes dims "<< boxes_dims.nbDims << " dim3 has size "<<boxes_dims.d[2]);
        const std::string pluginName = "BatchedNMS_TRT";
        const std::string pluginVersion = "1";
        std::vector<nvinfer1::PluginField> f;

        bool share_location = true;
        const bool is_normalized = true;
        const bool clip_boxes = true;
        int backgroundLabelId = 0;
        // Initialize.
        f.emplace_back("shareLocation", &share_location, nvinfer1::PluginFieldType::kINT8, 1);
        f.emplace_back("isNormalized", &is_normalized, nvinfer1::PluginFieldType::kINT8, 1);
        f.emplace_back("clipBoxes", &clip_boxes, nvinfer1::PluginFieldType::kINT8, 1);
        f.emplace_back("backgroundLabelId", &backgroundLabelId, nvinfer1::PluginFieldType::kINT32, 1);
        // Create plugin from registry
        // nvinfer1::IPluginV2* plugin = importPluginFromRegistry(ctx, pluginName, pluginVersion, node.name(), f);
        nvinfer1::IPluginV2* plugin = createPlugin(node.name(), importPluginCreator(pluginName, pluginVersion), f);

        ASSERT(plugin != nullptr && "NonMaxSuppression plugin was not found in the plugin registry!",
                   ErrorCode::kUNSUPPORTED_NODE);

        std::vector<nvinfer1::ITensor*> nms_inputs = {boxes_tensor, scores_tensor};
        RETURN_FIRST_OUTPUT(ctx->network()->addPluginV2(nms_inputs.data(), nms_inputs.size(), *plugin));
}

Trying to optimize the model from .onnx to .trt with:
onnx2trt /tmp/export/saved_model/updated_model.onnx -o /tmp/export/saved_model/model.trt

I got the following error:

[2020-10-27 16:19:27   ERROR] /opt/onnx-tensorrt/builtin_op_importers.cpp:122: no of inputs are 5
[2020-10-27 16:19:27   ERROR] /opt/onnx-tensorrt/builtin_op_importers.cpp:123: node outsize and op type are 1 type NonMaxSuppression
[2020-10-27 16:19:27   ERROR] /opt/onnx-tensorrt/builtin_op_importers.cpp:127: boxes dims 3 dim3 has size 4
#assertion/home/jenkins/workspace/OSS/L0_MergeRequest/oss/plugin/batchedNMSPlugin/batchedNMSPlugin.cpp,77
Aborted (core dumped)
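As an aside, the "no of inputs are 5" line is expected: besides boxes and scores, ONNX NonMaxSuppression carries max_output_boxes_per_class, iou_threshold and score_threshold as (usually constant) inputs, which would need to be read out and mapped onto plugin fields such as topK, iouThreshold and scoreThreshold rather than wired up as tensors. For sanity-checking whatever mapping you end up with, here is a minimal single-batch, single-class numpy reference of the op's semantics (my own sketch; corner-format [y1, x1, y2, x2] boxes assumed, and it returns plain box indices rather than the op's [num_selected, 3] index triples):

```python
import numpy as np

def iou(a, b):
    # Intersection-over-union of two corner-format boxes [y1, x1, y2, x2].
    y1, x1 = max(a[0], b[0]), max(a[1], b[1])
    y2, x2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, y2 - y1) * max(0.0, x2 - x1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

def nms_single_class(boxes, scores, max_out, iou_thr, score_thr):
    """Reference NonMaxSuppression for one batch element and one class.

    boxes: (num_boxes, 4), scores: (num_boxes,).
    Returns indices of kept boxes, highest score first.
    """
    # Sort by descending score and drop boxes at or below the score threshold.
    candidates = [int(i) for i in np.argsort(-scores) if scores[i] > score_thr]
    keep = []
    while candidates and len(keep) < max_out:
        best = candidates.pop(0)
        keep.append(best)
        # Suppress remaining boxes that overlap the kept box too much.
        candidates = [i for i in candidates if iou(boxes[best], boxes[i]) <= iou_thr]
    return keep

# Example: the second box overlaps the first heavily and is suppressed.
kept = nms_single_class(
    np.array([[0, 0, 1, 1], [0, 0, 1, 1.1], [2, 2, 3, 3]], dtype=np.float32),
    np.array([0.9, 0.8, 0.7], dtype=np.float32),
    max_out=10, iou_thr=0.5, score_thr=0.0)
# kept == [0, 2]
```

Running both this reference and the plugin on the same dummy inputs is a cheap way to check that the field mapping (thresholds, topK) actually matches the ONNX attributes.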

Does anyone have solid advice for getting past this issue?
Thanks


ttyio commented Feb 8, 2021

@zhouyingchaoAI , thanks for reporting.
Internal feature request created for NonMaxSuppression support.

@ttyio ttyio added ONNX Release: 7.x triaged Issue has been triaged by maintainers labels Feb 8, 2021
@Alex-EEE
Copy link

I'd like to know when this feature ships.


ttyio commented Apr 30, 2021

I will close this one. Until we have official NonMaxSuppression support, #795 is a good reference to check. Thanks all!

@ttyio ttyio closed this as completed Apr 30, 2021
@chienkan
Contributor

(quotes @rgc183's importer code and getOutputDimensions() crash report from earlier in this thread)

Hi, how did you work around this problem? I've run into the same one.

@tft-robert

Any update on this?
