
[YoloV7] Blob Converter failed (Onnx --> MyriadX Blob) #779

Closed
ZucchiniAI opened this issue Aug 30, 2022 · 2 comments
Assignees
Labels
bug Something isn't working

Comments

@ZucchiniAI

Hello,

I have custom-trained a YoloV7 model (framework from https://github.com/WongKinYiu/yolov7) and exported it to ONNX format using the framework's export function. I then used the Luxonis online blob converter tool (http://blobconverter.luxonis.com/) to convert the model (best_yolov7_03.onnx) to OpenVINO 2022.1. My goal is to use the custom model with the OAK-D I bought recently.
My system: Ubuntu 20, Python 3.8.10, ONNX 1.12, PyTorch 1.12

From the blob converter I got the following error. Please advise. Thank you.
Marco

Conversion error
Error message
Command failed with exit code 1, command: /app/venvs/venv2022_1/bin/python /app/model_compiler/openvino_2022.1/converter.py --precisions FP16 --output_dir /tmp/blobconverter/67481eda365d4b51bf1153bfd52ded0d --download_dir /tmp/blobconverter/67481eda365d4b51bf1153bfd52ded0d --name best_yolov7_03 --model_root /tmp/blobconverter/67481eda365d4b51bf1153bfd52ded0d
Console output (stdout)
========== Converting best_yolov7_03 to IR (FP16)
Conversion command: /app/venvs/venv2022_1/bin/python -- /app/venvs/venv2022_1/bin/mo --framework=onnx --data_type=FP16 --output_dir=/tmp/blobconverter/67481eda365d4b51bf1153bfd52ded0d/best_yolov7_03/FP16 --model_name=best_yolov7_03 --input= --data_type=FP16 '--mean_values=[127.5,127.5,127.5]' '--scale_values=[255,255,255]' --input_model=/tmp/blobconverter/67481eda365d4b51bf1153bfd52ded0d/best_yolov7_03/FP16/best_yolov7_03.onnx

Model Optimizer arguments:
Common parameters:
- Path to the Input Model: /tmp/blobconverter/67481eda365d4b51bf1153bfd52ded0d/best_yolov7_03/FP16/best_yolov7_03.onnx
- Path for generated IR: /tmp/blobconverter/67481eda365d4b51bf1153bfd52ded0d/best_yolov7_03/FP16
- IR output name: best_yolov7_03
- Log level: ERROR
- Batch: Not specified, inherited from the model
- Input layers: Not specified, inherited from the model
- Output layers: Not specified, inherited from the model
- Input shapes: Not specified, inherited from the model
- Source layout: Not specified
- Target layout: Not specified
- Layout: Not specified
- Mean values: [127.5,127.5,127.5]
- Scale values: [255,255,255]
- Scale factor: Not specified
- Precision of IR: FP16
- Enable fusing: True
- User transformations: Not specified
- Reverse input channels: False
- Enable IR generation for fixed input shape: False
- Use the transformations config file: None
Advanced parameters:
- Force the usage of legacy Frontend of Model Optimizer for model conversion into IR: False
- Force the usage of new Frontend of Model Optimizer for model conversion into IR: False
OpenVINO runtime found in: /opt/intel/openvino2022_1/python/python3.8/openvino
OpenVINO runtime version: 2022.1.0-7019-cdb9bec7210-releases/2022/1
Model Optimizer version: 2022.1.0-7019-cdb9bec7210-releases/2022/1
FAILED:
best_yolov7_03
Error output (stderr)
[ ERROR ] -------------------------------------------------
[ ERROR ] ----------------- INTERNAL ERROR ----------------
[ ERROR ] Unexpected exception happened.
[ ERROR ] Please contact Model Optimizer developers and forward the following information:
[ ERROR ] Check 'unknown_operators.empty()' failed at frontends/onnx/frontend/src/core/graph.cpp:131:
OpenVINO does not support the following ONNX operations: TRT.EfficientNMS_TRT

[ ERROR ] Traceback (most recent call last):
File "/app/venvs/venv2022_1/lib/python3.8/site-packages/openvino/tools/mo/main.py", line 533, in main
ret_code = driver(argv)
File "/app/venvs/venv2022_1/lib/python3.8/site-packages/openvino/tools/mo/main.py", line 489, in driver
graph, ngraph_function = prepare_ir(argv)
File "/app/venvs/venv2022_1/lib/python3.8/site-packages/openvino/tools/mo/main.py", line 394, in prepare_ir
ngraph_function = moc_pipeline(argv, moc_front_end)
File "/app/venvs/venv2022_1/lib/python3.8/site-packages/openvino/tools/mo/moc_frontend/pipeline.py", line 147, in moc_pipeline
ngraph_function = moc_front_end.convert(input_model)
RuntimeError: Check 'unknown_operators.empty()' failed at frontends/onnx/frontend/src/core/graph.cpp:131:
OpenVINO does not support the following ONNX operations: TRT.EfficientNMS_TRT

[ ERROR ] ---------------- END OF BUG REPORT --------------

@ZucchiniAI ZucchiniAI added the bug Something isn't working label Aug 30, 2022
@tersekmatija
Contributor

Hey, we have added official support for YoloV7 to our Yolo exporter, currently available at https://tools.luxonis.com/. You can upload your trained .pt weights directly at the link above, choose YoloV7, and click Export. This will generate a blob and a JSON that you can use with main_api.py at https://tinyurl.com/oak-d-yolo.

Note that we recommend the YoloV7-tiny version to achieve near real-time performance; heavier models might not be able to run on the camera, but you are welcome to try. We don't have official support for the mask and pose variants of YoloV7, as they only provide weights for the heavier versions.

From the error it seems that NMS is not a supported operation. If you still wish to export the ONNX yourself, I would suggest you inspect the model graph on netron.app, find the name of the layer before the NMS operation, and pass the --output [node_name] flag to blobconverter under Model optimizer params. The node name should match a node that precedes the NMS. Note, however, that you will then need to implement NMS and post-processing in a Python script on the host yourself. This is why I'd recommend going with the first option.
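If you do go the host-side route, the post-processing step mentioned above can be sketched in plain NumPy. This is a minimal, class-agnostic greedy NMS for illustration only; the function name and the [x1, y1, x2, y2] box layout are assumptions, not part of the depthai API:

```python
import numpy as np

def nms(boxes, scores, iou_threshold=0.45):
    """Return indices of boxes kept after greedy non-maximum suppression.

    boxes:  (N, 4) array of [x1, y1, x2, y2] corners
    scores: (N,) array of confidence scores
    """
    order = scores.argsort()[::-1]  # process highest-scoring boxes first
    keep = []
    while order.size > 0:
        i = order[0]
        keep.append(int(i))
        if order.size == 1:
            break
        rest = order[1:]
        # intersection rectangle between box i and each remaining box
        xx1 = np.maximum(boxes[i, 0], boxes[rest, 0])
        yy1 = np.maximum(boxes[i, 1], boxes[rest, 1])
        xx2 = np.minimum(boxes[i, 2], boxes[rest, 2])
        yy2 = np.minimum(boxes[i, 3], boxes[rest, 3])
        inter = np.clip(xx2 - xx1, 0, None) * np.clip(yy2 - yy1, 0, None)
        area_i = (boxes[i, 2] - boxes[i, 0]) * (boxes[i, 3] - boxes[i, 1])
        area_r = (boxes[rest, 2] - boxes[rest, 0]) * (boxes[rest, 3] - boxes[rest, 1])
        iou = inter / (area_i + area_r - inter)
        # drop boxes overlapping box i beyond the threshold, keep the rest
        order = rest[iou <= iou_threshold]
    return keep
```

A real YoloV7 head also requires decoding the raw output tensor (grid offsets, anchors, sigmoid activations) before boxes reach this stage, which is extra work the exporter at tools.luxonis.com saves you.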

@tersekmatija tersekmatija changed the title Blob Converter failed (Onnx --> MyriadX Blob) [YoloV7] Blob Converter failed (Onnx --> MyriadX Blob) Sep 2, 2022
@ZucchiniAI
Author

Hello,
Thank you very much for your quick and detailed hints!
The tool above handled the conversion easily, and I could use main.py with my YoloV7-tiny model.

Note: the link https://tinyurl.com/oak-d-yolo was somehow not accessible to me, so I found main.py at https://github.com/luxonis/depthai-experiments/tree/master/gen2-yolo/device-decoding instead. I hope it is the same.

At the moment, however, inference on the OAK-D is very unstable, even though the mAP on the test images was ~0.75. I will check whether it is a matter of thresholds.
Thanks! Marco
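On the threshold point: decoded detections are typically filtered by a confidence threshold before NMS, so an unstable stream of boxes is often just low-score detections getting through. A toy sketch of the effect, with made-up detection tuples and values purely for illustration:

```python
# Hypothetical (class, score) detections; real values come from the OAK-D pipeline.
detections = [("person", 0.82), ("person", 0.31), ("car", 0.55), ("car", 0.12)]

def filter_by_confidence(dets, conf_threshold):
    """Keep only detections whose score meets or exceeds the threshold."""
    return [d for d in dets if d[1] >= conf_threshold]

# Raising the threshold from 0.3 to 0.5 drops the jittery low-score boxes.
print(filter_by_confidence(detections, 0.3))
print(filter_by_confidence(detections, 0.5))
```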
