YoloV5 Onnx Model Output Post Processing For Xamarin Forms #7144
@willsonyee 'output' is all you need. The other 3 are the outputs at each grid P3-P5, which are already present in 'output'. The dimensions of 'output' are shape(batch, anchors, outputs), where outputs are [xywh, objectness, class confidences]
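As a concrete illustration of that layout, the flat output buffer can be reshaped and sliced as below. This is a sketch in Python/NumPy (the shapes are taken from the comment above; the random array is just a stand-in for the real buffer); the same index arithmetic applies to the `float[]` you get from OnnxRuntime in C#:

```python
import numpy as np

# Assumed layout from the comment above: (batch=1, anchors=25200, outputs=7),
# where the 7 values per anchor are [x, y, w, h, objectness, class1_conf, class2_conf].
flat = np.random.rand(1 * 25200 * 7).astype(np.float32)  # stand-in for the ONNX output buffer
pred = flat.reshape(1, 25200, 7)

xywh = pred[0, :, 0:4]        # box centers and sizes, in pixels of the model input
objectness = pred[0, :, 4]    # probability that the anchor contains any object
class_conf = pred[0, :, 5:7]  # per-class confidences (2 classes in this model)
```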
@glenn-jocher I see, thanks for the clarification on the "output" of the onnx model. Right now the "var output" in my second picture has the "output" of the onnx model flattened (1 x 25200 x 7 = 176400). Pardon my dullness, could you guide me on how to post-process this? I searched around the GitHub issues; do I need to do something called NMS in C#? Thanks a lot
@willsonyee see https://pytorch.org/vision/main/generated/torchvision.ops.nms.html. Your 7-length vector is [xywh, objectness, class 1, class 2]
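For readers who don't want to pull in torchvision, here is a minimal greedy NMS sketch in NumPy showing the same algorithm the link above implements. It assumes boxes have already been converted to corner format [x1, y1, x2, y2]; the loop translates directly to C#:

```python
import numpy as np

def nms(boxes, scores, iou_threshold=0.45):
    """Greedy non-maximum suppression; boxes is (N, 4) in x1,y1,x2,y2 format."""
    order = scores.argsort()[::-1]  # indices sorted by descending score
    keep = []
    while order.size > 0:
        i = order[0]
        keep.append(int(i))
        # Intersection of the best box with all remaining boxes
        xx1 = np.maximum(boxes[i, 0], boxes[order[1:], 0])
        yy1 = np.maximum(boxes[i, 1], boxes[order[1:], 1])
        xx2 = np.minimum(boxes[i, 2], boxes[order[1:], 2])
        yy2 = np.minimum(boxes[i, 3], boxes[order[1:], 3])
        inter = np.maximum(0.0, xx2 - xx1) * np.maximum(0.0, yy2 - yy1)
        area_i = (boxes[i, 2] - boxes[i, 0]) * (boxes[i, 3] - boxes[i, 1])
        area_r = (boxes[order[1:], 2] - boxes[order[1:], 0]) * \
                 (boxes[order[1:], 3] - boxes[order[1:], 1])
        iou = inter / (area_i + area_r - inter)
        # Drop boxes that overlap the kept box too much
        order = order[1:][iou <= iou_threshold]
    return keep
```

Run NMS per class (or offset boxes by class index) so boxes of different classes do not suppress each other.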
@glenn-jocher I am currently using Microsoft.ML.OnnxRuntime to load the YoloV5 model and make predictions, but I couldn't post-process the output from the model into xywh, objectness, class1, class2. Do you have any code examples for ONNX in C#? Thanks
@willsonyee see YOLOv5 Export Tutorial for C++ references
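To tie the replies together, here is a hedged sketch of the remaining post-processing steps before NMS: compute a final score per anchor, filter by a confidence threshold, and convert center-format xywh to corner-format xyxy. The function name and the 0.25 threshold are illustrative (0.25 is a common YOLOv5 default, not a value confirmed in this thread); the code is Python/NumPy, but each step maps one-to-one onto a loop over the `float[]` from OnnxRuntime in C#:

```python
import numpy as np

def postprocess(pred, conf_thres=0.25):
    """pred: (anchors, 5 + num_classes) rows of [x, y, w, h, obj, cls...].

    Returns corner-format boxes, scores, and class indices, ready for NMS.
    """
    obj = pred[:, 4]
    cls_conf = pred[:, 5:]
    # Final score per anchor = objectness * best class confidence
    scores = obj * cls_conf.max(axis=1)
    classes = cls_conf.argmax(axis=1)
    keep = scores >= conf_thres
    pred, scores, classes = pred[keep], scores[keep], classes[keep]
    # Convert center-format xywh to corner-format xyxy
    xyxy = np.empty((pred.shape[0], 4), dtype=np.float32)
    xyxy[:, 0] = pred[:, 0] - pred[:, 2] / 2  # x1
    xyxy[:, 1] = pred[:, 1] - pred[:, 3] / 2  # y1
    xyxy[:, 2] = pred[:, 0] + pred[:, 2] / 2  # x2
    xyxy[:, 3] = pred[:, 1] + pred[:, 3] / 2  # y2
    return xyxy, scores, classes
```

Feed the returned boxes and scores into per-class NMS, then scale the surviving boxes from model-input coordinates back to the original image size.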
👋 Hello, this issue has been automatically marked as stale because it has not had recent activity. Please note it will be closed if no further activity occurs.
Feel free to inform us of any other issues you discover or feature requests that come to mind in the future. Pull Requests (PRs) are also always welcomed! Thank you for your contributions to YOLOv5 🚀 and Vision AI ⭐!
Search before asking
Question
I am currently trying to integrate a YoloV5 ONNX model into Xamarin.Forms (C#) using OnnxRuntime, but I am encountering some roadblocks due to lack of experience. Can someone advise me on how to post-process the output after inference?
Netron view of my YoloV5 ONNX model:
There are a total of 4 outputs for my model: output, 345, 403, 461. May I know the meaning of these 4 outputs?
I couldn't move forward from here after getting the output tensor; any advice will be much appreciated.
Please let me know if you require more information to further clarify the question.
Thank you very much.
Additional
No response