End2End ONNX Export Implementation for YOLOv9 #189
Closed
#130

YOLOv9 ONNX End2End export with the TensorRT Efficient NMS plugin.

- Support for End-to-End ONNX Export: Added support for end-to-end ONNX export in `export.py` and `models/experimental.py`.
- Model Compatibility: This functionality currently works with all `DetectionModel` models.
- Configuration Variables: Use the following flags to configure the export:
  - `--include onnx_end2end`: enables End2End export.
  - `--topk-all`: ONNX END2END/TF.js NMS: top-k for all classes to keep (default: 100).
  - `--iou-thres`: ONNX END2END/TF.js NMS: IoU threshold (default: 0.45).
  - `--conf-thres`: ONNX END2END/TF.js NMS: confidence threshold (default: 0.25).

Example:
#130 (comment) #188 (comment)
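Putting the flags above together, an invocation might look like the following sketch; the weights filename is an assumption for illustration, not taken from this PR:

```shell
# Export a checkpoint to end-to-end ONNX with NMS included in the graph.
# "yolov9-c.pt" is a hypothetical weights path; substitute your own.
python export.py \
  --weights yolov9-c.pt \
  --include onnx_end2end \
  --topk-all 100 \
  --iou-thres 0.45 \
  --conf-thres 0.25
```

The three NMS thresholds shown are the documented defaults, so they could be omitted; they are spelled out here to make the knobs visible.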