How do I insert a YOLOv5 inference module into DeepStream from the command line #21
Comments
Replace the path of the primary inference with the path to your config_infer_primary.txt, and execute your program with this command (with the path to your libmyplugins.so file):
where ... = your program command
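As a hedged sketch, the invocation typically looks like the following; `deepstream-app`, the config filename, and the library path are example assumptions, so substitute your own program and paths:

```shell
# Preload the tensorrtx YOLOv5 plugin library so TensorRT can resolve the
# custom plugin ops at engine deserialization time, then launch the
# DeepStream pipeline with your application config (paths are examples).
LD_PRELOAD=/path/to/libmyplugins.so deepstream-app -c deepstream_app_config.txt
```

If `LD_PRELOAD` is omitted, the engine built by tensorrtx usually fails to deserialize because its custom plugin layers cannot be found.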
That doesn't seem to work. Is there any other way? Is it possible to reference libmyplugins.so in config_infer_primary.txt, the same way 'custom-lib-path=nvdsinfer_custom_impl_Yolo/libnvdsinfer_custom_impl_yolo.so' is set there?
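For context, this is roughly how a custom bounding-box parser library is wired into the nvinfer config; a minimal sketch, where the `parse-bbox-func-name` value is hypothetical and must match the symbol actually exported by your library:

```ini
[property]
# DeepStream custom parser library (path as quoted in this thread)
custom-lib-path=nvdsinfer_custom_impl_Yolo/libnvdsinfer_custom_impl_yolo.so
# Exported parsing function name (hypothetical; check your library's source)
parse-bbox-func-name=NvDsInferParseYolo
```

Note this mechanism only loads the output parser; it does not register TensorRT plugins, which is why libmyplugins.so needs a separate loading path.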
I only know this way for YOLOv5. Did you test it like this?
Not for now. libnvdsinfer_custom_impl_yolo.so is a DeepStream lib and libmyplugins.so is a tensorrtx lib; both libs are needed to make YOLOv5 work.
I will test.
Hi, what's the status? Did the test work?
Yes, I updated the repo; use my new files. Now you can run with the new command.
Thanks, it works. It's easier than before now.
Hi, how do I insert a YOLOv5 inference engine into DeepStream as below?
Many thanks for the help.