Triton Server Integration with DeepStream #47
Hi, I have no experience with Triton Server, sorry.
Hi @marcoslucianops, thanks for your reply. One more doubt: how do I run inference on two models using DeepStream? The face detection model I used is https://github.com/biubug6/Face-Detector-1MB-with-landmark. I followed https://github.com/marcoslucianops/DeepStream-Yolo/blob/master/multipleInferences.md, but I have two different models (one is YOLOv3 and the other is a face detection model). Thanks.
Looking forward to your reply.
Is your face detection model a Caffe model or a TensorRT-converted model?
Thanks for your reply; it is a TensorRT-converted model.
Can you send me your face detection model's config_infer.txt file?
Hi @marcoslucianops, below is the config_infer.txt file for my FD model.
And below is my deepstream_app_config.txt.
Add it to [secondary-gie0] in deepstream_app_config.txt and to [property] in centerface.txt.
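The exact keys the maintainer suggested are not preserved in this excerpt. For context, a minimal sketch of how a secondary GIE is wired up in a DeepStream app config is shown below; the gie-unique-id values and the centerface.txt path are assumptions based on this thread, not the poster's actual settings:

```ini
; deepstream_app_config.txt (sketch; IDs and paths are assumptions)
[secondary-gie0]
enable=1
gpu-id=0
; run after the primary detector (assumed gie-unique-id=1 for YOLOv3)
operate-on-gie-id=1
gie-unique-id=2
config-file=centerface.txt
```

The [property] group in centerface.txt would then carry the matching per-model settings (for example process-mode=2 to mark it as a secondary inference), alongside the model engine path and preprocessing parameters already present in the poster's config.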
Added it, and it worked perfectly!
I think it's a problem/bug in Triton.
Hi @marcoslucianops,
Thanks for your projects; they have honestly helped me a lot.
I have run a YOLOv3 model (trained on my custom dataset) on a Jetson Nano using DeepStream with 4 cameras. Next, I want to integrate Triton Server with DeepStream for the same model.
So, my doubts are:
1.) How do I do the integration, and what do I need to do extra?
2.) Can I serve the TensorRT models with the Triton Server integrated with DeepStream?
Thanks
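On the Triton question: DeepStream ships a Gst-nvinferserver plugin that talks to Triton and is configured with a protobuf-style file instead of the nvinfer key-value file. A minimal sketch is shown below, assuming a local Triton model repository at ./triton_model_repo with the model registered under the name yolov3 (all names, paths, and values here are illustrative assumptions, not a tested configuration):

```protobuf
# config_infer_triton_yolov3.txt (sketch; names and paths are assumptions)
infer_config {
  unique_id: 1
  gpu_ids: [0]
  max_batch_size: 1
  backend {
    triton {
      model_name: "yolov3"
      version: -1
      model_repo {
        root: "./triton_model_repo"
      }
    }
  }
  preprocess {
    network_format: IMAGE_FORMAT_RGB
    tensor_order: TENSOR_ORDER_LINEAR
    normalize { scale_factor: 0.0039215697 }
  }
}
```

As for serving TensorRT models: Triton supports TensorRT engines natively, so a converted engine can be placed in the model repository (conventionally as a versioned model.plan file with an accompanying config.pbtxt declaring the tensorrt_plan platform) and then referenced by model_name from the nvinferserver config, as sketched above.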