
Converted model.engine does not work properly on DeepStream #120

Open
xarauzo opened this issue Jul 14, 2022 · 0 comments
Labels
bug Something isn't working
xarauzo commented Jul 14, 2022

I have converted an MMDet model using this tool and obtained an output 'model.engine'. However, when running inference in DeepStream (with the amirstan plugin), the results are not as expected. TensorRT reports no errors during inference. With a 0.5 confidence threshold, no detections are shown. When I lower the threshold to 0.1 (just to see what happens), I get a lot of bounding boxes, but none of them are correct.

I am using DeepStream 5.0 on a Jetson Xavier NX running JetPack 4.4 (I can't change either the DeepStream or the JetPack version).
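One way to localize a problem like this is to compare detections from the original PyTorch MMDet model against detections decoded from the converted engine on the same image: if no engine box overlaps any reference box, the conversion or the output decoding is broken, not the threshold. The helper below is a hypothetical sketch (it is not part of mmdetection-to-tensorrt or the amirstan plugin) using plain `[x1, y1, x2, y2]` boxes:

```python
# Hypothetical sanity-check helper (not part of mmdetection-to-tensorrt):
# compare boxes from the original MMDet model against boxes decoded from
# the TensorRT engine. Near-zero best IoU for every reference box points
# at a broken conversion or broken output decoding, not at the threshold.

def iou(a, b):
    """Intersection-over-union of two [x1, y1, x2, y2] boxes."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

def best_ious(reference_boxes, engine_boxes):
    """For each reference box, the highest IoU against any engine box."""
    return [max((iou(r, e) for e in engine_boxes), default=0.0)
            for r in reference_boxes]

if __name__ == "__main__":
    ref = [[10, 10, 50, 50]]                         # e.g. from the PyTorch model
    eng = [[12, 11, 52, 49], [200, 200, 240, 240]]   # e.g. decoded engine output
    print(best_ious(ref, eng))
```

If the best IoUs are high, the engine itself is fine and the mismatch is likely in the DeepStream-side post-processing (output layer names, box decoding, or coordinate scaling in the nvinfer config).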

@xarauzo xarauzo added the bug Something isn't working label Jul 14, 2022