
detect_ros.py inference time slower than detect.py #11

Closed
terrytao19 opened this issue Jan 26, 2023 · 4 comments

Comments

@terrytao19

Hi, I'm running a custom YOLOv7-tiny model. Inference time with detect_ros.py is around 500 ms, while with detect.py it is around 150 ms. I added this to track the times:
[screenshot: inference timing output]
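The timing code itself is only in the screenshot, but the usual pattern is to bracket the forward pass with a wall-clock timer. A minimal sketch (`timed_inference`, `model`, and `img` are placeholders for illustration, not names from this repo):

```python
import time

def timed_inference(model, img):
    # Measure one forward pass with a high-resolution wall clock.
    # Note: on GPU this only measures accurately if the call blocks
    # until the result is ready.
    t0 = time.perf_counter()
    pred = model(img)
    elapsed_ms = (time.perf_counter() - t0) * 1000.0
    return pred, elapsed_ms

# Usage with a stand-in "model" (any callable works for the sketch):
pred, ms = timed_inference(lambda x: x, "frame")
print(f"inference: {ms:.1f} ms")
```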

lukazso (Owner) commented Feb 1, 2023

Hi @terrytao19, can you post the image size before feeding the image into the network in both detect.py and detect_ros.py?

@terrytao19 (Author)

Hi @lukazso, I am using the default 640 image size. I stopped running in a VM, so my speeds are faster now, but there is still a difference:
[screenshot: side-by-side inference timings]
On the left is stock YOLO; on the right it is running under ROS.

lukazso commented Feb 3, 2023

Please check the image size in the code, e.g. print it in both scripts right before model inference.
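A sketch of such a check (hypothetical helper; in the actual scripts one would print the tensor's `.shape` right before the `model(img)` call):

```python
def log_input_size(script_name, shape):
    # shape would be e.g. (1, 3, 640, 640) for a batched CHW tensor.
    # If the two scripts report different sizes, the slower one is
    # simply pushing more pixels through the network.
    msg = f"[{script_name}] input size: {shape}"
    print(msg)
    return msg

log_input_size("detect.py", (1, 3, 640, 640))
log_input_size("detect_ros.py", (1, 3, 640, 640))
```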

lukazso closed this as not planned (won't fix / can't repro / duplicate / stale) on Mar 20, 2023
@psykaunot

Hello @lukazso, I have the same problem and cannot fix it. My input images are all 640 × 480, and after rescaling I obtain 640 × 640.
This is the output of the detect_ros.py script:

[screenshot: detect_ros.py timing output]
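Worth noting: padding a 640 × 480 frame out to a full 640 × 640 square (rather than a stride-aligned 640 × 480 rectangle) puts roughly a third more pixels through the network per frame, which could account for part of the gap. A sketch of the square-letterbox geometry (hypothetical helper, mirroring the usual YOLO letterbox idea):

```python
def letterbox_to_square(h, w, size=640):
    # Scale the longer side to `size`, then pad the shorter side
    # symmetrically so the output is exactly size x size.
    r = size / max(h, w)
    new_h, new_w = round(h * r), round(w * r)
    pad_top = (size - new_h) // 2
    pad_left = (size - new_w) // 2
    return (size, size), (pad_top, pad_left)

# A 640x480 input is scaled 1:1, then padded with 80 px top/bottom:
print(letterbox_to_square(480, 640))  # ((640, 640), (80, 0))
```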
