Inference latency #5

Open
jeho-lee opened this issue May 8, 2024 · 1 comment

jeho-lee commented May 8, 2024

Thank you for the great project! Could you please provide some information about the inference latency?

shubham0204 (Owner) commented

@jeho-lee The latest update to the project adds a Text that shows the inference latency in milliseconds. This excludes the time taken to resize the image or the depth map and only reflects the time taken by the ONNX model.
Try it and let me know your thoughts :-)
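For reference, here is a minimal Kotlin sketch of how such a measurement can be scoped to the ONNX Runtime call alone. This is not the repository's actual code; the `runWithLatency` helper, the input name, and the shape parameters are placeholders for illustration.

```kotlin
import ai.onnxruntime.OnnxTensor
import ai.onnxruntime.OrtEnvironment
import ai.onnxruntime.OrtSession
import java.nio.FloatBuffer

// Hypothetical helper (not the project's actual code): times only session.run(),
// so resizing the input image / depth map stays outside the measured interval.
fun runWithLatency(
    env: OrtEnvironment,
    session: OrtSession,
    inputName: String,       // assumed name of the model's input tensor
    inputData: FloatArray,   // already-preprocessed pixel data
    shape: LongArray         // model input shape, e.g. {1, 3, H, W}
): Pair<OrtSession.Result, Long> {
    val tensor = OnnxTensor.createTensor(env, FloatBuffer.wrap(inputData), shape)
    val start = System.nanoTime()
    val result = session.run(mapOf(inputName to tensor))  // model execution only
    val inferenceMs = (System.nanoTime() - start) / 1_000_000
    tensor.close()
    return result to inferenceMs
}
```

The returned millisecond value can then be shown in the UI, while pre- and post-processing remain outside the timed block.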
