
How to check per image inference time for any custom object detection model? and also check model complexity(flops, params)? #10494

Open
minhaz109074 opened this issue Feb 13, 2022 · 5 comments


minhaz109074 commented Feb 13, 2022

Hello people,
I trained SSD-MobileNetV1 on my custom dataset. Now I want to check the time it takes to detect objects in a particular image on GPU and CPU. I also want to check the model's params and FLOPs.
Please help me with this task.
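A minimal way to measure per-image latency is a warm-up phase followed by a timed loop around whatever callable runs the model. A sketch (pure Python; `fake_detect` below is a stand-in for your SavedModel's serving signature, not your actual model):

```python
import time

def measure_inference_time(detect_fn, image, warmup=5, runs=50):
    """Average per-image inference time in seconds.

    detect_fn is whatever callable runs your model on one image,
    e.g. the loaded SavedModel's serving signature.
    """
    # Warm-up: the first calls include graph tracing / kernel loading,
    # so exclude them from the measurement.
    for _ in range(warmup):
        detect_fn(image)

    start = time.perf_counter()
    for _ in range(runs):
        detect_fn(image)
    return (time.perf_counter() - start) / runs

# Stand-in "model" so the sketch is self-contained:
fake_detect = lambda img: [x * 2 for x in img]
avg = measure_inference_time(fake_detect, [1, 2, 3])
print(f"avg inference time: {avg * 1000:.3f} ms")
```

When timing on a GPU with TensorFlow, also pull a result back to the host inside the loop (e.g. call `.numpy()` on one output tensor), since execution can be asynchronous and the loop may otherwise finish before the GPU work does.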

@minhaz109074 minhaz109074 changed the title How to check per image inference time for any custom object detection model? How to check per image inference time for any custom object detection model? and also check model complexity(flops, params)? Feb 13, 2022
@Kannan665

You can use the following Python code to check inferencing on your system's CPU or GPU. I referred to this GitHub page and made a few changes to the code to run inferencing with my saved_model on images and videos: https://github.com/abdelrahman-gaber/tf2-object-detection-api-tutorial. While inferencing on videos, the frame rate is displayed, which is one of the metrics to watch if you want your model to do live inferencing. While you go through the link above, I will try to get the arguments to be passed for inferencing with the saved_model on your CPU/GPU.
detect_objects.zip

@kumariko kumariko self-assigned this Feb 14, 2022

minhaz109074 commented Feb 14, 2022

@Kannan665 Thanks for your reply. I just completed inferencing. I faced some issues while running the script you provided in Colab, but I solved them. Now could you please tell me how to find the FLOPs/params of the saved model?

@kumariko

@minhaz109074 Could you please refer to this link to learn about the FLOPs of the models, and let us know if it helps. Thanks!
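For intuition on what such a FLOPs counter reports: for a standard convolution, the parameter and FLOP counts follow directly from the layer shape. A back-of-the-envelope sketch (the layer shape below is a made-up example, not taken from SSD-MobileNetV1):

```python
def conv2d_cost(k, c_in, c_out, h_out, w_out):
    """Params and FLOPs for one standard (non-depthwise) conv layer."""
    params = k * k * c_in * c_out + c_out  # weights + biases
    # Each output element needs k*k*c_in multiply-adds; counting a
    # multiply-add as 2 FLOPs:
    flops = 2 * k * k * c_in * c_out * h_out * w_out
    return params, flops

# Example: 3x3 conv, 32 -> 64 channels, 112x112 output feature map
params, flops = conv2d_cost(3, 32, 64, 112, 112)
print(params, flops)  # 18496 462422016 (~0.46 GFLOPs)
```

Note that some tools count multiply-accumulates (MACs) rather than FLOPs, which halves the number, so check which convention a given profiler uses before comparing models.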

@kumariko kumariko added the stat:awaiting response Waiting on input from the contributor label Feb 15, 2022
@minhaz109074

@kumariko The link you provided contains code for a Keras Sequential model, but I have a saved model in .pb format. Is there a code snippet that takes a .pb-format model as input and returns its FLOPs/params?

@kumariko kumariko removed the stat:awaiting response Waiting on input from the contributor label Feb 16, 2022
@kumariko kumariko assigned tombstone, jch1 and pkulzc and unassigned kumariko Feb 16, 2022
@Annieliaquat

> @Kannan665 Thanks for your reply. I just completed inferencing. I faced some issues while running the script you provided in Colab, but I solved them. Now could you please tell me how to find the FLOPs/params of the saved model?

Can you please let me know how you found the inference time?
