MNN time consumption statistics #1
Hello @xjcvip007, when the input size is 320x240 and num_thread=4, the inference time on PC is about 33.4731 ms. If you want to know the cost on mobile, you can test it yourself.
Thanks for your reply. I tested it on an iPhone 7 Plus with the default size (640*480) and the default config; the inference time is almost 100 ms, which seems a little slow.
@xjcvip007, hello, which scale_num are you using, 5 or 8? Actually, you can set the input size to 320*240 on your phone. The inference time of LFFD with an input shape of 320x240 is about 20 ms on the Qualcomm Snapdragon 632 CPU when I test it with ncnn, and MNN may be faster. Also, are you sure you have set config.type = MNN_FORWARD_METAL? And what is num_thread set to?
I use your default config with MNN_FORWARD_CPU and thread=2~
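For context, here is a minimal sketch of how the backend type and thread count discussed above are set with MNN's C++ `ScheduleConfig`. The model filename `lffd.mnn` is a placeholder, and the specific values shown (CPU backend, two threads) simply mirror the setting mentioned in this comment, not a recommendation.

```cpp
#include <memory>
#include <MNN/Interpreter.hpp>

int main() {
    // Load the converted MNN model (placeholder filename).
    std::shared_ptr<MNN::Interpreter> net(
        MNN::Interpreter::createFromFile("lffd.mnn"));

    // Backend and thread count under discussion in this thread:
    // MNN_FORWARD_CPU with numThread = 2, or MNN_FORWARD_METAL on iOS.
    MNN::ScheduleConfig config;
    config.type      = MNN_FORWARD_CPU;
    config.numThread = 2;

    auto session = net->createSession(config);

    // Input preprocessing omitted; run the session once.
    net->runSession(session);
    return 0;
}
```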
@SyGoing |
@dexception, hello, did you test it on an iPhone 7, and is scale_num 5? I suggest you set the resized input size to 320x240; that way the inference should be faster and the precision is still high. I tested it on Android with that setting and got about 20 ms.
@SyGoing |
Thank you for your sharing~ Can you give time consumption statistics on PC or mobile?