Batch Inference #84
@filipetrocadoferreira I've implemented this for Faster RCNN and so far it seems to work ok with |
Which files did you change? |
All changes are to be made in |
Nice, feel free to post the commit 👍 |
Sure, there you go: |
Thanks a lot :) |
Yes, it will for sure. See wherever it says |
Keypoint batch inference is not straightforward. I'm facing some problems here:
worst. Now I'm aware that single-image inference involves multiple data exchanges between GPU and CPU, which will probably be the main performance bottleneck |
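One way to amortize those GPU<->CPU round trips is to group images into chunks and make one forward call (and one host<->device copy) per chunk instead of per image. A minimal sketch of that loop, with a hypothetical `model` callable standing in for the actual network (not code from this repository):

```python
def run_batched(model, images, batch_size=8):
    """Run inference in chunks so each forward pass (and, in a real GPU
    setting, each host<->device copy) covers batch_size images at once."""
    results = []
    for i in range(0, len(images), batch_size):
        chunk = images[i:i + batch_size]
        # One call per chunk, not per image: results come back in order.
        results.extend(model(chunk))
    return results

# Toy stand-in "model": returns the length of each input entry.
outputs = run_batched(lambda chunk: [len(x) for x in chunk],
                      [[1, 2], [3], [4, 5, 6]], batch_size=2)
print(outputs)  # [2, 1, 3]
```

The same structure applies whether the per-chunk call wraps a PyTorch forward pass or any other backend; only the copy-to-device and copy-back steps inside `model` change.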
How can I easily take advantage of batch processing during inference?
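Since detection inputs usually have different sizes, the usual first step for batching is to zero-pad every image up to the largest height and width in the group and stack them into one tensor (detection frameworks in this family do something similar internally). A NumPy sketch of that padding step, as an illustration rather than the repository's actual implementation:

```python
import numpy as np

def pad_to_batch(images):
    """Stack variable-size HxWxC images into one (N, H_max, W_max, C) array,
    zero-padding each image to the largest height/width in the list."""
    max_h = max(im.shape[0] for im in images)
    max_w = max(im.shape[1] for im in images)
    batch = np.zeros((len(images), max_h, max_w, images[0].shape[2]),
                     dtype=images[0].dtype)
    for i, im in enumerate(images):
        # Each image sits in the top-left corner; the rest stays zero.
        batch[i, :im.shape[0], :im.shape[1]] = im
    return batch

images = [np.ones((3, 4, 3)), np.ones((5, 2, 3))]
batch = pad_to_batch(images)
print(batch.shape)  # (2, 5, 4, 3)
```

The original per-image sizes have to be kept alongside the batch so the padded regions can be masked out (or the boxes clipped) when post-processing the outputs.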