Add batch predictor #2888
Conversation
Add Predictor for batch data
format by linter
B011: Do not call `assert False` since `python -O` removes these calls. Instead, callers should `raise AssertionError()`.
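For context on the linter warning above, here is a minimal illustration of why B011 flags `assert False`: under `python -O`, assert statements are compiled out entirely, so the "unreachable" guard silently becomes a no-op, while an explicit raise always fires. (The function names here are hypothetical, chosen just for this example.)

```python
def unreachable_bad():
    # Flagged by B011: this line is stripped when running `python -O`,
    # so the function silently returns None instead of failing.
    assert False, "should never get here"

def unreachable_good():
    # Preferred: an explicit raise survives the -O optimization flag.
    raise AssertionError("should never get here")
```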
@sailist Have you tried using this BatchPredictor with AsyncPredictor?
Fine, I'll try it, thanks.
Can you add a unit test to ensure it works properly?
As the DefaultPredictor documentation mentions, it is only for simple demo purposes, and we do not plan to extend it with new capabilities. Users are expected to use the …
It's not very convenient to do batch inference with the original detectron2 API; the solution provided in #282 (comment) is good, and I extended its code to support more input forms.
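The batching pattern being discussed can be sketched generically: instead of one forward pass per image (as DefaultPredictor does), group inputs into fixed-size chunks and run the model once per chunk. This is a hedged illustration of the idea only, not the PR's actual code; `batch_predict`, the `model` callable, and `batch_size` are hypothetical stand-ins for detectron2's model interface.

```python
from typing import Any, Callable, Iterable, List

def batch_predict(model: Callable[[List[Any]], List[Any]],
                  inputs: Iterable[Any],
                  batch_size: int = 8) -> List[Any]:
    """Run `model` over `inputs` in fixed-size chunks and collect results.

    `model` takes a list of inputs and returns a list of outputs of the
    same length, mimicking how a detection model consumes a batch.
    """
    outputs: List[Any] = []
    batch: List[Any] = []
    for item in inputs:
        batch.append(item)
        if len(batch) == batch_size:
            outputs.extend(model(batch))
            batch = []
    if batch:  # flush the final partial batch
        outputs.extend(model(batch))
    return outputs
```

For example, with a toy model that doubles each input, `batch_predict(lambda xs: [x * 2 for x in xs], range(5), batch_size=2)` returns `[0, 2, 4, 6, 8]`, processing the inputs in chunks of two plus one final partial chunk.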