[multimodal] Unifying dump/export model by introducing ExportMixin #2840
Conversation
from .onnx import get_onnx_input
...
class ExportMixin:
Will this Mixin also support the matcher?
I'm not quite sure whether it would support the matcher. It should behave exactly as before: there is no implementation change, just dump_model and export_onnx moved out of the MultiModalPredictor.
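For context, here is a minimal sketch of the mixin pattern being described. Only the class and method names (ExportMixin, dump_model, export_onnx) come from this PR; the bodies and signatures below are placeholders, not the real implementation.

```python
# Sketch only: method bodies and signatures are placeholders.
class ExportMixin:
    """Collects export-related methods so they can be shared via inheritance."""

    def dump_model(self, save_path: str) -> str:
        # Placeholder: the real method persists the fitted model artifacts.
        print(f"dumping model to {save_path}")
        return save_path

    def export_onnx(self, data=None):
        # Placeholder: the real method traces the model and emits an ONNX graph.
        print("exporting ONNX graph")


class MultiModalPredictor(ExportMixin):
    # Because the methods are inherited, predictor.dump_model(...) and
    # predictor.export_onnx(...) behave exactly as before the refactor.
    pass
```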
Thanks for the refactor!
Issue #, if available:
Description of changes:
- The dump_model and export_onnx related functions are moved to the ExportMixin class. This would help reduce the size of the predictor.py file.
- Renamed tests:
  - multimodal/tests/unittests/others/{test_deployment.py => test_deployment_onnx.py}
  - multimodal/tests/unittests/{others_2/test_predictor_model_dump.py => others/test_deployment_torch.py}
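From the caller's perspective nothing should change. A hedged usage sketch follows; the training call and argument names are assumptions for illustration, not verified signatures.

```python
import pandas as pd
from autogluon.multimodal import MultiModalPredictor

# Toy data just to make the sketch self-contained.
train_data = pd.DataFrame({"text": ["good movie", "bad movie"], "label": [1, 0]})

predictor = MultiModalPredictor(label="label")
predictor.fit(train_data, time_limit=60)

# Both methods are now inherited from ExportMixin, but the call sites
# are unchanged. Argument names below are assumptions.
predictor.dump_model(save_path="./model_dump")
predictor.export_onnx(data=train_data)
```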
By submitting this pull request, I confirm that you can use, modify, copy, and redistribute this contribution, under the terms of your choice.