[AutoMM] Refactor presets #2749
Conversation
Job PR-2749-fe3e29c is done.
LGTM with a comment.
```python
hyperparameters.update(
    {
        "model.hf_text.checkpoint_name": "google/electra-small-discriminator",
        "model.timm_image.checkpoint_name": "swin_small_patch4_window7_224",
```
For the quick-build setting of image classification, we previously recommended mobilenetv3_large_100; we might want to stay consistent.
Changed to mobilenetv3_large_100.
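Based on the diff above, the updated quick-build setting would then swap the timm backbone for mobilenetv3_large_100 (a sketch; the surrounding `hyperparameters` dict is assumed here for illustration):

```python
# Sketch: quick-build preset hyperparameters after the suggested change.
# The base dict is assumed; keys follow the diff shown above.
hyperparameters = {}
hyperparameters.update(
    {
        "model.hf_text.checkpoint_name": "google/electra-small-discriminator",
        # Swapped from swin_small_patch4_window7_224 per the review comment:
        "model.timm_image.checkpoint_name": "mobilenetv3_large_100",
    }
)
```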
```diff
 @automm_presets.register()
-def zero_shot_image_classification():
-    return {
+def zero_shot_image_classification(presets: str = DEFAULT, hpo: bool = False):
```
Will this be too complicated? Can we just register presets as a few strings? The current design requires providing a presets flag in automm_presets.register().
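To make the registration pattern under discussion concrete, here is a minimal sketch of a decorator-based preset registry. The `Registry` internals and the preset values are illustrative assumptions, not AutoGluon's actual implementation:

```python
# Minimal sketch of a decorator-based preset registry.
# Class internals and preset values are illustrative, not AutoGluon's code.
class Registry:
    def __init__(self, name):
        self.name = name
        self._funcs = {}

    def register(self):
        # Decorator factory: registers the preset function under its name.
        def wrapper(func):
            self._funcs[func.__name__] = func
            return func
        return wrapper

    def get(self, name):
        return self._funcs[name]


automm_presets = Registry("automm_presets")


@automm_presets.register()
def image_classification(presets: str = "default"):
    # One function per problem type; the presets flag selects the variant.
    hyperparameters = {"model.names": ["timm_image"]}
    if presets == "best_quality":
        hyperparameters["model.timm_image.checkpoint_name"] = "swin_large_patch4_window7_224"
    return hyperparameters
```

This avoids one registry entry per (problem type, quality level) string, since shared hyperparameters live in one function body.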
Also, consider adding docstrings to explain the design. For example, I can understand that the current presets are registered per problem_type. For instance, image_classification can have its own presets, like DEFAULT, BEST_QUALITY, etc.
The string design is not easy to maintain or scale to more problem types and HPO presets. There is much redundancy in the strings (one problem type name may repeat three times) and in the hyperparameters (different presets of the same problem type may share common ones).
Docstrings added.
```diff
 from .registry import Registry

 automm_presets = Registry("automm_presets")
 matcher_presets = Registry("matcher_presets")


 @automm_presets.register()
-def high_quality_fast_inference():
-    return {
+def default(presets: str = DEFAULT, hpo: bool = False):
```
Why do we need the hpo flag?
This is to support the HPO presets, i.e., returning not only hyperparameters but also hyperparameter_tune_kwargs.
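A sketch of what such an hpo-aware preset function might look like, returning tuning kwargs only when hpo is set. The search-space values and hyperparameter_tune_kwargs keys here are illustrative assumptions, not AutoGluon's exact API:

```python
# Illustrative sketch: a preset function that also returns
# hyperparameter_tune_kwargs when hpo=True. Keys and values are
# assumptions for illustration, not AutoGluon's exact API.
def default(presets: str = "default", hpo: bool = False):
    hyperparameters = {
        "model.hf_text.checkpoint_name": "google/electra-base-discriminator",
    }
    hyperparameter_tune_kwargs = None
    if hpo:
        # HPO presets replace point values with search spaces and
        # describe how to search over them.
        hyperparameters["optimization.learning_rate"] = (1e-5, 1e-2)
        hyperparameter_tune_kwargs = {
            "searcher": "bayes",
            "scheduler": "ASHA",
            "num_trials": 512,
        }
    return hyperparameters, hyperparameter_tune_kwargs
```

With hpo=False the second return value is simply None, so non-HPO callers can ignore it.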
Job PR-2749-876c5c2 is done.
LGTM, will update detection's presets by the 0.7 release.
LGTM! Thanks for the refactor.
LGTM, thanks for the refactor!
Job PR-2749-c9cd115 is done.
Job PR-2749-8940394 is done.
LGTM!
Issue #, if available:
Description of changes:
- best_quality, high_quality, and medium_quality as presets.

By submitting this pull request, I confirm that you can use, modify, copy, and redistribute this contribution, under the terms of your choice.