
[AutoMM] Refactor presets #2749

Merged: 7 commits merged into autogluon:master from mm-presets on Jan 28, 2023
Conversation


@zhiqiangdon (Contributor) commented on Jan 24, 2023

Issue #, if available:

1. Many redundant hyperparameters exist in the presets, which increases the maintenance burden.
2. It is difficult to add HPO presets on top of the current preset design.

Description of changes:

1. Reorganize the presets by problem type, so shared hyperparameters no longer need to be repeated and the differences among presets of the same problem type are clearer (see the sketch below).
2. Use best_quality, high_quality, and medium_quality as the preset names.

By submitting this pull request, I confirm that you can use, modify, copy, and redistribute this contribution, under the terms of your choice.
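
A minimal sketch of the reorganized design, assuming a simplified stand-in for the PR's `Registry` class; the two checkpoint names come from this PR's diff, but the keys, values, and branching are illustrative:

```python
# Minimal sketch of per-problem-type presets. `Registry` here is a
# simplified stand-in for the PR's .registry.Registry, and the
# hyperparameter values are illustrative.
class Registry:
    def __init__(self, name: str):
        self.name = name
        self._presets = {}

    def register(self):
        def _wrap(fn):
            self._presets[fn.__name__] = fn
            return fn
        return _wrap

    def get(self, name: str):
        return self._presets[name]


automm_presets = Registry("automm_presets")


@automm_presets.register()
def default(presets: str = "medium_quality", hpo: bool = False):
    # Hyperparameters shared by all quality levels are stated once...
    hyperparameters = {
        "model.hf_text.checkpoint_name": "google/electra-small-discriminator",
    }
    # ...and each quality preset only overrides what actually differs.
    if presets == "medium_quality":
        hyperparameters["model.timm_image.checkpoint_name"] = "mobilenetv3_large_100"
    else:  # high_quality / best_quality would select heavier backbones
        hyperparameters["model.timm_image.checkpoint_name"] = "swin_small_patch4_window7_224"
    return hyperparameters
```

A caller can then resolve a preset with something like `automm_presets.get("default")(presets="medium_quality")`.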

@github-actions

Job PR-2749-fe3e29c is done.
Docs are uploaded to http://autogluon-staging.s3-website-us-west-2.amazonaws.com/PR-2749/fe3e29c/index.html

@bryanyzhu (Contributor) left a comment:

LGTM with a comment.

```python
hyperparameters.update(
    {
        "model.hf_text.checkpoint_name": "google/electra-small-discriminator",
        "model.timm_image.checkpoint_name": "swin_small_patch4_window7_224",
    }
)
```
Contributor (inline comment):

For the quick-build setting of image classification, we previously recommended mobilenetv3_large_100; we might want to stay consistent.

Contributor Author:

Changed to mobilenetv3_large_100.
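
The updated snippet then plausibly reads as follows (not the exact diff; only the two checkpoint names are taken from this thread):

```python
hyperparameters.update(
    {
        "model.hf_text.checkpoint_name": "google/electra-small-discriminator",
        # lighter image backbone for the quick-build setting, per the review
        "model.timm_image.checkpoint_name": "mobilenetv3_large_100",
    }
)
```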



```diff
 @automm_presets.register()
-def zero_shot_image_classification():
-    return {
+def zero_shot_image_classification(presets: str = DEFAULT, hpo: bool = False):
```
Collaborator:

Will this be too complicated? Can we just register presets as a few strings?

The current design is that you need to provide a presets flag in automm_presets.register().

Collaborator:

Also, consider adding docstrings to explain the design, e.g., that the current presets are registered per problem_type.

For example, image_classification can have its own presets, like DEFAULT, BEST_QUALITY, etc.

Contributor Author:

The string design is not easy to maintain or scale to more problem types and HPO presets. There is a lot of redundancy in the strings (one problem type name may repeat three times) and in the hyperparameters (different presets of the same problem type may share common ones).
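
To illustrate the redundancy (the function names below are hypothetical, echoing the old per-preset style such as high_quality_fast_inference further down; automm_presets is reused from the sketch above):

```python
# Hypothetical illustration of the old string-keyed scheme: the problem
# type name repeats in every preset function, and hyperparameters shared
# by the quality levels must be duplicated in each returned dict.
@automm_presets.register()
def image_classification_medium_quality():
    return {
        "model.timm_image.checkpoint_name": "mobilenetv3_large_100",
        # ...plus every shared hyperparameter, duplicated here...
    }


@automm_presets.register()
def image_classification_high_quality():
    return {
        "model.timm_image.checkpoint_name": "swin_small_patch4_window7_224",
        # ...and duplicated again here.
    }
```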

Contributor Author:

Docstrings added.
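
A hedged sketch of what such a docstring might cover (the wording is illustrative, grounded only in this thread's description of the design, not the PR's actual text):

```python
@automm_presets.register()
def default(presets: str = "medium_quality", hpo: bool = False):
    """
    Presets are registered per problem type; this function covers the
    default problem type. (Illustrative docstring, not the PR's exact text.)

    Parameters
    ----------
    presets
        Preset name, e.g. "best_quality", "high_quality", or "medium_quality".
    hpo
        Whether to return HPO presets, i.e., hyperparameter_tune_kwargs
        in addition to hyperparameters.
    """
```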

```diff
 from .registry import Registry

 automm_presets = Registry("automm_presets")
 matcher_presets = Registry("matcher_presets")


 @automm_presets.register()
-def high_quality_fast_inference():
-    return {
+def default(presets: str = DEFAULT, hpo: bool = False):
```
Collaborator:

Why do we need the hpo flag?

Contributor Author:

This is to support the HPO presets, i.e., returning not only hyperparameters but also hyperparameter_tune_kwargs.
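
A minimal sketch of what the hpo flag enables, reusing the stand-in registry from above; the search range and tuning kwargs are illustrative, not the PR's actual values:

```python
@automm_presets.register()
def default_with_hpo(presets: str = "medium_quality", hpo: bool = False):
    hyperparameters = {
        "model.timm_image.checkpoint_name": "mobilenetv3_large_100",
    }
    hyperparameter_tune_kwargs = {}
    if hpo:
        # For HPO presets, fixed values become search spaces, and the caller
        # also receives kwargs describing how to run the search (illustrative).
        hyperparameters["optimization.learning_rate"] = (1e-5, 1e-2)  # search range
        hyperparameter_tune_kwargs = {"searcher": "bayes", "num_trials": 16}
    return hyperparameters, hyperparameter_tune_kwargs
```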

@github-actions

Job PR-2749-876c5c2 is done.
Docs are uploaded to http://autogluon-staging.s3-website-us-west-2.amazonaws.com/PR-2749/876c5c2/index.html

@FANGAreNotGnu (Contributor) left a comment:

LGTM, will update detection's presets by the 0.7 release.

@cheungdaven (Contributor) left a comment:

LGTM! Thanks for the refactor.

@bryanyzhu (Contributor) left a comment:

LGTM, thanks for the refactor!

@github-actions

Job PR-2749-c9cd115 is done.
Docs are uploaded to http://autogluon-staging.s3-website-us-west-2.amazonaws.com/PR-2749/c9cd115/index.html

@github-actions

Job PR-2749-8940394 is done.
Docs are uploaded to http://autogluon-staging.s3-website-us-west-2.amazonaws.com/PR-2749/8940394/index.html

@sxjscience (Collaborator) left a comment:

LGTM!

@sxjscience merged commit 95f573a into autogluon:master on Jan 28, 2023.
@zhiqiangdon deleted the mm-presets branch on January 29, 2023 at 01:16.