[AutoMM] Fix hyperparameters consistency between init() and fit() #3279
Conversation
@@ -126,6 +126,8 @@ def __init__(
     verbosity: Optional[int] = 3,
     warn_if_exist: Optional[bool] = True,
     enable_progress_bar: Optional[bool] = None,
+    pretrained: Optional[bool] = True,
We should update the docstring to reflect this new param.
`MultiModalMatcher` is not exposed to users. We can add the docstring later.
Docstring added.
if self._config is None:
    # split out the hyperparameters whose values are complex objects
    hyperparameters, advanced_hyperparameters = split_hyperparameters(self._hyperparameters)
What's the difference between regular and advanced hyperparameters? Going forward, what's the guiding principle for introducing new ones?
`advanced_hyperparameters` are there to support more image augmentations from torchvision transforms (https://pytorch.org/vision/stable/transforms.html#geometry). In this case, the hyperparameter values are callable objects instead of strings or numbers.
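To illustrate the split described above, here is a minimal sketch (not AutoMM's actual `split_hyperparameters` implementation; the key names are hypothetical) that separates plain values from "advanced" entries whose values are callables such as torchvision transform objects:

```python
# Illustrative sketch only: split a hyperparameter dict into plain values
# and "advanced" entries whose values are complex objects (e.g. callables).
def split_hyperparameters(hyperparameters):
    plain, advanced = {}, {}
    for key, value in hyperparameters.items():
        # Callables (transform objects, lambdas) cannot be serialized into a
        # plain config, so they are kept in a separate dict.
        if callable(value):
            advanced[key] = value
        else:
            plain[key] = value
    return plain, advanced


hp = {
    "optimization.learning_rate": 1e-4,          # plain value
    "model.image.train_transforms": lambda x: x,  # hypothetical key; stands in for a torchvision transform
}
plain, advanced = split_hyperparameters(hp)
```

With this split, `plain` can be merged into the serializable config while `advanced` is handled separately at runtime.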
@@ -375,7 +375,7 @@ def object_detection(presets: str = DEFAULT):
     "optimization.top_k_average_method": "best",
     "optimization.warmup_steps": 0.0,
     "optimization.patience": 10,
-    "optimization.val_metric": "map",
+    # "optimization.val_metric": "map",
Why do we comment this out?
Because this PR changes the design of customizing the validation metric: `optimization.val_metric` has been removed from the default config.
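A hedged sketch of the new design (class and attribute names are assumed from this thread, not AutoMM's actual code): the validation metric becomes an explicit `init()` argument with a per-problem-type default, instead of a `"optimization.val_metric"` config key:

```python
# Illustrative sketch only: validation metric as an init() argument rather
# than a config entry. Names and defaults are assumptions for illustration.
class Matcher:
    # default validation metric per problem type (assumed mapping)
    _DEFAULT_VAL_METRICS = {"object_detection": "map"}

    def __init__(self, problem_type, validation_metric=None):
        self._problem_type = problem_type
        # fall back to the problem type's default when none is provided
        self._validation_metric = (
            validation_metric or self._DEFAULT_VAL_METRICS.get(problem_type)
        )


m = Matcher("object_detection")
# m._validation_metric falls back to the object-detection default, "map"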
Removed the commented line. Object detection's default validation metric is `map`.
LGTM with a minor issue
@@ -493,14 +487,17 @@ def match_label(self):

     @property
     def problem_type(self):
It's a bit confusing that `self._matcher._pipeline` is included only in `self.problem_type` but not in `self._problem_type`.
Updated to set `predictor._matcher._pipeline` as `predictor._problem_type` (https://github.com/autogluon/autogluon/pull/3279/files#diff-a73844176f6eae3b4538467b3565e3bdbf4d88997c3c670173bbb44e1a55b7a3R2700) when loading a saved matcher.
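The fix can be sketched as follows (a minimal illustration, with class names and `load` signature assumed from the thread rather than taken from AutoMM): copying the matcher's pipeline into `_problem_type` at load time keeps the private attribute and the public property consistent.

```python
# Illustrative sketch only; DummyMatcher stands in for the saved matcher.
class DummyMatcher:
    def __init__(self, pipeline):
        self._pipeline = pipeline  # e.g. "image_text_similarity"


class Predictor:
    def __init__(self, problem_type=None, matcher=None):
        self._problem_type = problem_type
        self._matcher = matcher

    @property
    def problem_type(self):
        # _problem_type is set from the matcher's pipeline at load time,
        # so the property no longer needs to consult the matcher itself.
        return self._problem_type

    @classmethod
    def load(cls, matcher):
        # When loading a saved matcher, copy its pipeline into _problem_type
        # so both the attribute and the property report the same value.
        return cls(problem_type=matcher._pipeline, matcher=matcher)


predictor = Predictor.load(DummyMatcher("image_text_similarity"))
```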
Issue #, if available:
#2978

Description of changes:
- Make `hyperparameters` consistent between predictor init() and fit().
- Make `hyperparameters` consistent between matcher init() and fit().
- Replace `config.optimization.val_metric` with `validation_metric` in init(), which reduces hacking and simplifies the logic.
- Replace `init_scratch` with `pretrained` in init() to align with AutoMM's motivation and other internal APIs. AutoMM is mainly for finetuning pretrained models; `pretrained` is more intuitive for users than `init_scratch`, and some internal APIs already use `pretrained`.

By submitting this pull request, I confirm that you can use, modify, copy, and redistribute this contribution, under the terms of your choice.