Support presets in AutoMM predictor initialization #2620
Conversation
Job PR-2620-e988188 is done.
```python
    "model.ner_text.checkpoint_name": "microsoft/deberta-v3-base",
}


@automm_presets.register()
def ner():
```
Shall we call it `default_ner`?
Currently, all specific problem types, e.g., `image_text_similarity` and `object_detection`, have registered presets. `default` is for the general problem types `binary`, `multiclass`, and `regression`.
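For context, a registry of this shape can be sketched with a simple decorator that maps a preset name to the function that builds its hyperparameter overrides. This is a minimal illustrative sketch, not AutoMM's actual implementation; `PresetRegistry` and its methods are assumptions, with only `automm_presets.register()` and the `ner` preset taken from the quoted diff.

```python
# Minimal sketch of a decorator-based preset registry (illustrative only;
# PresetRegistry and its method names are assumptions, not AutoMM's code).

class PresetRegistry:
    def __init__(self):
        self._presets = {}

    def register(self, name=None):
        """Register a function that returns a preset's hyperparameter overrides."""
        def decorator(func):
            # Default the preset key to the function name, e.g. "ner".
            key = name or func.__name__
            self._presets[key] = func
            return func
        return decorator

    def get(self, name):
        # Call the registered function to build the hyperparameter dict.
        return self._presets[name]()


automm_presets = PresetRegistry()


@automm_presets.register()
def ner():
    return {
        "model.ner_text.checkpoint_name": "microsoft/deberta-v3-base",
    }
```

With this shape, looking up a preset by its problem-type name returns the override dict, e.g. `automm_presets.get("ner")`.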
LGTM in general. Minor comment on the naming. We can consider registering …
Looks good. One side question: where should we introduce these presets to users, in the API documentation or the AutoMM customization tutorial page?
LGTM. We can add an additional tutorial to introduce presets.
Issue #, if available:

Description of changes: This PR supports `presets` in predictor initialization. The presets control model quality: `best_quality`, `high_quality_fast_inference`, and `medium_quality_faster_inference`.

By submitting this pull request, I confirm that you can use, modify, copy, and redistribute this contribution, under the terms of your choice.