This PR fixes the frequent NLP bugs in the other PRs #647
Conversation
test/nlp/test_default.py
Outdated
@@ -91,9 +91,13 @@ def test_points_to_evaluate():
     automl_settings = get_automl_settings(estimator_name="transformer_ms")

     automl_settings["estimator_list"] = ["transformer_ms"]
-    automl_settings["starting_points"] = "data"
+    automl_settings["starting_points"] = "data:test/nlp/default/"
+    del automl_settings["fit_kwargs_by_estimator"]["transformer_ms"]["model_path"]
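For context, a minimal sketch of the settings the updated test builds. The `get_automl_settings` stand-in below is hypothetical (the real helper lives in the repo's test utilities); only the keys visible in the diff are assumed:

```python
# Hypothetical stand-in for the repo's get_automl_settings helper,
# which returns a low-cost config for CI runs.
def get_automl_settings(estimator_name="transformer_ms"):
    return {
        "estimator_list": [estimator_name],
        "fit_kwargs_by_estimator": {
            estimator_name: {
                # low-cost model override discussed in the review below
                "model_path": "google/electra-small-discriminator",
            }
        },
    }

automl_settings = get_automl_settings(estimator_name="transformer_ms")
automl_settings["estimator_list"] = ["transformer_ms"]
# Point starting_points at the on-disk defaults, as in the diff.
automl_settings["starting_points"] = "data:test/nlp/default/"
# Drop the per-estimator model_path override, as in the diff.
del automl_settings["fit_kwargs_by_estimator"]["transformer_ms"]["model_path"]
```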
Why did we have "model_path" in fit_kwargs_by_estimator in the first place? Shall we remove it in get_automl_settings()?
get_automl_settings() is meant to provide a low-cost config, so it includes overwriting the model to Electra-small.
I said fit_kwargs_by_estimator. Why does it contain "model_path"?
We must overwrite model_path to Electra-small. Do you mean: given that we overwrite "model_path", why do we do it via fit_kwargs_by_estimator instead of other options (such as fit_kwargs)?
Disable py3.6 for text summarization to fix the NLP bug in the other PRs
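On the fit_kwargs vs fit_kwargs_by_estimator question raised above: a plausible reading, sketched below, is that fit_kwargs applies to every estimator while fit_kwargs_by_estimator scopes an override to one estimator name. The merge helper is purely illustrative (not FLAML's actual implementation), and the key names are assumptions:

```python
def effective_fit_kwargs(estimator, fit_kwargs, fit_kwargs_by_estimator):
    # Illustrative precedence: start from the global fit_kwargs,
    # then let the per-estimator overrides win on key collisions.
    merged = dict(fit_kwargs)
    merged.update(fit_kwargs_by_estimator.get(estimator, {}))
    return merged

# Global kwargs that would be passed to every estimator.
fit_kwargs = {"gpu_per_trial": 0}

# Per-estimator override: only "transformer_ms" gets the small model.
fit_kwargs_by_estimator = {
    "transformer_ms": {"model_path": "google/electra-small-discriminator"}
}
```

Under this reading, putting "model_path" in fit_kwargs_by_estimator keeps the Electra-small override from leaking into other estimators' fit arguments.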