Remove default tuning objective and make it a required field. #381
Honestly, I am not really sure how to do this, since the default objective is set during config parsing. Overriding the default means checking whether all the learners are regressors, which in turn means instantiating them into objects early, during config parsing itself, just to check their type. I am trying to think of a better solution. Suggestions and thoughts welcome.
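Just to illustrate what that type check would involve (a sketch, not SKLL's actual code — the helper name is made up, and it leans directly on scikit-learn's `is_regressor`, which checks the estimator's `_estimator_type`):

```python
# Hypothetical sketch: deciding whether every learner in a config is a
# regressor by instantiating each class with default arguments. This is
# exactly the early instantiation during config parsing discussed above.
from sklearn.base import is_regressor
from sklearn.linear_model import LinearRegression, LogisticRegression


def all_regressors(learner_classes):
    """Return True if every class in the list is a regressor."""
    # Instantiating with defaults is usually cheap for scikit-learn
    # estimators, but it is still extra work at config-parsing time.
    return all(is_regressor(cls()) for cls in learner_classes)


print(all_regressors([LinearRegression]))                      # True
print(all_regressors([LinearRegression, LogisticRegression]))  # False
```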
Couldn't we do something a little less sophisticated and just hard-code the string names of the regressors and classifiers wrapped by SKLL? We do that in other places for other reasons. It wouldn't work for a custom learner, though.
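The hard-coded approach would look something like this (the learner names below are just examples, not an exhaustive list of what SKLL wraps):

```python
# Sketch of the hard-coded lookup discussed above. Every new learner
# added to SKLL would also have to be added to one of these sets.
REGRESSORS = {"LinearRegression", "SVR", "RandomForestRegressor"}
CLASSIFIERS = {"LogisticRegression", "SVC", "RandomForestClassifier"}


def learner_type(name):
    """Classify a learner by its string name."""
    if name in REGRESSORS:
        return "regressor"
    if name in CLASSIFIERS:
        return "classifier"
    # Custom learners fall through to here, which is the weakness
    # noted above: the lists can never cover user-supplied classes.
    raise ValueError(f"Unknown learner: {name}")
```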
So then we have another place where we have to update the names of learners whenever we add new ones? I am not a fan of that approach: what if someone forgets to update the list? Then we need to add more tests for that scenario, which increases the complexity of the codebase. Plus, it wouldn't work for a custom learner, like you said. Maybe we should just not have a default objective at all? If you don't specify an objective in the config file, you get an early error during config parsing itself. That way you don't get an incomplete experiment, and the responsibility for the objective rests with the user. It would break backward compatibility for old config files, though, which is not ideal either.
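The early-error behavior being proposed could be sketched like this (the section and option names are illustrative, not necessarily SKLL's actual config schema):

```python
# Hypothetical sketch of the proposed behavior: with no default
# objective, the config parser raises immediately when the field is
# missing, instead of silently falling back to a classification metric.
from configparser import ConfigParser


def parse_objective(config_text):
    """Return the tuning objective, or fail fast if it is missing."""
    parser = ConfigParser()
    parser.read_string(config_text)
    if not parser.has_option("Tuning", "objective"):
        raise KeyError(
            "No tuning objective specified; 'objective' is a required field."
        )
    return parser.get("Tuning", "objective")


print(parse_objective("[Tuning]\nobjective = pearson\n"))  # pearson
```

The downside, as noted, is that old config files relying on the default would now fail at parse time instead of running.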
Well, if we don't want to add complexity to the config reading by doing the checks there, then I think the next best thing is to not have a default, but I suppose that would have to be part of a major version release.
Agreed. Let's do that in v2.0. |
This issue has been automatically marked as stale because it has not had recent activity. It will be closed in 7 days if no further activity occurs. Thank you for your contributions. |
We will fix this in v2.0. Keep it open. |
Keep it open for v2.0 |
keep it open |
keep it open please |
As part of this issue, I am also going to get rid of the |
Addressed by #458. |
Right now, the default metric for all learners is `f1_score_micro`, which doesn't make sense for regressors.

(Note: This issue is related to but separate from #350.)