Create search_space from TuneTokens #285
Conversation
R/TuningInstanceMulticrit.R (outdated)
search_space = NULL, terminator, store_models = FALSE,
check_values = FALSE, store_benchmark_result = TRUE) {

if (!is.null(search_space) && length(learner$param_set$get_values(tune_token = "only")) > 0) {
$get_values(tune_token = "only") is not the final API, right?
R/TuningInstanceSingleCrit.R (outdated)
codomain = self$objective$codomain, check_values = check_values)
self$objective$archive = self$archive
initialize = function(task, learner, resampling, measure,
search_space = NULL, terminator, store_benchmark_result = TRUE,
Now that the search space is optional, would it make sense to put it after the terminator? Does that break a lot of stuff?
I changed it.
Co-authored-by: Jakob Richter <code@jakob-r.de>
@be-marc Can you check if adaptations are needed here: mlr-org/mlr3book#209 (and in that case make a PR to this branch, or talk to @mb706 and edit directly)
Fails since we try to remove the
I think the AutoTuner search-space-from-ParamSet code is not tested yet? Also see my comments on how I think the "requires" problem should be solved.
#' terminator = terminator,
#' tuner = tuner,
#' search_space = search_space,
#' store_tuning_instance = TRUE)
IMHO this is a bit verbose for example code where every argument already has exactly the right name. Are you adding this for the sake of consistency with something? I guess it's good practice to call functions with many arguments like this, but I think it subtracts from the example code qua example code.
learner = assert_learner(learner)$clone(deep = TRUE)

if (!is.null(search_space) && length(learner$param_set$get_values(type = "only_token")) > 0) {
  stop("If the values of the ParamSet of the Learner contain TuneTokens you cannot supply a search_space.")
I know this is what the committee decided in the end so you can ignore this comment, but just for the record I am against this.
R/AutoTuner.R (outdated)
}
if (is.null(search_space)) {
  search_space = learner$param_set$search_space()
  learner$param_set$values = learner$param_set$get_values(type = "without_token")
You cannot do this here because it unsets parameters that may be required. Instead, you should be doing this inside the tuning code.
One option here would be to leave search_space NULL, to leave the learner$param_set as it is, and trust the TuningInstance initialization code to take care of this.
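A rough sketch of that division of labour; make_instance is a hypothetical stand-in for the TuningInstance* constructor, not actual mlr3tuning code, while search_space() and get_values() are the accessors discussed in this PR:

# Hypothetical sketch: AutoTuner forwards search_space = NULL and does not
# touch learner$param_set; the instance derives the space from the tokens.
make_instance = function(task, learner, resampling, measure, terminator,
    search_space = NULL) {
  if (is.null(search_space)) {
    # derive the space from the TuneTokens here, inside the instance
    search_space = learner$param_set$search_space()
  }
  # the values carrying TuneTokens stay in learner$param_set$values; the
  # tuning loop replaces them with concrete values per evaluation
  list(search_space = search_space, learner = learner)
}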
Another way would be to have a "placeholder" token, similar to TuneToken, that prevents the "requires" check from triggering but has no other functionality. TuneToken could even inherit from PlaceholderToken. The semantics would be that whenever a ParamSet$values slot contains a PlaceholderToken, its "requires" and its dependencies count as fulfilled, but calling Learner$train() should throw an error (because the placeholder would be replaced by a concrete value in the tuning code). I.e. PlaceholderToken would be doing some part of what TuneToken is doing already, with the exception that PlaceholderToken does not generate a search_space.
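A minimal R6 sketch of that class hierarchy; PlaceholderToken, the inheritance, and assert_no_placeholders are assumptions for illustration, not existing paradox API:

library(R6)

# Assumed class: a value marker that satisfies "requires"/dependency checks
# but must never reach Learner$train().
PlaceholderToken = R6Class("PlaceholderToken")

# TuneToken would inherit the placeholder semantics and additionally know
# how to contribute to a search_space (omitted here).
TuneToken = R6Class("TuneToken", inherit = PlaceholderToken)

# A check Learner$train() could run: any remaining PlaceholderToken is an
# error, because the tuning code should have replaced it by then.
assert_no_placeholders = function(values) {
  bad = names(values)[vapply(values, inherits, logical(1), "PlaceholderToken")]
  if (length(bad) > 0) {
    stop(sprintf("Parameters %s hold placeholder tokens and must be resolved before training.",
      paste(bad, collapse = ", ")))
  }
  invisible(values)
}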
stop("param_set is read-only.") | ||
} | ||
private$.param_set | ||
}, |
sic transit gloria mundi
R/TuningInstanceMulticrit.R (outdated)
}
if (is.null(search_space)) {
  search_space = learner$param_set$search_space()
  learner$param_set$values = learner$param_set$get_values(type = "without_token")
Don't do this here; it gives a problem when there are requirements. Instead, do it in the evaluation code in ObjectiveTuning$eval_many, like here.
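A hedged sketch of what that could look like inside the evaluation code; eval_many_sketch and its signature are illustrative only, get_values(type = "without_token") is the accessor used elsewhere in this PR, and insert_named comes from mlr3misc:

library(mlr3misc)

# Illustrative only: strip the tokens on the per-evaluation clone, so the
# learner stored in the instance keeps its values (and their "requires").
eval_many_sketch = function(learner, xdt) {
  lapply(seq_len(nrow(xdt)), function(i) {
    lrn = learner$clone(deep = TRUE)
    # drop the TuneTokens, then set the concrete candidate values
    lrn$param_set$values = lrn$param_set$get_values(type = "without_token")
    lrn$param_set$values = insert_named(lrn$param_set$values, as.list(xdt[i]))
    lrn
  })
}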
Wait, if this is not possible here, how was it even possible before we introduced the tune_tokens? It wasn't, right? So your example is the following:
We tune over Param A, which can have values aX and aY.
We set Param X, which needs A to be aX, to a fixed value in param_set$values, and likewise we set Param Y, which needs A to be aY, to a fixed value... right?
This was not possible before, and should be possible now?
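For concreteness, that scenario written out in paradox (using the ps() sugar; the parameter names and ranges beyond A, aX, aY are made up):

library(paradox)

# Tune over A; X requires A == "aX", Y requires A == "aY".
param_set = ps(
  A = p_fct(c("aX", "aY")),
  X = p_dbl(0, 1, depends = A == "aX"),
  Y = p_dbl(0, 1, depends = A == "aY")
)

# Fixing both X and Y while A is still being tuned trips the dependency
# check: no concrete value of A satisfies both conditions at once.
param_set$values = list(X = 0.5, Y = 0.5)  # errors on the unmet dependency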
pipeop = po("branch", options = c("kknn", "svm", "ranger"))
pipeop$param_set

<ParamSet:branch>
          id    class lower upper          levels        default value
1: selection ParamFct    NA    NA kknn,svm,ranger <NoDefault[3]>  kknn

selection has the tag required. PipeOpBranch automatically sets the required value to the first option. Currently we try to remove this value / TuneToken in TuningInstanceSingleCrit.
This was possible before because we just didn't touch param_set$values before, and in the tuning loop we just overwrite param_set$values without clearing values.
I am dissatisfied with the solution to store a learner with