
Irace tuning doesn't work with dependencies #369

Closed
larskotthoff opened this issue Jul 2, 2015 · 8 comments
@larskotthoff

Tuning with irace gives an error when the parameter sets have dependencies as defined in the learners, e.g.

library(mlr)

rdesc = makeResampleDesc("Holdout")
ctrl = makeTuneControlIrace(maxExperiments = 40, nbIterations = 2, minNbSurvival = 1)
ps = makeParamSet(
  makeDiscreteLearnerParam(id = "type", default = "C-svc", values = c("C-svc", "nu-svc", "C-bsvc", "spoc-svc", "kbb-svc")),
  makeNumericLearnerParam(id = "C", lower = 0, default = 1, upper = 1e6, requires = expression(type %in% c("C-svc", "C-bsvc", "spoc-svc", "kbb-svc")))
)
res = tuneParams("classif.ksvm", task = iris.task, resampling = rdesc, par.set = ps, control = ctrl)

gives me

Error in !is.na(v) && v : invalid 'y' type in 'x && y'
In addition: Warning message:
In is.na(v) : is.na() applied to non-(list or vector) of type 'expression'

Looking at the unit tests for Irace, dependencies are specified with quote() instead of expression() -- only expression() is used in the definition of the learners.

@berndbischl

Well it DOES work.

2 hints:

  1. You are allowed to give default values in the ParamSet here, but this does not
    currently influence the optimizer, so you could leave them out.

  2. Instead of using make_LearnerParam you are supposed to write make_Param.
    But this should not hurt either; the LearnerParam is a subclass / specialization that simply
    adds a few fields to the class (which you don't need here).

Looking at the unit tests for Irace, dependencies are specified with quote() instead of expression() --
only expression() is used in the definition of the learners.

Well, the answer seems to be that you simply need to exchange expression() with quote()?
Then it works, and it should also be documented like this. Where exactly is the problem then?
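For illustration (not part of the thread), a sketch of the repro's parameter set with both hints applied: plain make*Param constructors instead of the LearnerParam variants, and quote() instead of expression() for the dependency, as in the irace unit tests. Names follow the ParamHelpers API used in the repro above.

```r
library(mlr)  # also loads ParamHelpers

# Same search space as the repro, but:
#  * makeDiscreteParam / makeNumericParam instead of the *LearnerParam constructors
#  * quote() instead of expression() for the "requires" dependency
ps = makeParamSet(
  makeDiscreteParam(id = "type",
    values = c("C-svc", "nu-svc", "C-bsvc", "spoc-svc", "kbb-svc")),
  makeNumericParam(id = "C", lower = 0, upper = 1e6,
    requires = quote(type %in% c("C-svc", "C-bsvc", "spoc-svc", "kbb-svc")))
)
```

With this version the tuneParams() call from the repro should run without the `invalid 'y' type in 'x && y'` error.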

Close? Improve docs? Suggestion?

@larskotthoff

Well the problem is that the definition of the learner param sets in the mlr source uses expression() -- I just copied and pasted the parameter definition code from there. So if the answer is to use quote(), it needs to be changed for all the learners.

@berndbischl

Hmm, ok, I need to check this. This relates to #240.

We don't really use the "requires" in the learner params inside of the learner definitions, except for sanity checking of inputs. But I will check this later.

@larskotthoff

On a related note, I wasn't able to get any tuning to work with the parameter sets defined in the learner (i.e. par.set = getParamSet(lrn)) because of infinite bounds or similar. I think it would be good if we can make this "just work" or at least document/explain why it doesn't work out of the box and what to do about it.

@berndbischl

Yes, but this is very clear. The sets in the learners define the whole possible space. You are supposed to write down the feasible region yourself for the optimizer.
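For example (an illustrative sketch, not from the thread): getParamSet("classif.ksvm") describes the full possible space, with parameters such as C unbounded above, which a numeric optimizer cannot sample from. A hand-written feasible region for tuning would instead use a finite box, commonly on a log scale:

```r
library(mlr)

# A finite, hand-picked feasible region for tuning classif.ksvm,
# searching C and sigma in [2^-5, 2^5] via a log-scale trafo.
ps = makeParamSet(
  makeNumericParam("C", lower = -5, upper = 5, trafo = function(x) 2^x),
  makeNumericParam("sigma", lower = -5, upper = 5, trafo = function(x) 2^x)
)
```

This is the kind of bounded search space the optimizer needs, as opposed to the unbounded set returned by getParamSet().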

But @kerschke is therefore supposed to define default tuning parameter sets for the learners in GSoC, which would allow EXACTLY what you tried to do.

@larskotthoff

Great, looking forward to that.

@berndbischl

I have added checks now that ensure that expression() is used neither in the learner$param.set nor in the tuning param.set, and improved the docs.

@berndbischl

Added unit tests for RLearners and tuning.
