
Error in if (err < tol) break : missing value where TRUE/FALSE needed for 'classif.gausspr' #501

Closed
Seager1989 opened this issue Nov 15, 2020 · 6 comments

Comments

@Seager1989

Seager1989 commented Nov 15, 2020

Hi,

I am working on hyperparameter tuning for 'classif.gausspr' and get the following error:
Error in if (err < tol) break : missing value where TRUE/FALSE needed

I tried to fix this problem with the following actions, but neither worked:

  1. Set impute.val=1
  2. Set tol=0.1 or bigger

The simplified code is listed below for your reference. The dataset has ten independent features (ranging from 0 to 1) and two class labels.

library(mlr)
library(mlrMBO)

controlALL <- makeTuneControlMBO(budget = 50, impute.val = 1)
MLGPR <- makeLearner("classif.gausspr", par.vals = list(kernel = "polydot"))  # create Gaussian process classification learner
PSGPR <- makeParamSet(
  makeIntegerParam(id = "degree", lower = 1L, upper = 6L),
  makeNumericParam(id = "scale", default = 0.1, lower = 0.1, upper = 10),
  makeNumericParam(id = "offset", default = 0.1, lower = 0.1, upper = 10)
)
taskML <- makeClassifTask(data = MLdatasetLabel, target = colnames(MLdatasetLabel)[ncol(MLdatasetLabel)])
MLGPROPT <- tuneParams(MLGPR, taskML, cv5, par.set = PSGPR, control = controlALL,
                       measures = list(mmce, setAggregation(mmce, test.sd)))

I know the error occurs when err is computed, but I do not know how to fix it. Any suggestions are appreciated. Thank you.

@jakob-r
Member

jakob-r commented Nov 16, 2020

It's hard to reproduce the error without the data MLdatasetLabel. Could you post a traceback()?

@Seager1989
Author

Thank you for your reply.

The traceback() output is attached as 'TracebackofGPRclassifi.txt' and the MLdatasetLabel dataset is attached as 'MLdatasetLabel.xlsx'. I hope these help in fixing the problem.

TracebackofGPRclassifi.txt

MLdatasetLabel.xlsx

@Seager1989
Author

I would appreciate it if anyone could comment on this issue. Thank you.

@jakob-r
Member

jakob-r commented Nov 25, 2020

The traceback indicates that this is an error in the learner you are trying to tune, not in mlrMBO itself. It looks like the learner (makeLearner("classif.gausspr", par.vals = list(kernel = "polydot"))) simply crashes for the hyperparameter settings suggested by mlrMBO.
According to the traceback, the learner was called with the parameter settings fit = FALSE, kernel = "polydot", degree = 4L, scale = 4.52147865854204, offset = 0.899868381844135.
You can either change the search space (par.set) to ranges that do not crash, or ignore those cases.
It looks like you already tried the latter by setting impute.val=1. However, to actually activate the imputation you have to tell mlr to fail silently or with only a warning, by setting configureMlr(on.learner.error = "warn").
I will update the documentation in mlr to state that more clearly.
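
As a rough illustration (assuming the rest of your script stays unchanged), the combination that makes the imputation take effect looks like this:

configureMlr(on.learner.error = "warn")  # learner crashes become warnings instead of stopping the tuning

controlALL <- makeTuneControlMBO(budget = 50, impute.val = 1)  # crashed evaluations are then imputed with 1
MLGPROPT <- tuneParams(MLGPR, taskML, cv5, par.set = PSGPR, control = controlALL,
                       measures = list(mmce, setAggregation(mmce, test.sd)))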

@Seager1989
Author

Thank you for your help. The configureMlr(on.learner.error = "warn") works for me.

Changing the search space may be difficult. I found that the three hyperparameters (polynomial kernel degree, scale, and offset) are coupled with respect to the training crash, so it is hard to find a feasible domain that avoids crashes without missing the optimum.

jakob-r closed this as completed Dec 10, 2020
@jakob-r
Member

jakob-r commented Dec 10, 2020

You are welcome. Thanks for making us aware of the gap in the documentation.
