liquidSVM #2428

Merged
merged 6 commits into master from liquidSVM on Sep 24, 2018

Conversation

2 participants
@PhilippPro
Member

PhilippPro commented Sep 6, 2018

This is the implementation of liquidSVM as an mlr learner. The package's probability predictions are still technically flawed and the maintainers have not (yet) responded to my emails, so probability predictions are unfortunately not provided at the moment.
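For context, a minimal usage sketch of the learner through the standard mlr interface (response predictions only, since probabilities are disabled; iris.task is one of mlr's built-in example tasks):

library(mlr)
# train the new learner on an example task and evaluate misclassification error
lrn = makeLearner("classif.liquidSVM")
mod = train(lrn, iris.task)
pred = predict(mod, iris.task)
performance(pred, measures = mmce)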

PhilippPro added some commits Aug 31, 2018

@larskotthoff

Contributor

larskotthoff commented Sep 6, 2018

Could you add some tests please?
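A minimal sketch of what such a test could look like, using plain testthat rather than mlr's internal test helpers (the expectations are illustrative assumptions, not the tests actually added in this PR):

library(testthat)
library(mlr)

test_that("classif.liquidSVM trains and predicts", {
  lrn = makeLearner("classif.liquidSVM")
  mod = train(lrn, iris.task)
  pred = predict(mod, iris.task)
  # every observation should receive a prediction from the task's class levels
  expect_equal(length(pred$data$response), getTaskSize(iris.task))
  expect_true(all(pred$data$response %in% getTaskClassLevels(iris.task)))
})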

@PhilippPro

Member Author

PhilippPro commented Sep 10, 2018

Why are there errors on the PR build but not on the push build? I also do not understand the errors very well...

@larskotthoff

Contributor

larskotthoff commented Sep 10, 2018

Ok, it looks like the tests are failing with a segmentation fault. I've tried on two different machines and can't reproduce it. Does it work when you run the tests locally?

@PhilippPro

Member Author

PhilippPro commented Sep 10, 2018

Yes, they ran fine locally. The algorithm produces warnings in its default configuration, which is not good. Also, the probability prediction is not working properly yet. But the package has quite some potential regarding performance and runtime, and I have already contacted the creators.

package = "liquidSVM",
par.set = makeParamSet(
makeIntegerLearnerParam(id = "d", lower = 0L),
makeLogicalLearnerParam(id = "scale"),


@larskotthoff

larskotthoff Sep 10, 2018

Contributor

Default for this is TRUE.
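One way to encode that, as a sketch of the suggested change:

makeLogicalLearnerParam(id = "scale", default = TRUE),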

cl = "classif.liquidSVM",
package = "liquidSVM",
par.set = makeParamSet(
makeIntegerLearnerParam(id = "d", lower = 0L),


@larskotthoff

larskotthoff Sep 10, 2018

Contributor

This should not be tunable (tunable = FALSE).
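A sketch of the suggested change, using the tunable argument of the param constructors:

makeIntegerLearnerParam(id = "d", lower = 0L, tunable = FALSE),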

par.set = makeParamSet(
makeIntegerLearnerParam(id = "d", lower = 0L),
makeLogicalLearnerParam(id = "scale"),
makeIntegerLearnerParam(id = "threads", lower = -1L),


@larskotthoff

larskotthoff Sep 10, 2018

Contributor

Default is 0.
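Sketch with the default made explicit:

makeIntegerLearnerParam(id = "threads", lower = -1L, default = 0L),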

makeIntegerLearnerParam(id = "d", lower = 0L),
makeLogicalLearnerParam(id = "scale"),
makeIntegerLearnerParam(id = "threads", lower = -1L),
makeIntegerLearnerParam(id = "partition_choice", lower = 0L),


@larskotthoff

larskotthoff Sep 10, 2018

Contributor

Default is 0.
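Analogous sketch:

makeIntegerLearnerParam(id = "partition_choice", lower = 0L, default = 0L),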

makeLogicalLearnerParam(id = "scale"),
makeIntegerLearnerParam(id = "threads", lower = -1L),
makeIntegerLearnerParam(id = "partition_choice", lower = 0L),
makeIntegerLearnerParam(id = "grid_choice", lower = -2L, upper = 2L),


@larskotthoff

larskotthoff Sep 10, 2018

Contributor

How does this interact with gamma_steps, lambda_steps, etc.? Are these options mutually exclusive?


@larskotthoff

larskotthoff Sep 10, 2018

Contributor

And all of these options should not be tunable.
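For this parameter, the non-tunable version would look like the following sketch (the grid interaction question is a separate issue, picked up again further down):

makeIntegerLearnerParam(id = "grid_choice", lower = -2L, upper = 2L, tunable = FALSE),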

makeIntegerLearnerParam(id = "partition_choice", lower = 0L),
makeIntegerLearnerParam(id = "grid_choice", lower = -2L, upper = 2L),
makeIntegerLearnerParam(id = "adaptivity_control", lower = 0L, upper = 2L),
makeIntegerLearnerParam(id = "random_seed", default = 1),


@larskotthoff

larskotthoff Sep 10, 2018

Contributor

I didn't find any documentation on what the default random seed is -- where did you find this? It also shouldn't be tunable.
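A conservative sketch that drops the unverified default and marks the seed as not tunable:

makeIntegerLearnerParam(id = "random_seed", tunable = FALSE),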

makeIntegerLearnerParam(id = "grid_choice", lower = -2L, upper = 2L),
makeIntegerLearnerParam(id = "adaptivity_control", lower = 0L, upper = 2L),
makeIntegerLearnerParam(id = "random_seed", default = 1),
makeIntegerLearnerParam(id = "fold"),


@larskotthoff

larskotthoff Sep 10, 2018

Contributor

Should that be folds? And there should be a minimum for this (and again not tunable).
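Sketch with the rename, a minimum, and tunable = FALSE (the lower bound of 1 is an assumption):

makeIntegerLearnerParam(id = "folds", lower = 1L, tunable = FALSE),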

makeIntegerLearnerParam(id = "adaptivity_control", lower = 0L, upper = 2L),
makeIntegerLearnerParam(id = "random_seed", default = 1),
makeIntegerLearnerParam(id = "fold"),
makeIntegerLearnerParam(id = "clipping"),


@larskotthoff

larskotthoff Sep 10, 2018

Contributor

Seems that the minimum is -1. Also probably shouldn't be tunable.
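Sketch, keeping the integer type from the current code:

makeIntegerLearnerParam(id = "clipping", lower = -1L, tunable = FALSE),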

makeNumericLearnerParam(id = "max_lambda", lower = 0),
makeNumericVectorLearnerParam(id = "lambdas", lower = 0),
makeNumericVectorLearnerParam(id = "c_values", lower = 0),
makeLogicalLearnerParam(id = "useCells")


@larskotthoff

larskotthoff Sep 10, 2018

Contributor

Seems that the default for this is FALSE. It should also not be tunable.
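Sketch:

makeLogicalLearnerParam(id = "useCells", default = FALSE, tunable = FALSE),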

(Further resolved review comments on R/RLearner_regr_liquidSVM.R)
@PhilippPro

Member Author

PhilippPro commented Sep 11, 2018

I changed all parameters according to your suggestions. I am unsure whether clipping is tunable or not, which is why I did not set tunable = FALSE for it.

@larskotthoff

Contributor

larskotthoff commented Sep 11, 2018

Thanks, looks good -- what about restrictions between parameters that control the tuning though? If you set all the gamma parameters at the same time, I guess that some of them override others? It may make sense to add requirements saying that if you set a certain parameter, others cannot be set to avoid surprising results when users accidentally set conflicting parameters.
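In mlr/ParamHelpers, such restrictions can be stated via the requires argument of the param constructors. A rough sketch of the idea; the exact conflict between grid_choice and an explicit gamma grid, the gammas parameter name, and whether an unset parameter can be tested with is.null() inside a requires expression are all assumptions here:

# hypothetical: an explicit gamma grid only makes sense when no
# automatic grid heuristic (grid_choice) has been selected
makeNumericVectorLearnerParam(id = "gammas", lower = 0,
  requires = quote(is.null(grid_choice))),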

@PhilippPro

Member Author

PhilippPro commented Sep 24, 2018

> Thanks, looks good -- what about restrictions between parameters that control the tuning though? If you set all the gamma parameters at the same time, I guess that some of them override others? It may make sense to add requirements saying that if you set a certain parameter, others cannot be set to avoid surprising results when users accidentally set conflicting parameters.

I changed the parameter settings to take the requirements into account. Moreover, I looked into https://github.com/cran/liquidSVM/blob/master/R/mlr.R and adjusted the parameter settings slightly.

@larskotthoff

Contributor

larskotthoff commented Sep 24, 2018

Thanks for all your work on this. Merging.

@larskotthoff merged commit ca824b2 into master on Sep 24, 2018

2 of 5 checks passed

continuous-integration/appveyor/branch: AppVeyor build failed
continuous-integration/appveyor/pr: AppVeyor build failed
continuous-integration/travis-ci/push: The Travis CI build failed
continuous-integration/travis-ci/pr: The Travis CI build passed
deploy/netlify: Deploy preview ready!

@larskotthoff deleted the liquidSVM branch on Sep 24, 2018
