Make new regressors available #256

Closed
desilinguist opened this issue Sep 23, 2015 · 3 comments
@desilinguist (Member)

It would be nice to expose the following regressors to SKLL since they can be quite useful in the real world:

- `linear_model.BayesianRidge`: Bayesian ridge regression
- `linear_model.ElasticNet`: Linear regression with combined L1 and L2 priors as regularizer
- `linear_model.ElasticNetCV`: Elastic Net model with iterative fitting along a regularization path
- `linear_model.Lars`: Least Angle Regression model (a.k.a. LAR)
- `linear_model.LarsCV`: Cross-validated Least Angle Regression model
- `linear_model.LassoCV`: Lasso linear model with iterative fitting along a regularization path
- `linear_model.LassoLars`: Lasso model fit with Least Angle Regression (a.k.a. Lars)
- `linear_model.LassoLarsCV`: Cross-validated Lasso using the LARS algorithm
- `linear_model.LassoLarsIC`: Lasso model fit with Lars using BIC or AIC for model selection
- `linear_model.LogisticRegressionCV`: Logistic Regression CV (a.k.a. logit, MaxEnt) classifier
- `linear_model.lars_path`: Compute Least Angle Regression or Lasso path using the LARS algorithm
- `linear_model.lasso_path`: Compute Lasso path with coordinate descent
- `linear_model.lasso_stability_path`: Stability path based on randomized Lasso estimates

Perhaps we can future-proof this in a way so that it's easy to add new models as they are released in subsequent versions of scikit-learn?
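One way to sketch that future-proofing idea: discover regressor classes in a module by introspection instead of listing them by hand. The classes below are stand-ins for illustration only (in SKLL this would scan `sklearn.linear_model` for `sklearn.base.RegressorMixin` subclasses); this is a minimal sketch of the approach, not SKLL's actual mechanism.

```python
import inspect

class RegressorMixin:                    # stand-in for sklearn.base.RegressorMixin
    pass

class BayesianRidge(RegressorMixin):     # stand-in estimators
    pass

class ElasticNet(RegressorMixin):
    pass

def discover_regressors(namespace):
    """Return {name: class} for every concrete RegressorMixin subclass."""
    return {
        name: obj
        for name, obj in namespace.items()
        if inspect.isclass(obj)
        and issubclass(obj, RegressorMixin)
        and obj is not RegressorMixin
    }

found = discover_regressors(globals())
print(sorted(found))   # → ['BayesianRidge', 'ElasticNet']
```

With this pattern, a regressor added to the scanned module in a new scikit-learn release would be picked up automatically, though (as noted below) default parameter grids would still need to be written by hand.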

@desilinguist desilinguist added this to the 1.2 milestone Sep 23, 2015
@dan-blanchard (Contributor)

I would say it already is very easy to add new learners. You just need to:

  1. Import them in `learner.py`.
  2. Add the default parameter grid to the `_DEFAULT_PARAM_GRIDS` dict. One of our main selling points is that we "put some thought" into what these should be, so this can't really be automated that much.
  3. Add a rescaled version of the appropriate class. This is the only part that I think we could really make simpler. We could just replace all of these lines with:
```python
# Convert items to a list to avoid modifying globals() while iterating
for name, class_ in list(globals().items()):
    if (isinstance(class_, type) and class_ is not RegressorMixin
            and issubclass(class_, RegressorMixin)):
        rescaled_name = 'Rescaled{}'.format(name)
        globals()[rescaled_name] = rescaled(class_)
```
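For reference, a runnable miniature of that loop, with hypothetical stand-ins for sklearn's `RegressorMixin` and SKLL's `rescaled()` factory (the real `rescaled()` wraps a regressor so its predictions are rescaled to the training-label distribution; the stand-in below just creates a subclass):

```python
class RegressorMixin:                      # stand-in for sklearn.base.RegressorMixin
    pass

class Ridge(RegressorMixin):               # stand-in estimator
    pass

def rescaled(cls):
    """Hypothetical stand-in for SKLL's rescaled() class factory."""
    return type('Rescaled' + cls.__name__, (cls,), {})

# Convert items to a list to avoid modifying globals() while iterating
for name, class_ in list(globals().items()):
    if (isinstance(class_, type) and class_ is not RegressorMixin
            and issubclass(class_, RegressorMixin)):
        globals()['Rescaled{}'.format(name)] = rescaled(class_)

rescaled_ridge = globals()['RescaledRidge']
print(issubclass(rescaled_ridge, Ridge))   # → True
```

Because `list(globals().items())` snapshots the namespace first, the classes created inside the loop are not themselves re-wrapped during iteration.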

@desilinguist desilinguist removed this from the 1.2 milestone Feb 19, 2016
@desilinguist (Member, Author)

What's the status on this, guys?

@desilinguist desilinguist added this to the 1.3.1 milestone Feb 13, 2017
@desilinguist desilinguist modified the milestones: 1.3.1, 1.5 Aug 18, 2017
@desilinguist (Member, Author)

Addressed by #377.
