Compatibility with scikit-learn v0.16.1 #243
Conversation
- Remove the SVR coefficient correction (#111) now that the original bug has been fixed in scikit-learn.
- Remove the dependency on `BaseLibLinear` in `model_params()` since it is no longer exposed in scikit-learn.
- Expose `LinearSVR` from scikit-learn and so change `SVR` to no longer have a 'linear' kernel by default.
- Include `RescaledLinearSVR` among the rescaled regressors.
- Remove `SVR` from `test_linear_models()` in `test_regression.py` and create a new `test_non_linear_models()`.
- Update one of the expected values in `test_sparse_predict_sampler()`, likely due to the following item in the scikit-learn 0.16 release notes: "RBFSampler with gamma=g formerly approximated rbf_kernel with gamma=g/2.; the definition of gamma is now consistent, which may substantially change your results if you use a fixed value." (See the sketch after this list.)
- Fix a typo in a comment.
- Update expected values in `test_scaling()` likely due to this item in the scikit-learn v0.16 release notes: "Fix numerical stability issues in linear_model.SGDClassifier and linear_model.SGDRegressor by clipping large gradients and ensuring that weight decay rescaling is always positive".
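For context, here is a minimal sketch (toy data, not SKLL's actual `test_sparse_predict_sampler()`) of the `RBFSampler` behavior change that the release note describes: with scikit-learn 0.16, `RBFSampler(gamma=g)` approximates `rbf_kernel(gamma=g)` directly, which is why fixed expected values had to be updated.

```python
# Minimal sketch with toy data (not SKLL's test): in scikit-learn >= 0.16,
# RBFSampler(gamma=g) approximates rbf_kernel(gamma=g) directly, instead of
# rbf_kernel(gamma=g/2) as in earlier releases.
import numpy as np
from sklearn.kernel_approximation import RBFSampler
from sklearn.metrics.pairwise import rbf_kernel

rng = np.random.RandomState(0)
X = rng.rand(50, 3)
gamma = 1.0

sampler = RBFSampler(gamma=gamma, n_components=2000, random_state=0)
X_features = sampler.fit_transform(X)

approx = X_features.dot(X_features.T)   # approximate kernel matrix
exact = rbf_kernel(X, gamma=gamma)      # the exact kernel it now targets

print(np.abs(approx - exact).mean())    # small when n_components is large
```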
- Since we do not specify 'linear' as the default kernel anymore.
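To illustrate the kernel-default change, a minimal sketch with generated data (not code from this PR): `SVR()` now keeps scikit-learn's own default RBF kernel, and linear support vector regression goes through the newly exposed `LinearSVR`.

```python
# Minimal sketch with generated data (not code from this PR): SVR() keeps
# scikit-learn's default RBF kernel, while LinearSVR provides the
# liblinear-based linear support vector regressor.
from sklearn.datasets import make_regression
from sklearn.svm import SVR, LinearSVR

X, y = make_regression(n_samples=200, n_features=4, noise=0.1, random_state=0)

rbf_svr = SVR()                          # default kernel is 'rbf'
linear_svr = LinearSVR(random_state=0)   # linear SVR backed by liblinear

print(rbf_svr.kernel)                    # 'rbf'
print(linear_svr.fit(X, y).coef_.shape)  # (4,) -- one weight per feature
```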
@dan-blanchard and @mheilman, I assigned @aoifecahill because I can only assign one person to a PR, but could you both also take a look if you can spare the time?
- Increasing test coverage.
I am not going to spend any more time worrying about the coverage here since (a) I can't figure out exactly which lines are no longer covered (coveralls.io shows all uncovered lines, not just the ones that lost coverage), and (b) the decrease is so small.
Looks fine to me :)
Conflicts: .travis.yml, requirements_rtd.txt
…n_compatibility: Compatibility with scikit-learn v0.16.1
- Update scikit-learn to version 0.16.1 in `.travis.yml`, `requirements.txt`, and `requirements_rtd.txt`.
- Remove the dependency on `BaseLibLinear` in `model_params()` since it is no longer exposed in scikit-learn (#233: skll breaks importing missing class BaseLibLinear with scikit-learn 0.16; #235: Changes based on big LibLinear refactoring in scikit-learn). See the sketch after this list.
- Expose `LinearSVR` from scikit-learn and so change `SVR` to not have a 'linear' kernel by default.
- Include `RescaledLinearSVR` among the rescaled regressors.
- Remove `SVR` from `test_linear_models()` in `test_regression.py` and create a new `test_non_linear_models()`.
- Update one of the expected values in `test_sparse_predict_sampler()`, likely due to the following item in the scikit-learn 0.16 release notes: "RBFSampler with gamma=g formerly approximated rbf_kernel with gamma=g/2.; the definition of gamma is now consistent, which may substantially change your results if you use a fixed value."
- Update expected values in `test_scaling()`, likely due to this item in the scikit-learn v0.16 release notes: "Fix numerical stability issues in linear_model.SGDClassifier and linear_model.SGDRegressor by clipping large gradients and ensuring that weight decay rescaling is always positive."
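For the `BaseLibLinear` removal, here is a minimal sketch (not SKLL's actual `model_params()` implementation) of reading weights from a fitted linear estimator through its public `coef_` and `intercept_` attributes, which avoids importing the base class that scikit-learn 0.16 no longer exposes.

```python
# Minimal sketch (not SKLL's model_params()): read weights from a fitted
# linear estimator via its public coef_/intercept_ attributes instead of
# relying on the BaseLibLinear base class removed in scikit-learn 0.16.
import numpy as np
from sklearn.svm import LinearSVR

rng = np.random.RandomState(0)
X = rng.rand(100, 5)
y = X.dot(np.array([1.0, -2.0, 0.5, 0.0, 3.0])) + 0.1 * rng.randn(100)

model = LinearSVR(random_state=0).fit(X, y)

# Map hypothetical feature names to the learned weights; any linear model
# that exposes coef_ and intercept_ can be handled the same way.
feature_names = ['f{}'.format(i) for i in range(X.shape[1])]
weights = dict(zip(feature_names, model.coef_))
print(weights)
print(model.intercept_)
```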