
[MRG] FIX convert F-ordered array in Ridge with SAG solver #14458

Merged
7 commits merged into scikit-learn:master on Jul 26, 2019

Conversation

@glemaitre (Contributor) commented Jul 24, 2019

closes #14457

Make an implicit conversion when the array is not C-ordered in make_dataset.
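
For context, here is a minimal sketch of the idea behind the fix, not the exact patch merged here: before the SAG dataset is built, a dense array that is not C-contiguous gets converted to a C-ordered copy. The helper name `_ensure_c_ordered` is illustrative only; the real change lives in `make_dataset` in sklearn/linear_model/base.py.

```python
# Illustrative sketch only, not the actual patch.
import numpy as np
from sklearn.utils import check_array


def _ensure_c_ordered(X):
    """Return a C-contiguous version of a dense array (hypothetical helper)."""
    # The SAG/SAGA Cython dataset iterates over rows and expects C-ordered data,
    # so make a C-ordered copy only when the input is F-ordered or otherwise
    # non-contiguous.
    if not X.flags["C_CONTIGUOUS"]:
        X = check_array(X, order="C")
    return X


X_f = np.asfortranarray(np.random.rand(5, 3))
assert _ensure_c_ordered(X_f).flags["C_CONTIGUOUS"]
```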

Review threads: doc/whats_new/v0.22.rst (outdated, resolved), sklearn/linear_model/base.py (resolved)
rth approved these changes Jul 24, 2019
@glemaitre changed the title from "FIX convert F-ordered array in Ridge with SAG solver" to "[MRG] FIX convert F-ordered array in Ridge with SAG solver" on Jul 25, 2019
TomDLT approved these changes Jul 25, 2019

@TomDLT (Member) left a comment

For the record, this bug was introduced in 0dac63f, as it made Ridge.fit call ridge_regression(check_input=False).

Note that we still have an inconsistency between Ridge.fit and ridge_regression:

  • Ridge.fit does not check contiguity.
  • ridge_regression always forces C-contiguity.

C-contiguity is only necessary for the sag/saga solvers.
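
For illustration, a hedged reproduction of the issue this PR fixes (#14457): with the regression introduced in 0dac63f, fitting Ridge with solver='sag' on an F-ordered array failed because the SAG helpers received non C-contiguous data; with the conversion in make_dataset it fits normally.

```python
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.RandomState(0)
X = np.asfortranarray(rng.randn(20, 3))  # F-ordered design matrix
y = rng.randn(20)

# Before this fix this call raised, because the SAG code received
# non C-contiguous data; with the conversion in make_dataset it fits normally.
Ridge(solver="sag").fit(X, y)
```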

Quoted from doc/whats_new/v0.22.rst:

:pr:`14108`, :pr:`14170` by :user:`Alex Henrie <alexhenrie>`.

- |Fix| :class:`linear_model.Ridge` with `solver='sag'` now accepts F-ordered
and noon-contiguous arrays and make a conversion instead of failing.
@TomDLT (Member) commented on this diff, Jul 25, 2019:

noon -> non
make -> makes

@glemaitre (Contributor, Author) commented Jul 25, 2019

Note that we still have an inconsistency between Ridge.fit and ridge_regression:

Yep, check_input was bypassed to avoid a dtype conversion, but that left the contiguity check aside.

@TomDLT Would it be better to add the contiguity check at the estimator level, or to make the conversion as in this PR?

@TomDLT (Member) commented Jul 25, 2019

@TomDLT Would it be better to add the contiguity check at the estimator level, or to make the conversion as in this PR?

For sag/saga solvers, it probably does not make a difference.
For other solvers, there is an unnecessary C-contiguity enforcement (with potentially a copy) in ridge_regression.

What about keeping the solution in this PR, but removing the contiguity check in ridge_regression?
We can also extend the test to all solvers, and maybe to ridge_regression too. (Isn't there a common test to check that?)
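
As a sketch of the kind of test discussed here (the test name and tolerance are illustrative, not what was actually committed): every Ridge solver should accept an F-ordered design matrix and produce the same coefficients as with C-ordered input.

```python
import numpy as np
import pytest
from sklearn.linear_model import Ridge


@pytest.mark.parametrize(
    "solver", ["auto", "svd", "cholesky", "lsqr", "sparse_cg", "sag", "saga"]
)
def test_ridge_accepts_f_ordered_input(solver):
    rng = np.random.RandomState(42)
    X_c = rng.randn(30, 4)
    y = rng.randn(30)
    X_f = np.asfortranarray(X_c)

    coef_c = Ridge(solver=solver, random_state=0).fit(X_c, y).coef_
    coef_f = Ridge(solver=solver, random_state=0).fit(X_f, y).coef_

    # The memory layout of the input should not change the solution.
    np.testing.assert_allclose(coef_c, coef_f, rtol=1e-6)
```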

@glemaitre (Contributor, Author) commented Jul 26, 2019

What about keeping the solution in this PR, but removing the contiguity check in ridge_regression?

OK let's do that.

We can also extend the test to all solvers, and maybe to ridge_regression too

@rth was starting to check whether the estimator with the different solvers could indeed pass the common tests. This might be better than writing redundant tests, but it would require more work.

rth approved these changes Jul 26, 2019
@rth merged commit df1e3fb into scikit-learn:master on Jul 26, 2019
17 checks passed
4 participants