
Fix deprecation warnings in tests #5089

Closed
5 tasks
amueller opened this issue Aug 5, 2015 · 15 comments
Labels: Easy, Enhancement
Milestone: 0.17

Comments

@amueller
Member

amueller commented Aug 5, 2015

There are way too many warnings in the test suite, see https://travis-ci.org/scikit-learn/scikit-learn/jobs/74122431

  • Function decision_function is deprecated [decision_function was removed from regressors]
  • DataDimensionalityWarning in random_projection. Not sure if the test should just ignore those or we should have different defaults? ping @ogrisel
  • "Default 'multioutput' behavior now corresponds to 'variance_weighted' value" not sure if there is a way around these? ping @arjoly @jnothman ?
  • "The decision_function_shape default value will change from 'ovo' to 'ovr' in 0.18. This will change the shape of the decision function returned by SVC." SVC has a new decision_function shape. Should either be ignored or the shape should be set to ovr explicitly for now.
  • ConvergenceWarning: Objective did not converge. These are in coordinate_descent in the linear models. I'm not sure when they started showing up. Maybe @agramfort knows?
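Where one of these deprecated paths is exercised on purpose, the expected warning can be captured and asserted instead of leaking into the test output. A minimal stdlib sketch of that pattern (the `deprecated_decision_function` below is a hypothetical stand-in, not a scikit-learn API):

```python
import warnings

def deprecated_decision_function(X):
    # Hypothetical stand-in for a deprecated estimator method.
    warnings.warn("Function decision_function is deprecated",
                  DeprecationWarning)
    return [0.0 for _ in X]

# Capture the warning so it is asserted rather than printed during the run.
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    scores = deprecated_decision_function([[1.0], [2.0]])

assert scores == [0.0, 0.0]
assert any(issubclass(w.category, DeprecationWarning) for w in caught)
print("warning captured:", caught[0].message)
```

The same shape works for any of the warnings listed above, as long as the filter is scoped to the call that is expected to warn.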
@amueller
Member Author

amueller commented Aug 5, 2015

The decision_function and decision_function_shape ones should be pretty straightforward.

Also hunting for more is always possible ;)

@amueller
Member Author

amueller commented Aug 5, 2015

To be clear: there should definitely be no DeprecationWarnings in the tests, and ideally there would be no warnings at all.

@amueller added the Easy, Enhancement, and Need Contributor labels on Aug 5, 2015
@jnothman
Member

jnothman commented Aug 6, 2015

To be clearer: we may retain tests that use deprecated behaviour to ensure that deprecated contracts are still maintained for backwards compatibility, but any warnings should be silenced. Otherwise, we shouldn't be using deprecated constructs from our own or imported libraries in any testing or tested code.

Sorry, I don't know anything about variance_weighted.
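The pattern described above, keeping a test of the deprecated contract while silencing its warning, can be sketched with a small stdlib decorator (illustrative only; scikit-learn has its own `ignore_warnings` helper for this, and `old_api`/`new_api` below are hypothetical names):

```python
import functools
import warnings

def silence_deprecation(func):
    # Illustrative decorator, not scikit-learn's own ignore_warnings helper.
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        with warnings.catch_warnings():
            warnings.simplefilter("ignore", DeprecationWarning)
            return func(*args, **kwargs)
    return wrapper

def old_api():
    # Deprecated but still supported; the contract below must keep holding.
    warnings.warn("old_api is deprecated; use new_api instead",
                  DeprecationWarning)
    return 42

@silence_deprecation
def test_old_api_backward_compat():
    assert old_api() == 42

test_old_api_backward_compat()
print("backward-compat test passed silently")
```

This keeps the backwards-compatibility coverage while the test suite stays quiet; only the test that deliberately uses the deprecated construct opts out of the warning.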

@arjoly
Member

arjoly commented Aug 6, 2015

"Default 'multioutput' behavior now corresponds to 'variance_weighted' value" not sure if there is a way around these? ping @arjoly @jnothman ?

We have deprecated the old default behaviour in favour of the new one, which is more intuitive (a macro-averaged r2 score instead of variance-weighted macro averaging). Manually setting either the new or the old strategy will remove those warnings.
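For reference, the two strategies differ only in how the per-output r2 scores are averaged. A plain-Python sketch with illustrative numbers (in scikit-learn these correspond to `r2_score`'s `multioutput='uniform_average'` and `multioutput='variance_weighted'`; the weights used here are the per-output total sums of squares, which are proportional to the variances):

```python
# Per-output R^2 plus the total sum of squares used as the weight.
def r2_per_output(y_true, y_pred):
    mean = sum(y_true) / len(y_true)
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
    ss_tot = sum((t - mean) ** 2 for t in y_true)
    return 1.0 - ss_res / ss_tot, ss_tot

# Two outputs with very different scales (hence very different variances):
r2_1, var1 = r2_per_output([1.0, 2.0, 3.0], [1.1, 1.9, 3.2])
r2_2, var2 = r2_per_output([10.0, 20.0, 30.0], [12.0, 18.0, 33.0])

uniform = (r2_1 + r2_2) / 2                              # 'uniform_average'
weighted = (r2_1 * var1 + r2_2 * var2) / (var1 + var2)   # 'variance_weighted'
print(round(uniform, 4), round(weighted, 4))
```

With uniform averaging both outputs count equally; with variance weighting the high-variance output dominates, which is why the two defaults give different scores.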

@arjoly
Member

arjoly commented Aug 6, 2015

DataDimensionalityWarning in random_projection. Not sure if the test should just ignore those or we should have different defaults? ping @ogrisel

Those were added because random projection is meant for dimensionality reduction. There is also nothing wrong with having more projections than there are dimensions in the original space.

@amueller
Member Author

amueller commented Aug 6, 2015

@jnothman I agree, but I think most tests that hit deprecated behavior are not the tests for the deprecated class or function; they live in other parts of the code, and those shouldn't be using deprecated behavior. Having tests that backward compatibility still works is obviously a good idea.

@amueller
Member Author

amueller commented Aug 6, 2015

@arjoly the problem is that there is no way to manually set the strategy when using the score method of an estimator, right?

And for the random_projections: it is just a bit odd that the tests seem to throw so many warnings. That means either the defaults are not sensible or the tests are odd. If the warnings are expected, they should be caught; if they are not expected, something needs to be fixed.
Linear random projections (which is what this is) into a space larger than the original don't really make sense.

@amueller
Member Author

amueller commented Aug 6, 2015

Also, I feel it is the responsibility of the person deprecating or changing a behavior to make sure that the tests still do something sensible and that warnings are caught where they are expected (I'm guilty of not doing this properly for both the regressor decision_function and the SVM decision_function, it seems).

@arjoly
Member

arjoly commented Aug 6, 2015

@arjoly the problem is that there is no way to manually set the strategy when using the score method of an estimator, right?

It's possible to set the proper keyword in the mixin.

And for the random_projections: It is just a bit odd that the tests seem to be throwing so many warnings. That means either the defaults are not sensible or the tests are odd. If the warnings are expected, they should be caught. If they are not expected, something needs to be fixed.

The default is inferred using the JL lemma, which is stated for many dimensions, not for a few.
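For context, that default comes from the Johnson-Lindenstrauss bound on the embedding dimension. A sketch, assuming the formula matches the one scikit-learn's `johnson_lindenstrauss_min_dim` documents, `n_components >= 4 log(n_samples) / (eps**2 / 2 - eps**3 / 3)` (rounding behaviour may differ):

```python
import math

def jl_min_dim(n_samples, eps=0.1):
    # Johnson-Lindenstrauss lower bound on the embedding dimension for a
    # pairwise-distance distortion of at most eps. Assumed to mirror
    # sklearn's johnson_lindenstrauss_min_dim, up to rounding.
    denominator = eps ** 2 / 2 - eps ** 3 / 3
    return int(math.ceil(4 * math.log(n_samples) / denominator))

# Even modest sample counts push the bound into the thousands, which is why
# the tiny datasets used in unit tests trip DataDimensionalityWarning:
for n in (100, 10_000):
    print(n, "samples ->", jl_min_dim(n), "components")
```

With the handful of samples and features typical of a unit test, the bound vastly exceeds the input dimensionality, so the warning fires even though nothing is wrong with the test itself.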

@amueller
Member Author

amueller commented Aug 6, 2015

It's possible to set the proper keyword in the mixin.

That would be a backward incompatible change, though.

The default is inferred using the JL-lemma which is stated for many dimension and not for a few.

Is that a sensible default? I would imagine this to be a conservative estimate. Or should we just catch them all in the tests because the tests are not really in the normal range this would be applied to?

@amueller
Member Author

Some should be fixed by #5143.

@amueller amueller added this to the 0.17 milestone Sep 9, 2015
@hasancansaral

@amueller As for the DataDimensionalityWarnings, I have suppressed all of them except for one, which I get when I run nosetests -v sklearn/neighbors but not when I run nosetests -v sklearn/neighbors/tests. Could you help me with that? Thanks.

@raghavrv
Member

There are some QDA and LDA warnings in tests... (Not sure if someone is already working on it...)

@amueller
Member Author

There are? Only at the very beginning for the import, right? I don't think we can get rid of these.

@amueller
Member Author

amueller commented Nov 2, 2015

Closing as I think all of these are fixed.

@amueller amueller closed this as completed Nov 2, 2015