Add instance-level calls to estimator_checks for meta-estimators #9443

Open · amueller opened this issue Jul 24, 2017 · 4 comments
Labels: help wanted, module:test-suite (everything related to our tests)

Comments

amueller (Member) commented Jul 24, 2017

We recently added the ability to run estimator_checks on instances.
We should add tests that run estimator_checks on meta-estimator instances in common configurations, including

  • pipeline
  • feature union
  • GridSearchCV(Estimator())
  • GridSearchCV(Pipeline(...))
  • RandomizedSearchCV(...)

And then possibly additional tests for anything that is currently not tested by the common tests. I think they raise SkipTest warnings, but I'm not sure? There's at least SparseEncode, RFECV, SelectFromModel, and probably more, that are not tested. Maybe CalibratedClassifierCV, VotingClassifier?

If someone wants to work on this, they can start with the five above and check out which other estimators are not tested and post a list here...

amueller added the "Easy" (Well-defined and straightforward way to resolve) and "Need Contributor" labels on Jul 24, 2017
Sentient07 (Contributor) commented:

Hi, I would like to work on this.

We recently added the ability to run estimator_checks on instances.

Could you please elaborate on this?

From what I understand, there is currently a parameter skip_methods on the class DelegatorData. When this class is initialised (inside the list DELEGATING_METAESTIMATORS), the skip_methods parameter lists the methods of that class that are omitted from testing. If I understand correctly, we have to eliminate that parameter (skip_methods), or set it to None, and ensure the tests pass. Am I correct?

There's at least SparseEncode, RFECV, SelectFromModel, and probably more, that are not tested. Maybe CalibratedClassifierCV, VotingClassifier?

We have to add these instances to DELEGATING_METAESTIMATORS, right?

amueller (Member, Author) commented:

You seem to be talking about test_metaestimator_delegation, which tests that meta-estimators delegate methods appropriately. That's not very related to what I'm proposing.

What we need to do is add a new test that calls check_estimator on some instances of these estimators, like check_estimator(make_pipeline(StandardScaler(), SVC())).
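
For anyone picking this up, here is a minimal sketch of what such a test could look like, covering the five configurations from the issue description. The test name, the parametrized list, and the hyper-parameter grids are illustrative assumptions, not existing scikit-learn code:

```python
# Illustrative sketch only: run the instance-level common checks on a few
# configured meta-estimators. Names, grids and base estimators are arbitrary.
import pytest

from sklearn.decomposition import PCA
from sklearn.linear_model import Ridge
from sklearn.model_selection import GridSearchCV, RandomizedSearchCV
from sklearn.pipeline import FeatureUnion, make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.utils.estimator_checks import check_estimator

META_ESTIMATOR_INSTANCES = [
    make_pipeline(StandardScaler(), SVC()),                        # pipeline
    FeatureUnion([("pca", PCA()), ("scale", StandardScaler())]),   # feature union
    GridSearchCV(Ridge(), {"alpha": [0.1, 1.0]}),                  # GridSearchCV(Estimator())
    GridSearchCV(make_pipeline(StandardScaler(), Ridge()),         # GridSearchCV(Pipeline(...))
                 {"ridge__alpha": [0.1, 1.0]}),
    RandomizedSearchCV(Ridge(), {"alpha": [0.1, 1.0]}, n_iter=2),  # RandomizedSearchCV(...)
]


@pytest.mark.parametrize("estimator", META_ESTIMATOR_INSTANCES)
def test_meta_estimators_pass_common_checks(estimator):
    # check_estimator accepts an instance and runs the common test suite on it.
    check_estimator(estimator)
```

Parametrizing over instances like this keeps the list easy to extend once the estimators that are currently skipped (RFECV, SelectFromModel, etc.) get covered.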

Sentient07 (Contributor) commented:

Ah! Okay, that makes sense now.

I think they raise SkipTest warnings, but I'm not sure?

Here, do you mean the warning raised at line 269 of sklearn/utils/estimator_checks.py? And in this PR, once we add tests using check_estimator for the instances mentioned above, do you want to remove that warning as well?

amueller (Member, Author) commented:

Hm, no, the check I meant isn't in master yet; it seems to be only in #8022.

cmarmo added the "help wanted" label and removed the "Easy" (Well-defined and straightforward way to resolve) label on Apr 21, 2021
cmarmo added the "module:test-suite" (everything related to our tests) label on Dec 20, 2021