
Add Tuner base class #351

Merged
merged 10 commits into master from ds_272_automl_tuner_base_class on Feb 14, 2020

Conversation

dsherry
Collaborator

@dsherry dsherry commented Feb 12, 2020

We currently only have one tuner, SKOptTuner, but @christopherbunn is adding two more in #230 . This PR puts in an abstract base class to define the current interface.

I'd like to consider changing/expanding the tuner API in #272 , but for now this will unblock #230 and provide more test coverage.

@dsherry dsherry added the enhancement An improvement to an existing feature. label Feb 12, 2020


class Tuner(ABC):
"""Base Tuner class"""
Collaborator Author

@dsherry dsherry Feb 12, 2020

Hmm, I should probably add an API example here for the doc

Contributor

@jeremyliweishih jeremyliweishih Feb 12, 2020
By API example do you mean just part of the API reference? or one of the longer notebooks?

Collaborator Author

@dsherry dsherry Feb 12, 2020

Part of the API reference
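
The snippet under discussion only shows the class declaration. A minimal sketch of what the full base class with an API-reference-style docstring might look like (the `add`/`propose` method names come from this thread; the constructor signature and docstring wording are assumptions, not the merged code):

```python
from abc import ABC, abstractmethod


class Tuner(ABC):
    """Base Tuner class.

    Tuners implement strategies for sampling from a hyperparameter
    search space.

    Example:
        Subclass Tuner, implement add and propose, then alternate
        calls to propose (get candidate parameters) and add (report
        the score those parameters achieved).
    """

    def __init__(self, space, random_state=0):
        self.space = space
        self.random_state = random_state

    @abstractmethod
    def add(self, parameters, score):
        """Register a set of parameters and the score they achieved."""

    @abstractmethod
    def propose(self):
        """Return a new set of parameters to evaluate."""
```
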

@codecov

codecov bot commented Feb 12, 2020

Codecov Report

Merging #351 into master will increase coverage by 0.07%.
The diff coverage is n/a.


@@            Coverage Diff             @@
##           master     #351      +/-   ##
==========================================
+ Coverage   97.29%   97.36%   +0.07%     
==========================================
  Files         102      104       +2     
  Lines        3180     3266      +86     
==========================================
+ Hits         3094     3180      +86     
  Misses         86       86              
Impacted Files Coverage Δ
evalml/pipelines/pipeline_base.py 98.58% <0.00%> (-0.02%) ⬇️
evalml/tuners/tuner.py 100.00% <0.00%> (ø)
evalml/tests/tuner_tests/test_skopt_tuner.py 100.00% <0.00%> (ø)

Continue to review full report at Codecov.

Legend
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update d442066...4e89678. Read the comment docs.

@dsherry dsherry force-pushed the ds_272_automl_tuner_base_class branch from 5eea368 to 39f35ed on Feb 12, 2020
@dsherry dsherry self-assigned this Feb 12, 2020
jeremyliweishih previously approved these changes Feb 12, 2020
Contributor

@jeremyliweishih jeremyliweishih left a comment

This looks good to me other than needing tests to pass. One thing I wanted to point out: it might be clearer to explicitly set the random state in the tests, to show that propose isn't completely random, though that does get tedious and verbose.

In the future we should discuss some of these ideas:

  • User parameterization of skopt: changing base estimator, optimizer, etc.
  • Exploration of skopt settings: do we have the optimal default settings?
  • Should we stick with or move away from add, propose?

"""Given two sets of numeric/str parameter lists, assert numerics are approx equal and strs are equal"""
def separate_numeric_and_str(values):
is_numeric = lambda val: isinstance(val, (int, float))
extract = lambda vals, invert: [el for el in vals if (invert ^ is_numeric(el))]
Contributor

@jeremyliweishih jeremyliweishih Feb 12, 2020

Don't think it is an issue but would this include boolean values as a "string"?

Collaborator Author

@dsherry dsherry Feb 12, 2020

Yes, this code wouldn't work for booleans. If we add those to our supported parameters, we'd have to update this. But we don't support boolean parameters right now, right? I couldn't find any, and we can always represent them as categorical, which may make more sense anyway.

Contributor

@jeremyliweishih jeremyliweishih Feb 12, 2020

Yup makes sense to me 👍
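
To make the boolean caveat concrete, here is a runnable sketch of the helper under discussion (reconstructed from the quoted snippet; the return shape is an assumption). Note that `bool` is a subclass of `int` in Python, so `isinstance(True, (int, float))` is True: a boolean would land in the numeric bucket, not the string one.

```python
def separate_numeric_and_str(values):
    # bool is a subclass of int, so True/False pass this check
    def is_numeric(val):
        return isinstance(val, (int, float))

    numeric = [el for el in values if is_numeric(el)]
    strs = [el for el in values if not is_numeric(el)]
    return numeric, strs


numeric, strs = separate_numeric_and_str([1, 2.5, "gini", True])
print(numeric)  # [1, 2.5, True] -- True counts as numeric
print(strs)     # ['gini']
```

This is why representing booleans as categorical parameters, as suggested above, sidesteps the problem entirely.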

@jeremyliweishih jeremyliweishih dismissed their stale review Feb 12, 2020

Need tests to pass

@dsherry
Collaborator Author

dsherry commented Feb 12, 2020

@jeremyliweishih nice, yeah agreed. #272 tracks the last question, of whether we should stick with add/propose.

I'll add fixing the random_state, why not.

@dsherry dsherry force-pushed the ds_272_automl_tuner_base_class branch from d58cd6e to db98a66 on Feb 12, 2020
@dsherry
Collaborator Author

dsherry commented Feb 12, 2020

Wtf, why is codecov failing on this PR?! That's weird 😅

@dsherry dsherry force-pushed the ds_272_automl_tuner_base_class branch from e0d8594 to 3dd327e on Feb 13, 2020
pragma: no cover

# Don't complain if tests don't hit defensive assertion code:
raise NotImplementedError
Collaborator Author

@dsherry dsherry Feb 13, 2020

This is needed to avoid the coverage checks complaining about the abstract base class:
https://stackoverflow.com/questions/9202723/excluding-abstractproperties-from-coverage-reports
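
For reference, this is the kind of code the coverage exclusion targets (a sketch; the class and method names here are illustrative, not from the PR). The defensive `raise NotImplementedError` is unreachable in well-behaved subclasses, so without the pragma (or a matching `exclude_lines` entry in the coverage config) it would show up as an uncovered line:

```python
from abc import ABC, abstractmethod


class Example(ABC):
    @abstractmethod
    def fit(self):
        """Subclasses must implement fit."""
        # Defensive line; the pragma keeps it out of the coverage report
        raise NotImplementedError  # pragma: no cover
```
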

@dsherry dsherry force-pushed the ds_272_automl_tuner_base_class branch from 3dd327e to c09f4b4 on Feb 14, 2020
@dsherry dsherry requested a review from jeremyliweishih Feb 14, 2020
@dsherry
Collaborator Author

dsherry commented Feb 14, 2020

@christopherbunn @jeremyliweishih this is ready to go, just needs review and approval!

Contributor

@jeremyliweishih jeremyliweishih left a comment

:shipit:

@dsherry dsherry merged commit 910b5c1 into master Feb 14, 2020
2 checks passed
@dsherry dsherry deleted the ds_272_automl_tuner_base_class branch Feb 14, 2020
@angela97lin angela97lin mentioned this pull request Mar 9, 2020
Labels
enhancement An improvement to an existing feature.

3 participants