
remove lightgbm from automl #1186

Merged
merged 11 commits into main
Sep 17, 2020
Conversation

bchen1116
Contributor

Remove LightGBM from AutoML models for now, as it was added without proper performance tests.


codecov bot commented Sep 16, 2020

Codecov Report

Merging #1186 into main will decrease coverage by 0.00%.
The diff coverage is 100.00%.


@@            Coverage Diff             @@
##             main    #1186      +/-   ##
==========================================
- Coverage   99.92%   99.92%   -0.01%     
==========================================
  Files         196      196              
  Lines       11814    11813       -1     
==========================================
- Hits        11805    11804       -1     
  Misses          9        9              
Impacted Files                                          Coverage Δ
...lml/tests/model_understanding_tests/test_graphs.py   100.00% <ø> (ø)
evalml/utils/gen_utils.py                               98.94% <ø> (ø)
evalml/tests/pipeline_tests/test_pipelines.py           100.00% <100.00%> (ø)

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data

@bchen1116 bchen1116 self-assigned this Sep 16, 2020
@bchen1116
Contributor Author

The codecov/project check is not passing, likely due to the untested line in gen_utils.py's jupyter_check.

@dsherry dsherry left a comment
Contributor

LGTM 👍

@@ -155,7 +155,7 @@ def _get_subclasses(base_class):

 _not_used_in_automl = {'BaselineClassifier', 'BaselineRegressor',
                        'ModeBaselineBinaryPipeline', 'BaselineBinaryPipeline', 'MeanBaselineRegressionPipeline',
-                       'BaselineRegressionPipeline', 'ModeBaselineMulticlassPipeline', 'BaselineMulticlassPipeline'}
+                       'BaselineRegressionPipeline', 'ModeBaselineMulticlassPipeline', 'BaselineMulticlassPipeline', 'LightGBMClassifier'}
Yep! Thanks
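Editor's note: for context, a minimal sketch of how an exclusion set like `_not_used_in_automl` might be used to filter candidate classes out of the AutoML search. The `allowed_in_automl` helper and the dummy classes are hypothetical, for illustration only; evalml's actual `_get_subclasses` logic may differ.

```python
# Hypothetical sketch: filter classes out of AutoML search by class name.
# The helper and dummy classes below are illustrative, not evalml's real code.
_not_used_in_automl = {'BaselineClassifier', 'LightGBMClassifier'}


def allowed_in_automl(classes):
    """Keep only classes whose names are not in the exclusion set."""
    return [cls for cls in classes if cls.__name__ not in _not_used_in_automl]


class LightGBMClassifier:
    pass


class RandomForestClassifier:
    pass


print([c.__name__ for c in allowed_in_automl([LightGBMClassifier, RandomForestClassifier])])
# prints ['RandomForestClassifier'] — LightGBMClassifier is excluded
```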

@dsherry
Contributor

dsherry commented Sep 16, 2020

@bchen1116 this PR is still a draft; did you mean to make it a non-draft?

@bchen1116 bchen1116 marked this pull request as ready for review September 16, 2020 21:47
@dsherry
Contributor

dsherry commented Sep 16, 2020

@bchen1116 yep you're right, I see that codecov is complaining about gen_utils.py. Because you happened to touch it, it flagged that it has sub-100% coverage.

One way to make it go away is to write a unit test which adds the missing coverage. You think you could look into that?

@angela97lin angela97lin left a comment
Contributor

Agreed with Dylan: ideally we'd figure out how to add coverage, if possible, so that this doesn't happen again when more lines are removed or changed in gen_utils.py. Otherwise LGTM.

@@ -22,6 +22,7 @@ Release Notes
 * Removed duplicate `nbsphinx` dependency in `dev-requirements.txt` :pr:`1168`
 * Users can now pass in any valid kwargs to all estimators :pr:`1157`
 * Remove broken accessor `OneHotEncoder.get_feature_names` and unneeded base class :pr:`1179`
+* Remove LightGBM Estimator from AutoML models :pr:`1186`
Can you make this past-tense?

@bchen1116
Contributor Author

@dsherry @angela97lin I think in order to test the jupyter_check in gen_utils.py, we'd need to add ipython to the test_requirements.txt file?

@dsherry
Contributor

dsherry commented Sep 17, 2020

@bchen1116 I see what you mean. Adding ipython to our unit tests would be one way to satisfy coverage. Another, arguably simpler way would be to mock it!

from unittest.mock import patch

import pytest

from evalml.utils.gen_utils import jupyter_check


@patch('evalml.utils.gen_utils.get_ipython')
def test_jupyter_check_mock(mock_get_ipython):
    mock_get_ipython.return_value = True
    assert jupyter_check()
    assert mock_get_ipython.called

    mock_get_ipython.reset_mock()
    mock_get_ipython.side_effect = NameError('BOOM')
    assert not jupyter_check()
    assert mock_get_ipython.called

    mock_get_ipython.reset_mock()
    mock_get_ipython.side_effect = Exception('BOOM')
    with pytest.raises(Exception, match='BOOM'):
        jupyter_check()
    assert mock_get_ipython.called


def test_jupyter_check_undefined():
    assert not jupyter_check()

(or something like that :) )

@dsherry
Contributor

dsherry commented Sep 17, 2020

@bchen1116 I made some updates to the above, because I forgot there's a try/except in that method.
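Editor's note: for readers following along, a hedged sketch of what a `jupyter_check`-style helper with that try/except might look like. Evalml's actual implementation may differ; here only `NameError` is caught, which is consistent with the test above expecting a generic `Exception` to propagate.

```python
# Illustrative sketch only; evalml's real jupyter_check may differ.
def jupyter_check():
    """Return True when running inside an IPython/Jupyter session."""
    try:
        get_ipython()  # only defined inside IPython; raises NameError otherwise
        return True
    except NameError:
        return False


print(jupyter_check())  # prints False in a plain Python interpreter
```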

@freddyaboulton freddyaboulton left a comment
Contributor

Thanks @bchen1116 !

@bchen1116 bchen1116 merged commit 7cf63a7 into main Sep 17, 2020
@freddyaboulton freddyaboulton deleted the bc_1093_lightgbm branch May 13, 2022 14:50