remove lightgbm from automl #1186
Conversation
Codecov Report
@@ Coverage Diff @@
## main #1186 +/- ##
==========================================
- Coverage 99.92% 99.92% -0.01%
==========================================
Files 196 196
Lines 11814 11813 -1
==========================================
- Hits 11805 11804 -1
Misses 9 9
Continue to review full report at Codecov.
Codecov/project not passing, likely due to the untested line in
LGTM 👍
```diff
@@ -155,7 +155,7 @@ def _get_subclasses(base_class):

 _not_used_in_automl = {'BaselineClassifier', 'BaselineRegressor',
                        'ModeBaselineBinaryPipeline', 'BaselineBinaryPipeline', 'MeanBaselineRegressionPipeline',
-                       'BaselineRegressionPipeline', 'ModeBaselineMulticlassPipeline', 'BaselineMulticlassPipeline'}
+                       'BaselineRegressionPipeline', 'ModeBaselineMulticlassPipeline', 'BaselineMulticlassPipeline', 'LightGBMClassifier'}
```
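For context, a name-based block list like `_not_used_in_automl` is typically applied when collecting estimator subclasses. The sketch below is illustrative only; the class names and the `_get_subclasses`/`automl_estimators` helpers here are hypothetical stand-ins, not evalml's actual API.

```python
# Hypothetical sketch of filtering subclasses by name; the classes and
# helper functions below are illustrative, not evalml's real implementation.
_not_used_in_automl = {'BaselineClassifier', 'LightGBMClassifier'}


class Estimator:
    pass


class BaselineClassifier(Estimator):
    pass


class LightGBMClassifier(Estimator):
    pass


class RandomForestClassifier(Estimator):
    pass


def _get_subclasses(base_class):
    # Direct subclasses registered via Python's class machinery
    return base_class.__subclasses__()


def automl_estimators(base_class):
    # Drop any subclass whose class name appears in the block list
    return [cls for cls in _get_subclasses(base_class)
            if cls.__name__ not in _not_used_in_automl]


print([cls.__name__ for cls in automl_estimators(Estimator)])
```

Adding `'LightGBMClassifier'` to the set is enough to exclude it from the collected candidates without deleting the class itself.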
Yep! Thanks
@bchen1116 this PR is still a draft, did you mean to make it a non-draft?
@bchen1116 yep you're right, I see that codecov is complaining about the untested line. One way to make it go away is to write a unit test which adds the missing coverage. Do you think you could look into that?
Agreed with Dylan; ideally it'd be best to figure out how to add coverage if possible, so that this doesn't happen again when more lines are removed or changed in gen_utils.py, but otherwise LGTM.
docs/source/release_notes.rst
Outdated
```diff
@@ -22,6 +22,7 @@ Release Notes
 * Removed duplicate `nbsphinx` dependency in `dev-requirements.txt` :pr:`1168`
 * Users can now pass in any valid kwargs to all estimators :pr:`1157`
 * Remove broken accessor `OneHotEncoder.get_feature_names` and unneeded base class :pr:`1179`
+* Remove LightGBM Estimator from AutoML models :pr:`1186`
```
Can you make this past-tense?
@dsherry @angela97lin I think in order to test the
@bchen1116 I see what you mean. Adding

```python
from unittest.mock import patch

import pytest

from evalml.utils.gen_utils import jupyter_check


@patch('evalml.utils.gen_utils.get_ipython')
def test_jupyter_check_mock(mock_get_ipython):
    # get_ipython returns a truthy value: we're "in" a notebook
    mock_get_ipython.return_value = True
    assert jupyter_check()
    assert mock_get_ipython.called

    # NameError is caught by jupyter_check's try/except
    mock_get_ipython.reset_mock()
    mock_get_ipython.side_effect = NameError('BOOM')
    assert not jupyter_check()
    assert mock_get_ipython.called

    # Any other exception should propagate
    mock_get_ipython.reset_mock()
    mock_get_ipython.side_effect = Exception('BOOM')
    with pytest.raises(Exception, match='BOOM'):
        jupyter_check()
    assert mock_get_ipython.called


def test_jupyter_check_undefined():
    assert not jupyter_check()
```

(or something like that :) )
@bchen1116 I made some updates to the above, because I forgot there's a try/except in that method.
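For reference, a minimal sketch of what a `jupyter_check` built around a try/except on `get_ipython` could look like. This is an assumption for illustration; the actual evalml source may differ.

```python
def jupyter_check():
    """Guess whether we are running inside an IPython/Jupyter session.

    IPython injects get_ipython into the global namespace, so calling it
    from a plain Python interpreter raises NameError, which we treat as
    "not in a notebook". Any other exception propagates to the caller.
    """
    try:
        return get_ipython() is not None  # noqa: F821 - defined only under IPython
    except NameError:
        return False
```

This matches the behavior the mocked tests above exercise: a truthy `get_ipython` means `True`, `NameError` means `False`, and any other exception bubbles up.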
Thanks @bchen1116 !
Remove LightGBM from AutoML models for now, as it was added without proper performance tests.