MNT Use check_scalar in AdaBoostRegressor #21605
Conversation
I noticed this under the `# Check parameters` comment:

```python
if self.learning_rate <= 0:
    raise ValueError("learning_rate must be greater than zero")
```

Should I delete this one? The replacement message would be `f"learning_rate == {self.learning_rate}, must be > 0."`
sklearn/ensemble/_weight_boosting.py
```python
check_scalar(
    self.n_estimators,
    "n_estimators",
    target_type=numbers.Integral,
    min_val=1,
    include_boundaries="left",
)

check_scalar(
    self.learning_rate,
    "learning_rate",
    target_type=numbers.Real,
    min_val=0,
    include_boundaries="neither",
)
```
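For context, `check_scalar` bundles a type check and a bound check into one call. Here is a simplified stdlib-only stand-in (`check_scalar_sketch` is a hypothetical name, not scikit-learn's implementation) showing the behavior these calls rely on:

```python
import numbers

def check_scalar_sketch(x, name, target_type, min_val=None,
                        include_boundaries="both"):
    """Simplified stand-in for sklearn.utils.check_scalar (illustration only)."""
    # Validate the type first, then the lower bound.
    if not isinstance(x, target_type):
        raise TypeError(f"{name} must be an instance of {target_type}.")
    if min_val is not None:
        # "left"/"both" means min_val itself is allowed; "neither" excludes it.
        inclusive = include_boundaries in ("left", "both")
        if (x < min_val) if inclusive else (x <= min_val):
            op = ">=" if inclusive else ">"
            raise ValueError(f"{name} == {x}, must be {op} {min_val}.")
    return x
```

With `include_boundaries="neither"` and `min_val=0`, a `learning_rate` of exactly `0.0` is rejected, which matches the intent of the old `<= 0` check.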
Should I delete this one?
We can move both of these checks into `BaseWeightBoosting`, and replace the check for `self.learning_rate`:
scikit-learn/sklearn/ensemble/_weight_boosting.py
Lines 115 to 116 in c9e5067
```python
if self.learning_rate <= 0:
    raise ValueError("learning_rate must be greater than zero")
```
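Moving the shared checks up the hierarchy might look like the following sketch (class and method names are hypothetical, not scikit-learn's actual code); both `AdaBoostClassifier` and `AdaBoostRegressor` would then inherit the validation:

```python
import numbers

class BaseWeightBoostingSketch:
    """Hypothetical base class holding the parameter checks shared by
    AdaBoostClassifier and AdaBoostRegressor (illustration only)."""

    def __init__(self, n_estimators=50, learning_rate=1.0):
        self.n_estimators = n_estimators
        self.learning_rate = learning_rate

    def _check_params(self):
        # n_estimators: integer, >= 1 (boundary included).
        if not isinstance(self.n_estimators, numbers.Integral) or self.n_estimators < 1:
            raise ValueError(f"n_estimators == {self.n_estimators}, must be >= 1.")
        # learning_rate: real, strictly > 0 (boundary excluded).
        if not isinstance(self.learning_rate, numbers.Real) or self.learning_rate <= 0:
            raise ValueError(f"learning_rate == {self.learning_rate}, must be > 0.")
```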
Thank you. In that case I will remove the checks I added to `AdaBoostClassifier` too. They are the exact same checks as what I am about to move into `BaseWeightBoosting`.
…nto AdaBoostRegressor_add_check_scaler
@genvalen After discussion with @glemaitre, this PR should be prefixed with "MAINT" (and yes, confirming that it does mean "maintenance").
Okay! So just to be clear, the proper abbreviation is "MAINT" and not "MNT"? Or is either one fine?
…nto AdaBoostRegressor_add_check_scaler
All is good for me. LGTM.
Thanks @genvalen
Thanks for following up!
Co-authored-by: Thomas J. Fan <thomasjpfan@gmail.com>
LGTM
@glemaitre @thomasjpfan @jjerphan @ogrisel
Thanks for following up, @reshamas. Items 1, 2, and 3 look good to me. I would also make sure the error messages in the tests are checked.
@jjerphan
```python
reg = AdaBoostRegressor(loss="foo")
with pytest.raises(ValueError):
    reg.fit(X, y_class)

clf = AdaBoostClassifier(algorithm="foo")
with pytest.raises(ValueError):
    clf.fit(X, y_class)
```
@reshamas: I should have been clearer. Here, for instance, there is a check that a `ValueError` is raised, but its error message is not checked.
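The point about checking messages is that `pytest.raises` accepts a `match` regex, so the test pins down *which* validation fired. A stdlib sketch of that mechanism (a hypothetical `raises` helper standing in for pytest, for illustration):

```python
import re
from contextlib import contextmanager

@contextmanager
def raises(exc_type, match=None):
    """Stdlib stand-in for pytest.raises(exc_type, match=...) (illustration only)."""
    try:
        yield
    except exc_type as e:
        # Exception type matched; optionally verify the message too.
        if match is not None and not re.search(match, str(e)):
            raise AssertionError(f"pattern {match!r} not found in {e!r}")
    else:
        raise AssertionError(f"{exc_type.__name__} was not raised")

# Matching only the type would let the wrong error slip through;
# matching the message confirms the intended check is the one that fired.
with raises(ValueError, match="must be > 0"):
    raise ValueError("learning_rate == 0, must be > 0.")
```

In the real tests this would be `pytest.raises(ValueError, match="...")` around the `fit` calls shown above.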
Should we re-open this one? Error messages are important and helpful, correct?
Here we go: #22144
@reshamas Noted. Thanks! I'll incorporate these consistency points moving forward.
Co-authored-by: Thomas J. Fan <thomasjpfan@gmail.com>
Reference Issues/PRs
Addresses #20724
#DataUmbrella
What does this implement/fix? Explain your changes.
Summary of changes to `AdaBoostRegressor`:
- Use `check_scalar` from `sklearn.utils` to validate the scalar parameters.

Test and validation progress:
References
Any other comments?