FIX fix multinomial deviance by taking the weighted average instead of the sum #17694
Conversation
Conflicts:
	sklearn/ensemble/_gb.py
	sklearn/ensemble/tests/test_gradient_boosting_loss_functions.py
Thanks @t-kusanagi2 for taking care of that. @glemaitre, you reviewed the original PR; do you mind checking whether your comments have been addressed? Thanks!
We will need an entry in what's new for 0.24.
sklearn/ensemble/tests/test_gradient_boosting_loss_functions.py
Co-authored-by: Guillaume Lemaitre <g.lemaitre58@gmail.com>
I added to
I am happy with it. We will need a second reviewer.
Here are a few comments.
…nagi2/scikit-learn into multinomial-deviance-mean
Whenever I read this code, I am not sure that fitting a weighted regression tree to predict the unweighted deviance gradient, plus the leaf-value fix-up, is equivalent to fitting a weighted tree on the weighted deviance.
At some point we should write some tests to check this, but that is outside the scope of this PR.
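One property that such tests could start from: with a weighted *average* deviance, giving a sample an integer weight should be equivalent to duplicating that sample. Below is a hedged, self-contained sketch of that invariant. The `multinomial_deviance` helper here is illustrative only, not scikit-learn's actual `MultinomialDeviance` implementation; it just computes the weighted mean multinomial log loss from one-hot labels and raw (pre-softmax) scores.

```python
import numpy as np

def multinomial_deviance(y_onehot, raw_pred, sample_weight=None):
    """Weighted *average* multinomial deviance (illustrative sketch).

    y_onehot : (n_samples, n_classes) one-hot encoded labels
    raw_pred : (n_samples, n_classes) raw, pre-softmax scores
    """
    if sample_weight is None:
        sample_weight = np.ones(y_onehot.shape[0])
    # per-sample log loss: log-sum-exp of the scores minus the true-class score
    lse = np.log(np.exp(raw_pred).sum(axis=1))
    per_sample = lse - (y_onehot * raw_pred).sum(axis=1)
    # weighted average instead of a sum -- the gist of this PR's fix
    return np.average(per_sample, weights=sample_weight)

rng = np.random.RandomState(0)
y = np.eye(3)[rng.randint(0, 3, size=5)]
pred = rng.randn(5, 3)

# weighting the first sample by 2 should match duplicating it
w = np.array([2.0, 1.0, 1.0, 1.0, 1.0])
y_dup = np.vstack([y[[0]], y])
pred_dup = np.vstack([pred[[0]], pred])
assert np.isclose(multinomial_deviance(y, pred, w),
                  multinomial_deviance(y_dup, pred_dup))
```

Note that with a plain sum instead of `np.average`, this invariant would still hold, but the loss would grow with `n_samples`, which is what the average fixes.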
Thanks @t-kusanagi2
…f the sum (scikit-learn#17694) Co-authored-by: Markus Rempfler <markus.rempfler@tum.de> Co-authored-by: Guillaume Lemaitre <g.lemaitre58@gmail.com> Co-authored-by: Olivier Grisel <olivier.grisel@ensta.org>
Reference Issues/PRs
Fixes #10055
Closes #10081
What does this implement/fix? Explain your changes.
Changed MultinomialDeviance to compute the weighted average log loss instead of the total (weighted sum).
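The change can be sketched in a few lines. This is a minimal illustration of the sum-vs-average distinction, not the actual scikit-learn code; the per-sample values and weights below are made up, and as I understand it the average form matches the convention of `sklearn.metrics.log_loss` with its default normalization.

```python
import numpy as np

# per-sample multinomial deviance terms (illustrative values)
per_sample = np.array([0.2, 0.9, 0.4])
sample_weight = np.array([1.0, 2.0, 1.0])

# before the fix: weighted *sum* of the per-sample losses
total = np.sum(sample_weight * per_sample)

# after the fix: weighted *average*, so the value no longer
# scales with the number of samples
average = total / np.sum(sample_weight)

print(total, average)
```

The averaged form keeps the deviance comparable across datasets of different sizes.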
Any other comments?
This picks up the stalled PR #10081; the main modification comes from that PR.
What I did was merge the recent master branch and fix the merge conflicts and flake8 errors.