Sign error in QuantileLossFunction #8087

Closed
ahyoussef opened this Issue Dec 19, 2016 · 3 comments

ahyoussef commented Dec 19, 2016

Description

The QuantileLossFunction in sklearn/ensemble/gradient_boosting.py has a sign error, which makes it return negative loss values for certain inputs.

@pprett @amueller

Steps/Code to Reproduce

  • You can consult formula (2) in http://avesbiodiv.mncn.csic.es/estadistica/curso2011/qr4.pdf for the correct implementation.

  • The peculiar thing about the quantile loss function is that it is asymmetric, i.e. $Loss(y, pred)$ differs from $Loss(pred, y)$, but it must of course be non-negative for all inputs. See the graph at the top of page 1005 of the article above for a plot of the loss function.

  • The error has no dramatic consequences, because the negative gradient is implemented correctly; only the reported loss value carries the wrong sign.

  • This issue has been raised and fixed several times already, but the fixes were never merged (#6133 and #6429).

  • All that needs to be done is to replace + (1.0 - alpha) * diff[~mask] with - (1.0 - alpha) * diff[~mask] on lines 422 and 425; a sketch of the corrected computation follows this list.
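For reference, the standard quantile (pinball) loss is $\alpha (y - pred)$ when $y > pred$ and $(1 - \alpha)(pred - y)$ otherwise, so both branches are non-negative. Here is a minimal standalone sketch of that corrected computation (the quantile_loss helper below is hypothetical, written only to mirror the structure of the sklearn code the fix targets):

import numpy as np

def quantile_loss(y, pred, alpha):
    # Pinball loss: alpha-weighted where we under-predict (y > pred),
    # (1 - alpha)-weighted where we over-predict (y <= pred).
    # Both terms are non-negative, so the average is non-negative.
    diff = y - pred
    mask = y > pred
    return (alpha * diff[mask].sum()
            - (1.0 - alpha) * diff[~mask].sum()) / y.shape[0]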

Here is code that reproduces the wrong behavior:

import numpy as np
from sklearn.ensemble.gradient_boosting import QuantileLossFunction

# Median regression (alpha = 0.5): predicting 0 for a true value of -1
# should cost (1 - alpha) * |y - pred| = 0.5, never a negative value.
y = np.array([-1.0])
pred = np.array([0.0])
print(QuantileLossFunction(1, 0.5)(y, pred))

Expected Results

0.5

Actual Results

-0.5
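For comparison, the hypothetical quantile_loss sketch above returns the expected value and makes the asymmetry mentioned earlier concrete (it is only visible for alpha != 0.5):

import numpy as np

# Assuming the quantile_loss sketch defined earlier in this issue.
print(quantile_loss(np.array([-1.0]), np.array([0.0]), 0.5))  # 0.5
# Swapping y and pred changes the loss when alpha != 0.5:
print(quantile_loss(np.array([0.0]), np.array([1.0]), 0.9))   # ~0.1
print(quantile_loss(np.array([1.0]), np.array([0.0]), 0.9))   # 0.9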

Versions

Darwin-16.3.0-x86_64-i386-64bit
Python 3.5.2 |Anaconda custom (x86_64)| (default, Jul 2 2016, 17:52:12)
[GCC 4.2.1 Compatible Apple LLVM 4.2 (clang-425.0.28)]
NumPy 1.11.2
SciPy 0.18.1
Scikit-Learn 0.19.dev0

agramfort commented Dec 21, 2016

It was not merged because the PRs did not contain any tests.
Please finish the work and we'll be happy to merge.
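A minimal sketch of the kind of non-regression test such a PR could include (the test name is hypothetical):

import numpy as np
from sklearn.ensemble.gradient_boosting import QuantileLossFunction

def test_quantile_loss_sign():
    # The exact case from this issue: median regression (alpha = 0.5),
    # true value -1, prediction 0, expected loss 0.5.
    loss = QuantileLossFunction(1, 0.5)(np.array([-1.0]), np.array([0.0]))
    assert np.isclose(loss, 0.5)
    # The loss must be non-negative for arbitrary inputs and quantiles.
    rng = np.random.RandomState(0)
    y, pred = rng.randn(100), rng.randn(100)
    for alpha in (0.1, 0.5, 0.9):
        assert QuantileLossFunction(1, alpha)(y, pred) >= 0.0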

devanshdalal commented Dec 27, 2016

This should be closed if the work is done now.

ahyoussef commented Dec 27, 2016

All looks fine indeed.

ahyoussef closed this Dec 27, 2016
