Does Xgboost do Newton boosting? #3227

Closed
Orbean opened this issue Apr 8, 2018 · 3 comments

Comments

@Orbean commented Apr 8, 2018

In the paper below, "Newton boosting" is mentioned as the boosting algorithm used by XGBoost (page 42), which requires calculating a Hessian matrix.

"Tree Boosting With XGBoost"
https://brage.bibsys.no/xmlui/bitstream/handle/11250/2433761/16128_FULLTEXT.pdf

As far as I know, calculating a Hessian matrix is computationally demanding.
Does XGBoost perform a 2nd-order Taylor expansion of the loss function, or just 2nd-order differentiation?

@yordanivanov92 commented Apr 12, 2018

I would love to see an answer to this. Also, overall pseudocode for the xgboost package would be nice, similar to what is given for the gbm package.

@hcho3 (Collaborator) commented May 23, 2018

> calculating a Hessian matrix is computationally demanding

XGBoost does not compute the Hessian matrix. Instead, it computes the second partial derivative of the (element-wise) loss function with respect to the predicted label:
[Screenshot taken from my master's thesis]
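For concreteness, here is a minimal sketch of what those element-wise derivatives look like for squared-error loss, written against XGBoost's custom-objective interface. The function name `squared_error_obj` and the toy data are made up for illustration; only a vector of per-example first and second derivatives is ever needed, not an n-by-n Hessian matrix.

```python
import numpy as np
import xgboost as xgb

# Custom objective for squared-error loss l(y, yhat) = (yhat - y)^2 / 2.
# XGBoost only needs the element-wise derivatives with respect to each prediction:
#   grad_i = yhat_i - y_i   (first derivative)
#   hess_i = 1              (second derivative; one scalar per example,
#                            not an n-by-n Hessian matrix)
def squared_error_obj(preds, dtrain):
    labels = dtrain.get_label()
    grad = preds - labels          # element-wise gradients, shape (n,)
    hess = np.ones_like(preds)     # element-wise second derivatives, shape (n,)
    return grad, hess

# Toy usage (data is made up for illustration).
rng = np.random.default_rng(0)
X = rng.random((100, 5))
y = X @ np.array([1.0, 2.0, 0.0, -1.0, 0.5]) + 0.1 * rng.standard_normal(100)
dtrain = xgb.DMatrix(X, label=y)
booster = xgb.train({"max_depth": 3, "eta": 0.1}, dtrain,
                    num_boost_round=20, obj=squared_error_obj)
```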

@hcho3 closed this May 23, 2018
@hcho3 (Collaborator) commented May 23, 2018

@yordanivanov92 And here's pseudo-code for gradient boosting:
[Screenshot taken from my master's thesis]
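For readers without access to the screenshot, here is a minimal Python sketch of the second-order ("Newton") boosting update as described in the XGBoost paper. The helper names and default hyperparameter values are illustrative, not the actual implementation; the greedy split search and regularization details are simplified.

```python
import numpy as np

def optimal_leaf_weight(grad, hess, reg_lambda=1.0):
    """Newton-step leaf weight: w* = -G / (H + lambda), where G and H are
    the sums of the element-wise gradients / second derivatives in the leaf."""
    return -grad.sum() / (hess.sum() + reg_lambda)

def split_gain(grad_left, hess_left, grad_right, hess_right,
               reg_lambda=1.0, gamma=0.0):
    """Reduction in the second-order (Taylor-expanded) objective obtained by
    splitting one node into a left and a right child."""
    def score(g_sum, h_sum):
        return g_sum ** 2 / (h_sum + reg_lambda)
    gl, hl = grad_left.sum(), hess_left.sum()
    gr, hr = grad_right.sum(), hess_right.sum()
    return 0.5 * (score(gl, hl) + score(gr, hr) - score(gl + gr, hl + hr)) - gamma

# Rough outline of one boosting round (not the actual implementation):
#   1. Compute predictions of the current ensemble, yhat_i.
#   2. Compute element-wise g_i = dl/dyhat_i and h_i = d^2 l / dyhat_i^2.
#   3. Grow a tree greedily, choosing splits that maximize split_gain(...).
#   4. Assign each leaf the weight optimal_leaf_weight(...).
#   5. Add the new tree to the ensemble, scaled by the learning rate.
```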

@lock bot locked as resolved and limited conversation to collaborators Oct 25, 2018