Can we initialize xgboost with the outputs of other classifiers? #27

Closed · liubenyuan opened this issue Aug 16, 2014 · 5 comments

@liubenyuan

Hi, how can I initialize xgboost with the outputs of other classifiers? For example, can I initialize xgboost with randomForest?

Liu

@tqchen (Member) commented Aug 16, 2014

This is an interesting question; currently it is not possible. However, you can usually put the prediction of randomForest in as one extra feature column.
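A minimal sketch of that workaround, assuming scikit-learn's RandomForestClassifier and the xgboost Python API (neither is specified in the thread); the toy data and all variable names are placeholders for illustration:

```python
import numpy as np
import xgboost as xgb
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Toy data, purely for illustration.
X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Fit the random forest and take its positive-class probability
# as one extra feature column for xgboost.
rf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)
rf_train = rf.predict_proba(X_train)[:, [1]]
rf_test = rf.predict_proba(X_test)[:, [1]]

dtrain = xgb.DMatrix(np.hstack([X_train, rf_train]), label=y_train)
dtest = xgb.DMatrix(np.hstack([X_test, rf_test]), label=y_test)

params = {"objective": "binary:logistic", "max_depth": 3, "eta": 0.1}
bst = xgb.train(params, dtrain, num_boost_round=50)
pred = bst.predict(dtest)  # probabilities that also "see" the RF output
```

Note that feeding the forest its own training data back in can leak the fit; in practice, out-of-fold predictions for the training rows would be a safer choice.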

@tqchen (Member) commented Aug 18, 2014

OK, I am adding this enhancement in the new version. Although the new version is not yet stable for release, you can try it in the unity branch: https://github.com/tqchen/xgboost/tree/unity
Example script: https://github.com/tqchen/xgboost/blob/unity/python/example/demo.py#L98
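For reference, a rough sketch of the boost-from-prediction pattern the linked example illustrates (not the script itself); it reuses the dtrain/dtest placeholders from the sketch above and the current xgboost Python API, which may differ slightly from the 2014 unity branch:

```python
import xgboost as xgb

# dtrain / dtest are the placeholder DMatrix objects from the sketch above.
params = {"objective": "binary:logistic", "max_depth": 2, "eta": 1.0}

# Step 1: train an initial model for one round.
bst1 = xgb.train(params, dtrain, num_boost_round=1)

# Step 2: predict raw margin scores (not probabilities).
ptrain = bst1.predict(dtrain, output_margin=True)
ptest = bst1.predict(dtest, output_margin=True)

# Step 3: use those margins as the starting point and keep boosting.
dtrain.set_base_margin(ptrain)
dtest.set_base_margin(ptest)
bst2 = xgb.train(params, dtrain, num_boost_round=10)
```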

@liubenyuan (Author)

Great! demo.py works well (although it only runs on Python 2 and should be ported to Python 3 anyway). I will analyze different ways of initializing: one using a max_depth=2 boosted tree, another perhaps using randomForest.

@tqchen (Member) commented Aug 19, 2014

I don't have python2 on my machine, so I don't know what the exact issue is there, but I suppose it could be made compatible. As noted in the comment, remember to pass in a margin: if you are using logistic loss and your prediction is a probability, you need to apply the inverse logistic transform to it before putting it into set_base_margin.
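A minimal sketch of that transform, assuming the other model's outputs are probabilities in (0, 1); the rf, X_train, and dtrain names come from the earlier sketches, not from the thread:

```python
import numpy as np

def inverse_logistic(p, eps=1e-6):
    # Clip to avoid +/-inf at probabilities of exactly 0 or 1.
    p = np.clip(p, eps, 1.0 - eps)
    return np.log(p / (1.0 - p))

# Convert the random forest's probabilities into margin scores
# before handing them to xgboost as the starting point.
rf_prob = rf.predict_proba(X_train)[:, 1]           # probabilities in (0, 1)
dtrain.set_base_margin(inverse_logistic(rf_prob))   # xgboost boosts from this margin
```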

@tqchen (Member) commented Aug 23, 2014

The new code is now merged into master.
