Can we initialize xgboost with the outputs of other classifiers? #27
Comments
This is an interesting question; it is not currently possible. Usually, though, you can add the random forest prediction as one extra feature column.
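The workaround above (stacking one model's predictions as a feature for the next) can be sketched as follows; a minimal numpy-only illustration where `X` and `rf_pred` are hypothetical stand-ins for your feature matrix and the random forest's predicted probabilities:

```python
import numpy as np

# Hypothetical data: X is the original feature matrix, rf_pred stands in
# for random-forest probabilities, e.g. clf.predict_proba(X)[:, 1].
X = np.random.rand(100, 5)
rf_pred = np.random.rand(100)

# Append the random-forest prediction as one extra feature column;
# xgboost is then trained on the augmented matrix as usual.
X_aug = np.column_stack([X, rf_pred])
print(X_aug.shape)  # one more column than X
```

Using out-of-fold predictions for `rf_pred` (rather than in-sample ones) is the usual way to avoid leaking training labels into the stacked feature.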
OK, I am adding this enhancement in the new version. Although the new version is not yet stable for release, you can try it in the unity branch: https://github.com/tqchen/xgboost/tree/unity
Great! The demo.py works well (though it only runs under Python 2 and should be ported to Python 3 anyway). I will analyze different initialization approaches: one using max_depth=2 boosted trees, another perhaps using random forest.
I don't have Python 2 on my machine, so I don't know what the exact issue is, but I suppose it could be made compatible. As noted in the comment, remember to pass the margin in: if you are using logistic loss and your prediction is a probability, you need to apply the inverse logistic transform before passing it to set_base_margin.
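The inverse logistic transform mentioned here maps probabilities back to log-odds (margin) space. A minimal sketch, with the xgboost call shown only as a comment since the surrounding data setup is hypothetical:

```python
import numpy as np

def inverse_logistic(p, eps=1e-12):
    """Map probabilities p in (0, 1) back to margin (log-odds) space."""
    p = np.clip(p, eps, 1.0 - eps)  # guard against log(0)
    return np.log(p / (1.0 - p))

# Probabilities predicted by another classifier (illustrative values).
prob = np.array([0.1, 0.5, 0.9])
margin = inverse_logistic(prob)

# Sanity check: applying the logistic (sigmoid) function to the margin
# recovers the original probabilities.
recovered = 1.0 / (1.0 + np.exp(-margin))
print(np.allclose(recovered, prob))  # True

# With xgboost available, the margin would then be attached to the
# training DMatrix before boosting, e.g.:
#   dtrain = xgb.DMatrix(X, label=y)
#   dtrain.set_base_margin(margin)
```

Because xgboost adds the base margin before applying the logistic link, passing raw probabilities instead of log-odds would bias the initialization.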
The new code is now merged into master.
Hi, how can I initialize xgboost with the outputs of other classifiers? For example, how can I initialize xgboost with a random forest?
Liu