[dask] Complete features in regressor and classifier. #6471
Conversation
trivialfis
commented
Dec 5, 2020
- Add eval_metric.
- Add custom objective.
- Add callback.
- Add feature weights.
Is this blocking?
No. It's a PR for reaching feature parity with the native model.
There are still some features like
FYI: ntree_limit = 0 usually has to be passed for gblinear during predict, or an assertion is hit, so if ntree_limit support is not added here, the wrapper would have to pass it automatically.
@pseudotensor gblinear is not used in distributed settings.
@pseudotensor We added a new feature for slicing a tree model: booster[0:3] will get the trees in the [0, 3) boosting rounds. See https://xgboost.readthedocs.io/en/latest/python/model.html .
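The half-open slicing semantics described above can be sketched with a minimal stand-in class. This is not the real `xgboost.Booster` implementation, only an illustration of the contract that `booster[0:3]` yields the trees from boosting rounds 0, 1, and 2:

```python
class SlicedBooster:
    """Toy stand-in for a tree booster: one tree label per boosting round."""

    def __init__(self, trees):
        self.trees = list(trees)

    def __getitem__(self, rounds):
        # booster[a:b] returns a new booster holding the trees from
        # boosting rounds [a, b) -- the upper bound is exclusive.
        if isinstance(rounds, slice):
            return SlicedBooster(self.trees[rounds])
        raise TypeError("only slice indexing is supported")


booster = SlicedBooster(f"tree_{i}" for i in range(10))
sub = booster[0:3]
print(sub.trees)  # ['tree_0', 'tree_1', 'tree_2']
```

With the real API, the sliced object is itself a `Booster`, so it can be used for prediction with only the selected rounds.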
Codecov Report
@@            Coverage Diff             @@
##           master    #6471      +/-   ##
==========================================
- Coverage   80.13%   80.09%    -0.05%
==========================================
  Files          13       13
  Lines        3513     3531       +18
==========================================
+ Hits         2815     2828       +13
- Misses        698      703        +5
Continue to review full report at Codecov.
Force-pushed 4d57fee to a94ff23
* Add eval_metric. * Add callback. * Add feature weights. * Add custom objective.
Force-pushed a94ff23 to 3ee7c0f
@hcho3 @RAMitchell please help review.