Calling XGBModel.fit() should clear the Booster by default #6562

Merged: 4 commits, Dec 31, 2020
Changes from 3 commits
18 changes: 15 additions & 3 deletions python-package/xgboost/sklearn.py
@@ -501,8 +501,9 @@ def _configure_fit(
         eval_metric: Optional[Union[Callable, str, List[str]]],
         params: Dict[str, Any],
     ) -> Tuple[Booster, Optional[Union[Callable, str, List[str]]], Dict[str, Any]]:
-        model = self._Booster if hasattr(self, "_Booster") else None
-        model = booster if booster is not None else model
+        model = booster
+        if hasattr(model, '_Booster'):
+            model = model._Booster  # Handle the case when xgb_model is a sklearn model object
         feval = eval_metric if callable(eval_metric) else None
         if eval_metric is not None:
             if callable(eval_metric):
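The new `hasattr(model, '_Booster')` branch means `xgb_model` no longer has to be a raw `Booster`: a fitted scikit-learn-style estimator can be passed directly and its internal Booster is used. A minimal sketch (the random data and model sizes are illustrative):

```python
import numpy as np
import xgboost as xgb

rng = np.random.RandomState(0)
X, y = rng.rand(100, 10), rng.rand(100)

# Train an initial model, then pass the fitted estimator itself as xgb_model;
# _configure_fit unwraps its internal Booster via the new hasattr check.
first = xgb.XGBRegressor(n_estimators=10).fit(X, y)
second = xgb.XGBRegressor(n_estimators=10)
second.fit(X, y, xgb_model=first)
```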
@@ -518,7 +519,11 @@ def fit(self, X, y, *, sample_weight=None, base_margin=None,
             feature_weights=None,
             callbacks=None):
         # pylint: disable=invalid-name,attribute-defined-outside-init
-        """Fit gradient boosting model
+        """Fit gradient boosting model.
+
+        Note that calling ``fit()`` multiple times will cause the model object to be
+        re-fit from scratch. To resume training from a previous checkpoint, explicitly
+        pass the ``xgb_model`` argument.
 
         Parameters
         ----------
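The note above is the user-visible consequence of the `_configure_fit` change: `fit()` no longer silently resumes from a previously trained Booster. A minimal sketch of both behaviors (random data is illustrative):

```python
import numpy as np
import xgboost as xgb

rng = np.random.RandomState(0)
X, y = rng.rand(100, 10), rng.rand(100)

clf = xgb.XGBRegressor(n_estimators=10)
clf.fit(X, y)  # trains a fresh Booster with 10 trees
clf.fit(X, y)  # re-fits from scratch: still 10 trees, not 20

# To continue from the previous checkpoint, pass it explicitly:
continued = xgb.XGBRegressor(n_estimators=10)
continued.fit(X, y, xgb_model=clf.get_booster())  # 10 old trees + 10 new ones
```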
@@ -1212,6 +1217,10 @@ def fit(self, X, y, *, group, sample_weight=None, base_margin=None,
         # pylint: disable = attribute-defined-outside-init,arguments-differ
         """Fit gradient boosting ranker
+
+        Note that calling ``fit()`` multiple times will cause the model object to be
+        re-fit from scratch. To resume training from a previous checkpoint, explicitly
+        pass the ``xgb_model`` argument.
 
         Parameters
         ----------
         X : array_like
@@ -1322,6 +1331,9 @@ def fit(self, X, y, *, group, sample_weight=None, base_margin=None,
                 raise ValueError(
                     'Custom evaluation metric is not yet supported for XGBRanker.')
             params.update({'eval_metric': eval_metric})
+        if hasattr(xgb_model, '_Booster'):
+            # Handle the case when xgb_model is a sklearn model object
+            xgb_model = xgb_model._Booster
 
         self._Booster = train(params, train_dmatrix,
                               self.n_estimators,
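`XGBRanker.fit` gets the same unwrapping just before `train()` is called, so a ranker can also resume from either a `Booster` or a fitted estimator. A minimal sketch with made-up query groups:

```python
import numpy as np
import xgboost as xgb

rng = np.random.RandomState(0)
X = rng.rand(100, 10)
y = rng.randint(0, 5, size=100)  # graded relevance labels
group = [20, 20, 20, 20, 20]     # five queries of 20 documents each

ranker = xgb.XGBRanker(n_estimators=10)
ranker.fit(X, y, group=group)

# Passing the fitted ranker itself works: the new hasattr check extracts its Booster.
resumed = xgb.XGBRanker(n_estimators=10)
resumed.fit(X, y, group=group, xgb_model=ranker)
```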