
automl.refit() XGBoostError: base_score must be in (0,1) for logistic loss #189

Closed
zygmuntz opened this issue Nov 11, 2016 · 2 comments

@zygmuntz

I get the following error when running refit( x_train, y_train ):

In [18]: automl.refit( x_train, y_train )
---------------------------------------------------------------------------
XGBoostError                              Traceback (most recent call last)
<ipython-input-18-16f19027783b> in <module>()
----> 1 automl.refit( x_train, y_train )

/usr/local/anaconda/lib/python2.7/site-packages/autosklearn/estimators.pyc in refit(self, X, y)
	46 
	47         """
---> 48         return self._automl.refit(X, y)
	49 
	50     def fit_ensemble(self, y, task=None, metric=None, precision='32',

/usr/local/anaconda/lib/python2.7/site-packages/autosklearn/automl.pyc in refit(self, X, y)
	424                 # this updates the model inplace, it can then later be used in
	425                 # predict method
--> 426                 model.fit(X.copy(), y.copy())
	427 
	428         self._can_predict = True

/usr/local/anaconda/lib/python2.7/site-packages/autosklearn/pipeline/base.pyc in fit(self, X, y, fit_params, init_params)
	61         X, fit_params = self.pre_transform(X, y, fit_params=fit_params,
	62                                           init_params=init_params)
---> 63         self.fit_estimator(X, y, **fit_params)
	64         return self
	65 

/usr/local/anaconda/lib/python2.7/site-packages/autosklearn/pipeline/base.pyc in fit_estimator(self, X, y, **fit_params)
	136         if fit_params is None:
	137             fit_params = {}
--> 138         self.pipeline_.steps[-1][-1].fit(X, y, **fit_params)
	139         return self
	140 

/usr/local/anaconda/lib/python2.7/site-packages/autosklearn/pipeline/components/classification/xgradient_boosting.pyc in fit(self, X, y)
	127                 seed=self.seed
	128                 )
--> 129         self.estimator.fit(X, y)
	130 
	131         return self

/usr/local/anaconda/lib/python2.7/site-packages/xgboost/sklearn.pyc in fit(self, X, y, sample_weight, eval_set, eval_metric, early_stopping_rounds, verbose)
	341                               early_stopping_rounds=early_stopping_rounds,
	342                               evals_result=evals_result, feval=feval,
--> 343                               verbose_eval=verbose)
	344 
	345         if evals_result:

/usr/local/anaconda/lib/python2.7/site-packages/xgboost/training.pyc in train(params, dtrain, num_boost_round, evals, obj, feval, maximize, early_stopping_rounds, evals_result, verbose_eval, learning_rates, xgb_model)
	119     if not early_stopping_rounds:
	120         for i in range(num_boost_round):
--> 121             bst.update(dtrain, i, obj)
	122             nboost += 1
	123             if len(evals) != 0:

/usr/local/anaconda/lib/python2.7/site-packages/xgboost/core.pyc in update(self, dtrain, iteration, fobj)
	692 
	693         if fobj is None:
--> 694             _check_call(_LIB.XGBoosterUpdateOneIter(self.handle, iteration, dtrain.handle))
	695         else:
	696             pred = self.predict(dtrain)

/usr/local/anaconda/lib/python2.7/site-packages/xgboost/core.pyc in _check_call(ret)
	95     """
	96     if ret != 0:
---> 97         raise XGBoostError(_LIB.XGBGetLastError())
	98 
	99 

XGBoostError: base_score must be in (0,1) for logistic loss
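
For reference, the failure originates in XGBoost itself, not in auto-sklearn: with a logistic objective, XGBoost rejects any base_score outside the open interval (0, 1). A minimal sketch that should reproduce the error standalone (assuming an XGBoost build from that era; the data and parameter values here are purely illustrative):

import numpy as np
import xgboost as xgb

X = np.random.rand(100, 5)
y = np.random.randint(0, 2, size=100)
dtrain = xgb.DMatrix(X, label=y)

# base_score=1.0 lies outside (0, 1), which the logistic objective rejects
params = {'objective': 'binary:logistic', 'base_score': 1.0}
xgb.train(params, dtrain, num_boost_round=1)
# -> XGBoostError: base_score must be in (0,1) for logistic loss
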
@mfeurer
Contributor

mfeurer commented Nov 15, 2016

This is related to #163, although I don't know what one can do if boosting fails with such an error. Is this issue reproducible?
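
In the meantime, one possible workaround is to keep the XGBoost component out of the search space, so that refit() never has to retrain it. A sketch, assuming your auto-sklearn version supports the exclude_estimators argument (the component name is taken from the file path in the traceback above):

from autosklearn.classification import AutoSklearnClassifier

# Hypothetical workaround: drop the XGBoost component from the search space
automl = AutoSklearnClassifier(exclude_estimators=['xgradient_boosting'])
automl.fit(x_train, y_train)
automl.refit(x_train, y_train)  # no XGBoost models left to trip the base_score check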

@mfeurer
Contributor

mfeurer commented May 9, 2017

Closing this because XGBoost will be removed from auto-sklearn due to issue #271.

mfeurer closed this as completed May 9, 2017