
Conversation

@topepo (Member) commented Jan 14, 2021

No description provided.

@topepo merged commit 8854d7c into master Jan 15, 2021
@topepo deleted the xgb-objective branch January 15, 2021 16:04
@jaredlander

Does this have something to do with the newish warning message in xgboost about the default evaluation metric for binary:logistic being changed from 'error' to 'logloss' as shown below?

WARNING: amalgamation/../src/learner.cc:1061: Starting in XGBoost 1.3.0, the default evaluation metric used with the objective 'binary:logistic' was changed from 'error' to 'logloss'. Explicitly set eval_metric if you'd like to restore the old behavior.
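
For reference, naming an eval_metric explicitly in params is what the message asks for and makes the warning go away on the xgboost side. A minimal sketch with made-up data (not code from this PR):

library(xgboost)

# toy two-class data, purely for illustration
set.seed(1)
x <- matrix(rnorm(200), ncol = 2)
y <- as.integer(x[, 1] > 0)
dtrain <- xgboost::xgb.DMatrix(data = x, label = y)

fit <- xgboost::xgb.train(
  params = list(
    objective   = "binary:logistic",
    eval_metric = "logloss"  # named explicitly so xgboost does not warn
  ),
  data = dtrain,
  nrounds = 10,
  watchlist = list(training = dtrain),
  verbose = 0
)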

@juliasilge (Member)

No, I'm pretty sure not. We set the objective but I don't think we do anything with the internal evaluation metric for xgboost.

@jaredlander

Looking at the fitted model object after fitting, I see this:

call:
  xgboost::xgb.train(params = list(eta = 0.3, max_depth = 4, gamma = 0, 
    colsample_bytree = 1, min_child_weight = 1, subsample = 0.6, 
    objective = "binary:logistic"), data = x$data, nrounds = 100, 
    watchlist = x$watchlist, verbose = 0, nthread = 1)

It looks like you set the watchlist argument: watchlist = x$watchlist

When the watchlist is set, I think {xgboost} automatically chooses an eval_metric, which in turn causes that warning.
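
If that's the cause, a possible workaround on the user side is to name the metric explicitly through set_engine(); this is a sketch that assumes parsnip forwards engine arguments such as eval_metric on to xgb.train(), with a made-up data frame:

library(parsnip)

# toy two-class data, purely for illustration
set.seed(1)
dat <- data.frame(
  class = factor(rep(c("yes", "no"), 50)),
  x1 = rnorm(100),
  x2 = rnorm(100)
)

spec <- boost_tree(trees = 100, tree_depth = 4) %>%
  set_mode("classification") %>%
  set_engine("xgboost", eval_metric = "logloss")  # passed along to xgb.train()

xgb_fit <- fit(spec, class ~ ., data = dat)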

@github-actions (bot) commented Mar 6, 2021

This pull request has been automatically locked. If you believe you have found a related problem, please file a new issue (with a reprex: https://reprex.tidyverse.org) and link to this issue.

github-actions bot locked and limited conversation to collaborators Mar 6, 2021