Description
I am trying to train an LGBMClassifier with init_score for rare-event prediction (CTR). Close to 99% of the data is labeled 0 and 1% is labeled 1.
From the documentation it is not clear which class init_score is expected to refer to.
If I pass init_score while training the model, LightGBM treats 0 as the positive class and generates raw_score for class 0.
If I do not pass init_score while training the model, LightGBM treats 1 as the positive class and generates raw_score for class 1.
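For reference, a binary objective's init_score is on the raw (log-odds) scale, and negating a raw score swaps which class the resulting probability refers to, which is why a sign flip makes the output look like a score for class 0. A minimal sketch of that relationship (the helper names here are my own, not LightGBM API):

```python
import math

def sigmoid(raw):
    """Convert a raw score (log-odds) to a probability."""
    return 1.0 / (1.0 + math.exp(-raw))

def logit(p):
    """Convert a probability to log-odds; the usual form of init_score
    for a binary objective."""
    return math.log(p / (1.0 - p))

base_rate = 0.01            # ~1% positives, as in the data above
init = logit(base_rate)     # about -4.595

# Negating a raw score yields the probability of the *other* class.
p_pos = sigmoid(init)       # probability of class 1 (~0.01)
p_neg = sigmoid(-init)      # probability of class 0 (~0.99)
```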
Reproducible example
import lightgbm as lgb

params['metric'] = 'binary_logloss'
clf4 = lgb.LGBMClassifier(n_jobs=-1)
clf4.fit(
    X=df[catf + numf],
    y=df.clicks,
    sample_weight=df.weight.values,
    categorical_feature=catf,
    init_score=init_score,
)
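The init_score passed above is built roughly like this (a sketch, with a toy label list standing in for df.clicks): the constant log-odds of the training base rate, one value per training row, which is the conventional raw-score form of init_score for a binary objective.

```python
import math

# Toy stand-in for df.clicks: ~1% positives, as in the real data.
labels = [0] * 99 + [1]

base_rate = sum(labels) / len(labels)

# Constant log-odds baseline, one value per training row.
init_score = [math.log(base_rate / (1.0 - base_rate))] * len(labels)
```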
Environment info
LightGBM version or commit hash: 4.3.0
Is there any way to force LightGBM to generate the raw score for class 1 instead of class 0?