
Init score implementation #1778

Closed
saleepeppe opened this issue Oct 23, 2018 · 8 comments
Comments

@saleepeppe

I am using LightGBM to model a variable which is Poisson distributed.
For this variable I have a bias (offset) given by the logarithm of "e", where "e" is one of the predictors.

If I have understood correctly, I can set the init score to the logarithm of "e" to train LightGBM.
Unfortunately, I cannot use any init score for the predictions, since the predict() method does not accept a Dataset object.

I wonder how this "init score" is implemented, and why there is no need to use it for the predictions?

Thanks in advance

@chivee
Collaborator

chivee commented Oct 24, 2018

Basically, in each round of training LightGBM fits a weak tree model to the residuals of the dataset, and the init_score provides the baseline from which the first round of these residuals is computed.

Practically speaking, `init_score` is used in the scenario where you need to train your model further against some dataset, and you use it to represent the output of the previous model.

@guolinke
Collaborator

For predictions, you should add the initial scores yourself.

@saleepeppe
Author

For predictions, you should add the initial scores yourself.

If I am understanding correctly, the scores I obtain from the predictions of a model trained with init_score would be:
init_score + predictions = final_predictions
Is that correct?
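For the Poisson case from the original question, that arithmetic works out as follows. This is a toy sketch with made-up numbers; `exposure` stands in for the predictor "e", and the raw scores are invented rather than coming from an actual trained model.

```python
import numpy as np

# "e" is the exposure predictor from the question; log(e) is the
# offset that was passed as init_score during training.
exposure = np.array([1.0, 2.0, 0.5])
init_score = np.log(exposure)

# Raw scores as returned by predict(raw_score=True) on a model
# trained with that init_score (values made up for illustration).
raw_score = np.array([0.3, -0.1, 0.8])

# The Poisson objective uses a log link, so the final prediction on
# the response scale is exp(init_score + raw_score), which equals
# exposure * exp(raw_score).
final_predictions = np.exp(init_score + raw_score)
```

Because exp(log(e) + r) = e * exp(r), the offset simply scales the model's prediction by the exposure, which is the usual behavior of an offset in a Poisson regression.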

@guolinke
Collaborator

guolinke commented Oct 25, 2018

Yeah, and you should use the raw_score in prediction, and do your own sigmoid/softmax transform if in binary/multi-class classification.
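For the binary case that means applying the sigmoid only after adding the init_score back. A small sketch with made-up scores (no actual model is trained here):

```python
import numpy as np

def sigmoid(z):
    """Logistic function mapping raw scores to probabilities."""
    return 1.0 / (1.0 + np.exp(-z))

# Made-up values: the init_score used in training, and the raw scores
# returned by predict(raw_score=True) on the resulting model.
init_score = np.array([-0.5, 0.2, 1.0])
raw_score = np.array([0.4, -0.3, 0.1])

# Wrong: sigmoid(raw_score) ignores the baseline the model was
# trained against. Right: transform the combined raw score.
final_prob = sigmoid(init_score + raw_score)
```

The same logic applies to multi-class models with softmax in place of the sigmoid, combining per-class init_scores with per-class raw scores.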

@JYLFamily

Yeah, and you should use the raw_score in prediction, and do your own sigmoid/softmax transform if in binary/multi-class classification.

Sorry, I'd like to ask a bit more about this.
1. Should init_score be the class predicted by the previous learner, or the corresponding class probability?
2. Is final_predictions = sigmoid(LightGBM's raw_score + the previous learner's raw_score)?
Thanks.

@guolinke
Collaborator

guolinke commented Apr 8, 2019

@JYLFamily

  1. init_score is the raw score, before any transformation.
  2. yes

@JYLFamily

@JYLFamily

  1. init_score is the raw score, before any transformation.
  2. yes

One more question: does lgb.predict(raw_score=True) == LightGBM's raw_score + the previous learner's raw_score, or does lgb.predict(raw_score=True) == LightGBM's raw_score only?

Many thanks.

@guolinke
Collaborator

guolinke commented Apr 9, 2019

@JYLFamily The prediction result doesn't include the init_score, you should add it by yourself.

@lock lock bot locked as resolved and limited conversation to collaborators Mar 11, 2020