
sklearn remove warning, there is no verbose_eval #1186

Closed
startakovsky opened this issue Jan 8, 2018 · 13 comments

@startakovsky

Is there any way to remove warnings in the sklearn API? The fit function only takes verbose, which seems only to toggle the display of per-iteration details.

[LightGBM] [Warning] min_data_in_leaf is set=74, min_child_samples=20 will be ignored. Current value: min_data_in_leaf=74.

@StrikerRUS
Collaborator

You are trying to set one parameter twice (or you are using an alias of parameter which shouldn't be passed via **kwargs). Please pay attention to the aliases in docs: https://lightgbm.readthedocs.io/en/latest/Parameters.html#learning-control-parameters

@startakovsky
Author

Thanks @StrikerRUS , you seem to be responding specifically to this warning. Here is another warning: [LightGBM] [Warning] No further splits with positive gain, best gain: -inf.

I am seeing this line, along with others, appear many times during training, and I would like to suppress these warnings completely.

These warnings are generated from the C++ code, and they don't seem to be suppressed by filtering with Python's warnings library.

I have tried setting verbose=-1, but it is not suppressing the warning.

Any advice, hacks, or other ways I might be able to suppress all warnings?

@StrikerRUS
Collaborator

@startakovsky Yeah, I was talking about that concrete warning.

Now ping @guolinke because you've already discussed this with him in #1157.

@startakovsky
Author

thanks @StrikerRUS

@guolinke how do I suppress the C++ generated warnings in the sklearn API?
#1157

@guolinke guolinke closed this as completed Jan 9, 2018
@tnmichael309

import warnings
warnings.filterwarnings('ignore')

@AdamsMi

AdamsMi commented Jun 6, 2018

The filterwarnings hack doesn't work. This warning is really annoying. Can anybody finally tell us how to get rid of this message?

@StrikerRUS
Collaborator

@AdamsMi filtering warnings cannot work in this case because warnings come directly from C++ code. Passing verbose=-1 or verbose_eval=-1 doesn't work?
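Because the messages are printed by native code directly to the process's file descriptors, they bypass Python's warnings machinery entirely. A generic, stdlib-only workaround (a sketch, not something suggested in this thread) is to temporarily redirect the OS-level stdout/stderr descriptors to the null device:

```python
import os
import sys
from contextlib import contextmanager

@contextmanager
def suppress_native_output():
    """Temporarily point the C-level stdout/stderr file descriptors at
    os.devnull, silencing output that bypasses Python's sys streams."""
    sys.stdout.flush()
    sys.stderr.flush()
    saved = [os.dup(fd) for fd in (1, 2)]  # remember the real fds
    devnull = os.open(os.devnull, os.O_WRONLY)
    try:
        os.dup2(devnull, 1)
        os.dup2(devnull, 2)
        yield
    finally:
        os.close(devnull)
        for fd, backup in zip((1, 2), saved):
            os.dup2(backup, fd)  # restore the original descriptors
            os.close(backup)

# Usage: anything written to stdout/stderr inside the block is discarded,
# including output from native extensions.
```

Note this also hides legitimate output (and training logs) emitted while the block is active, so it is a blunt instrument compared with setting verbose=-1.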

@guolinke
Collaborator

guolinke commented Jun 7, 2018

For the sklearn interface, you can set verbose=-1 when defining the model (not in fit).
For the lgb.train interface, you can set verbose=-1 in the param dict.

@tnmichael309

Sadly... I used the warnings filter along with what @guolinke mentioned, and it works.
In my case, the warning came from resetting categorical features.
I'm using a Jupyter notebook on Windows.

@tnmichael309

Maybe because I used a notebook, the warning was not shown there but only in the console.

@marcioribp

I am having the same problem, @tnmichael309.

[LightGBM] [Warning] Met categorical feature which contains sparse values. Consider renumbering to consecutive integers started from zero

Tried everything suggested and nothing works, this warning is haunting me.

Using lgb interface on linux console.

@StrikerRUS
Collaborator

@marcioribp Please refer to https://lightgbm.readthedocs.io/en/latest/Advanced-Topics.html#categorical-feature-support, #1636

Categorical features must be encoded as non-negative integers (int) less than Int32.MaxValue (2147483647). It is best to use a contiguous range of integers started from zero.

#1920 (comment)

If you don't have memory issues, you can ignore this.
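One way to satisfy that requirement is to remap each column's raw codes onto consecutive integers starting at zero. A stdlib-only sketch (renumber_categories is a hypothetical helper, not part of LightGBM):

```python
def renumber_categories(values):
    """Map arbitrary (possibly sparse) category values to consecutive
    non-negative integer codes, in order of first appearance."""
    mapping = {}
    codes = []
    for v in values:
        if v not in mapping:
            mapping[v] = len(mapping)
        codes.append(mapping[v])
    return codes, mapping

# Sparse codes such as 7, 4002, 980000 become the contiguous 0, 1, 2.
codes, mapping = renumber_categories([7, 4002, 7, 980000])
```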

@marcioribp

> @marcioribp Please refer to https://lightgbm.readthedocs.io/en/latest/Advanced-Topics.html#categorical-feature-support, #1636
>
> Categorical features must be encoded as non-negative integers (int) less than Int32.MaxValue (2147483647). It is best to use a contiguous range of integers started from zero.
>
> #1920 (comment)
>
> If you don't have memory issues, you can ignore this.

@StrikerRUS
Thank you very much! As it turns out, I had to convert my category columns to int32, as mentioned in the documentation page.

@lock lock bot locked as resolved and limited conversation to collaborators Mar 10, 2020