
Suppress LightGBM Warning #1157

Closed
startakovsky opened this issue Dec 30, 2017 · 26 comments · Fixed by #1628

@startakovsky

The following line is printed repeatedly, once per training iteration, and it does not seem to be emitted through Python's standard [warnings](https://docs.python.org/3/library/warnings.html) module:

[LightGBM] [Warning] No further splits with positive gain, best gain: -inf

How can I suppress this warning? It is generated here. Also, if possible, can you tell me what this warning means?

@guolinke
Collaborator

It means one of the following:

  1. num_leaves is too large; you can set it to a smaller value
  2. min_data is too large
  3. your data is hard to fit

@startakovsky
Author

Thanks. How do you suppress these warnings while still reporting the validation metrics via verbose_eval?

@guolinke
Collaborator

guolinke commented Jun 7, 2018

For the sklearn interface, you can set verbose=-1 when constructing the model (not in fit()).
For the lgb.train interface, you can set verbose=-1 in the param dict.

@prashant-kikani

prashant-kikani commented Jun 23, 2018

@guolinke what about lgb.cv? Can I suppress this warning in lgb.cv?
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf

@guolinke
Collaborator

I think it can be used in cv as well.

@prashant-kikani

prashant-kikani commented Jun 23, 2018 via email

@guolinke
Collaborator

Set it in the param dict, not in the function arguments.

@hoihui

hoihui commented Jun 28, 2018

Setting 'verbose' or 'verbosity' to -1 in the param dict solves this problem for lgb.train,
but it does not help for lgb.cv, or for lgb.train with continued training (i.e. with init_model).

@guolinke
Collaborator

ping @StrikerRUS

@StrikerRUS
Collaborator

StrikerRUS commented Jun 28, 2018

@guolinke I confirm that with init_model, setting verbose=-1 doesn't work. However, in cv, verbose=-1 works well for me.

UPD:
It seems that the problem persists only when both init_model and valid_sets are specified:

Logs are shown:

import lightgbm as lgb
from sklearn.datasets import load_boston
from sklearn.model_selection import train_test_split

X, y = load_boston(True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.5, random_state=42)
params = {'verbose': -1}

lgb_train = lgb.Dataset(X_train, y_train, params=params, free_raw_data=False)
lgb_eval = lgb.Dataset(X_test, y_test, params=params, free_raw_data=False)
init_gbm = lgb.train(params, lgb_eval)
gbm = lgb.train(params, lgb_train,
                valid_sets=lgb_eval,
                verbose_eval=False,
                init_model=init_gbm)

No logs:

import lightgbm as lgb
from sklearn.datasets import load_boston
from sklearn.model_selection import train_test_split

X, y = load_boston(True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.5, random_state=42)
params = {'verbose': -1}

lgb_train = lgb.Dataset(X_train, y_train, params=params, free_raw_data=False)
lgb_eval = lgb.Dataset(X_test, y_test, params=params, free_raw_data=False)
init_gbm = lgb.train(params, lgb_eval)
gbm = lgb.train(params, lgb_train,
#                 valid_sets=lgb_eval,
                verbose_eval=False,
                init_model=init_gbm)

Also, which instance is passed to valid_sets matters (here it is the same as the training set):
No logs:

import lightgbm as lgb
from sklearn.datasets import load_boston
from sklearn.model_selection import train_test_split

X, y = load_boston(True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.5, random_state=42)
params = {'verbose': -1}

lgb_train = lgb.Dataset(X_train, y_train, params=params, free_raw_data=False)
lgb_eval = lgb.Dataset(X_test, y_test, params=params, free_raw_data=False)
init_gbm = lgb.train(params, lgb_eval)
gbm = lgb.train(params, lgb_train,
                valid_sets=lgb_train,  # <-------
                verbose_eval=False,
                init_model=init_gbm)

@guolinke
Collaborator

guolinke commented Jun 29, 2018

@StrikerRUS
What logs are shown?
Including the training information?

I think we need to pass verbose parameter when creating _InnerPredictor:
https://github.com/Microsoft/LightGBM/blob/master/python-package/lightgbm/engine.py#L113 .

@StrikerRUS
Collaborator

StrikerRUS commented Jun 29, 2018

@guolinke

what logs are shown ?

Only warnings:

[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
... (the same warning repeated, once per iteration; ~100 lines trimmed)

In the example, init_model is a Booster, so this line?
https://github.com/Microsoft/LightGBM/blob/c86fe61b7e1fa15d0cdd8e69beef269660ec88c8/python-package/lightgbm/engine.py#L115

@goldentom42

Hi there, I'm copying the content of issue #1486 here as requested by @StrikerRUS.

Environment info

Operating System: Linux and Windows (can't test on Apple)
C++/Python/R version: latest version (I believe Kaggle uses the latest master branch); it also occurs on 2.1.0

Reproducible examples

https://www.kaggle.com/ogrellier/lighgbm-with-selected-features

The problem only occurs with lgb.train (LGBMClassifier does not exhibit the same issue) and only if the valid_sets argument is provided.

(screenshot of the repeated warning output omitted)

Let me know if you need any further info.
Thanks, Olivier

@goldentom42

Just to recap:

  • Sklearn API works correctly
  • lgb.train has the problem whenever valid_sets is provided, regardless of verbose, verbosity, or verbose_eval

Hope this helps.

@pseudotensor

pseudotensor commented Jul 19, 2018

[LightGBM] [Warning] boosting is set=gbdt, boosting_type=gbdt will be ignored. Current value: boosting=gbdt
[LightGBM] [Warning] num_threads is set=4, nthread=-1 will be ignored. Current value: num_threads=4

With the sklearn API I still get the above kinds of messages during fit, even if verbose=False is passed to fit() and verbose=-1 is passed in the params dict to the constructor.

Note that this happens even though I never pass boosting_type or nthread.

But if I call model.get_params() I see those two extra parameters listed even though I never passed them, so I assume the sklearn API adds them and then LightGBM complains.

@StrikerRUS
Collaborator

@pseudotensor These parameters are regular arguments of the constructor; you should use them instead of their aliases in params.

@pseudotensor

Yes, thanks. It's just a bit odd that these main sklearn parameters are listed as aliases.

@StrikerRUS
Collaborator

StrikerRUS commented Aug 30, 2018

I would really appreciate it if everyone here could test the fix proposed in #1628 (the verbose branch) and report here whether it helps in your case.

@goldentom42 Speaking specifically about your case, passing params={'verbose': -1} to the Dataset constructor (UPD: and removing silent=True) should help even without the fix.

@goldentom42

Sorry, but I'm not sure I understand your statement: should I remove silent=True in the lgb.Dataset() call and set 'verbose': -1 when calling lgb.train?

@StrikerRUS
Collaborator

@goldentom42 Please see the example:

lgb_train = lgb.Dataset(X_train, y_train, params={'verbose': -1}, free_raw_data=False)
lgb_eval = lgb.Dataset(X_test, y_test, params={'verbose': -1}, free_raw_data=False)
gbm = lgb.train({'verbose': -1}, lgb_train,
                valid_sets=lgb_eval,
                verbose_eval=False)

That is, verbose=-1 goes to both the Dataset constructor and the train function.

@goldentom42

Hi @StrikerRUS, I tested LightGBM on Kaggle (they would normally have the latest version) and I don't see the warnings anymore with 'verbose': -1 in params.

On LightGBM 2.1.2, setting verbose to -1 in both the Dataset and the lightgbm params makes the warnings disappear.

Hope this helps.

@StrikerRUS
Collaborator

@goldentom42 Thanks for your reply! It seems that Kaggle uses the latest available release on PyPI, not the master branch:
https://github.com/Kaggle/docker-python/blob/cd1e6ac7d076775af2a5bfbcb65bfd98ea6629de/Dockerfile#L78

We are going to merge the fix into the master branch and close this issue. So, the latest code should work fine with either the silent=True argument or the 'verbose': -1 parameter.
However, feel free to report any cases left uncovered by the fix.

@joseortiz3

joseortiz3 commented Oct 24, 2018

I still get
[LightGBM] [Warning] num_threads is set=1, n_jobs=-1 will be ignored. Current value: num_threads=1
because of conflicting options. I'm not sure why certain parameters override others and warn you about it.

@guolinke
Collaborator

@joseortiz3 We use alphabetical order to resolve parameter conflicts; therefore, num_threads will override n_jobs.

@saksim

saksim commented Nov 30, 2018

Thanks for the messages in this issue.

In a word, can we take it that the model runs fine even though such messages are printed?

@iggisv9t

(screenshot of repeated warning output omitted)
verbose=-1. I also tried everything mentioned above and everything I found in the docs. It's too annoying.

The lock bot locked this issue as resolved and limited the conversation to collaborators on Mar 10, 2020.