
[python-package] verbose does not supress warnings with custom objectives #6014

Open
lbittarello opened this issue Jul 28, 2023 · 9 comments · May be fixed by #6428

@lbittarello

Description

In the Python package of LightGBM 4.0.0, setting the verbose parameter to -1 does not suppress warnings if the objective is a user-defined function.

Reproducible example

import numpy as np
import pandas as pd

from lightgbm import LGBMRegressor

df = pd.DataFrame({"x": [0, 0, 1, 1, 1], "y": [0, 1, 0, 1, 1]})


def l2_obj(y_true, y_pred):
    grad = y_pred - y_true
    hess = np.ones_like(y_pred)
    return grad, hess


estimator = LGBMRegressor(
    objective=l2_obj, min_child_samples=1, n_estimators=1, n_jobs=1, verbose=-1
)

estimator.fit(df[["x"]], df["y"])
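For reference, the custom objective above reproduces the built-in L2 loss: for squared error 0.5 * (y_pred - y_true)**2, the gradient is y_pred - y_true and the Hessian is 1. A quick NumPy sanity check (a standalone sketch using finite differences, independent of LightGBM):

```python
import numpy as np


def l2_obj(y_true, y_pred):
    # Gradient and Hessian of 0.5 * (y_pred - y_true)**2 w.r.t. y_pred
    grad = y_pred - y_true
    hess = np.ones_like(y_pred)
    return grad, hess


y_true = np.array([0.0, 1.0, 0.0, 1.0, 1.0])
y_pred = np.array([0.6, 0.6, 0.6, 0.6, 0.6])
grad, hess = l2_obj(y_true, y_pred)

# Compare the analytic gradient against a central finite difference.
eps = 1e-6
loss = lambda p: 0.5 * np.sum((p - y_true) ** 2)
fd_grad = np.array([
    (loss(y_pred + eps * np.eye(5)[i]) - loss(y_pred - eps * np.eye(5)[i])) / (2 * eps)
    for i in range(5)
])

assert np.allclose(grad, fd_grad, atol=1e-5)
assert np.all(hess == 1.0)
```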

Environment info

LightGBM version or commit hash: 4.0.0

Command(s) you used to install LightGBM

conda install -y lightgbm
@jameslamb jameslamb changed the title verbose does not supress warnings with custom objectives [python-package] verbose does not supress warnings with custom objectives Jul 28, 2023
@celestinoxp

Is this related to pycaret/pycaret#3660? I installed lightgbm via pip...

@GidonKR

GidonKR commented Aug 21, 2023

I've also encountered this issue, and after trying every solution out there
I found that you can create a custom logger, something like this:

import logging

import lightgbm as lgbm


class CustomLogger:
    def __init__(self):
        self.logger = logging.getLogger('lightgbm_custom')
        self.logger.setLevel(logging.ERROR)

    def info(self, message):
        self.logger.info(message)

    def warning(self, message):
        # Suppress warnings by not doing anything
        pass

    def error(self, message):
        self.logger.error(message)


lgbm.register_logger(CustomLogger())

@Tialo

Tialo commented Aug 23, 2023

I also encounter it when using lightgbm.cv, but warnings are suppressed when using lightgbm.train.

@Tialo

Tialo commented Aug 23, 2023

@GidonKR, thanks for the temporary solution!

@jefferythewind

Building on @GidonKR's suggestion, here is some code that worked for me in Python 3.12:

import logging

import lightgbm as lgb


class CustomLogger:
    def __init__(self):
        self.logger = logging.getLogger('lightgbm_custom')
        self.logger.setLevel(logging.ERROR)

    def info(self, message):
        self.logger.info(message)

    def warning(self, message):
        # Suppress warnings by not doing anything
        pass

    def error(self, message):
        self.logger.error(message)


lgb.register_logger(CustomLogger())
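Before registering a logger like this with lgb.register_logger(), you can verify in isolation that it swallows warnings while still passing errors through. This is a standalone check using only the standard library (no LightGBM needed), with a hypothetical capture handler added just for the test:

```python
import logging


class CustomLogger:
    def __init__(self):
        self.logger = logging.getLogger('lightgbm_custom')
        self.logger.setLevel(logging.ERROR)

    def info(self, message):
        self.logger.info(message)

    def warning(self, message):
        pass  # drop warnings entirely

    def error(self, message):
        self.logger.error(message)


# Capture what the underlying logger actually emits.
records = []
handler = logging.Handler()
handler.emit = lambda record: records.append(record.getMessage())
logging.getLogger('lightgbm_custom').addHandler(handler)

log = CustomLogger()
log.info("some info")        # filtered out by the ERROR level
log.warning("some warning")  # dropped by warning()
log.error("some error")      # passes through

print(records)  # ['some error']
```

Only the error message survives, which is exactly the behavior LightGBM sees once the logger is registered.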

@jameslamb
Collaborator

jameslamb commented Apr 25, 2024

Thanks for using LightGBM and for putting the effort into a minimal, reproducible example!

I'm able to reproduce this with the latest lightgbm built from source here (1443548).

import numpy as np
import pandas as pd

from lightgbm import LGBMRegressor

df = pd.DataFrame({"x": [0, 0, 1, 1, 1], "y": [0, 1, 0, 1, 1]})


def l2_obj(y_true, y_pred):
    grad = y_pred - y_true
    hess = np.ones_like(y_pred)
    return grad, hess


params = {
    "min_child_samples": 1,
    "n_estimators": 1,
    "n_jobs": 1,
    "verbose": -1
}

LGBMRegressor(
    **{**params, "objective": l2_obj}
).fit(df[["x"]], df["y"])

# [LightGBM] [Info] Using self-defined objective function
# [LightGBM] [Warning] No further splits with positive gain, best gain: -inf

LGBMRegressor(
    **{**params, "objective": "regression"}
).fit(df[["x"]], df["y"])

# (no logs)

This is definitely a bug. I'll investigate and hopefully put up a fix shortly.

@jameslamb

And I'm very sorry for the delayed response. This project has a very small number of maintainers relative to its popularity, and 0 who work on LightGBM maintenance full-time.

If any of you involved in this thread have feedback on what we could do to make it more likely that you'll investigate such issues and contribute fixes yourself in the future, we'd love to hear it here or over in #6350.

@jameslamb

Looks like this behavior is not specific to the scikit-learn interface, and happens with lgb.train() too.

import numpy as np
import pandas as pd

import lightgbm as lgb

df = pd.DataFrame({"x": [0, 0, 1, 1, 1], "y": [0, 1, 0, 1, 1]})

def l2_obj(y_pred, train_data):
    y_true = train_data.get_label()
    grad = y_pred - y_true
    hess = np.ones_like(y_pred)
    return grad, hess

params = {
    "min_child_samples": 1,
    "n_estimators": 1,
    "n_jobs": 1,
    "verbosity": -1
}

lgb.train(
    params={
        **params,
        "objective": l2_obj
    },
    train_set=lgb.Dataset(df[["x"]], label=df[["y"]])
)

lgb.train(
    params={
        **params,
        "objective": "regression"
    },
    train_set=lgb.Dataset(df[["x"]], label=df[["y"]])
)

@jameslamb

I believe I've found the root cause.

This call in Booster.update() (the method that runs one boosting round)

if not self.__set_objective_to_none:
    self.reset_parameter({"objective": "none"}).__set_objective_to_none = True

calls LGBM_BoosterResetParameter() in the C API.

LightGBM/src/c_api.cpp

Lines 2035 to 2040 in 1443548

int LGBM_BoosterResetParameter(BoosterHandle handle, const char* parameters) {
  API_BEGIN();
  Booster* ref_booster = reinterpret_cast<Booster*>(handle);
  ref_booster->ResetConfig(parameters);
  API_END();
}

That eventually calls Config::SetVerbosity() on the C++ side... which resets the log level back to Info if neither "verbose" nor "verbosity" is found in the parameters passed to it.

void Config::SetVerbosity(const std::unordered_map<std::string, std::vector<std::string>>& params) {
  int verbosity = Config().verbosity;
  GetFirstValueAsInt(params, "verbose", &verbosity);
  GetFirstValueAsInt(params, "verbosity", &verbosity);
  if (verbosity < 0) {
    LightGBM::Log::ResetLogLevel(LightGBM::LogLevel::Fatal);
  } else if (verbosity == 0) {
    LightGBM::Log::ResetLogLevel(LightGBM::LogLevel::Warning);
  } else if (verbosity == 1) {
    LightGBM::Log::ResetLogLevel(LightGBM::LogLevel::Info);
  } else {
    LightGBM::Log::ResetLogLevel(LightGBM::LogLevel::Debug);
  }
}
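To make the failure mode concrete, here is a small pure-Python model of SetVerbosity's lookup logic (an illustrative sketch, not LightGBM's actual code): when Booster.update() re-resets config with only {"objective": "none"} and no verbosity key, the level silently snaps back to the Config default of 1 (Info), and warnings reappear.

```python
# Illustrative Python model of Config::SetVerbosity (not LightGBM source).
DEFAULT_VERBOSITY = 1  # Config().verbosity defaults to 1, i.e. Info


def set_verbosity(params):
    # Mirror the C++: start from the default, then let "verbose" and
    # finally "verbosity" override it if present.
    verbosity = DEFAULT_VERBOSITY
    if "verbose" in params:
        verbosity = int(params["verbose"])
    if "verbosity" in params:
        verbosity = int(params["verbosity"])

    if verbosity < 0:
        return "Fatal"
    elif verbosity == 0:
        return "Warning"
    elif verbosity == 1:
        return "Info"
    else:
        return "Debug"


# The user's training params suppress everything below Fatal...
print(set_verbosity({"verbosity": -1, "objective": "none"}))  # Fatal

# ...but the reset_parameter() call during update() passes only the
# objective, so the log level falls back to Info and warnings return.
print(set_verbosity({"objective": "none"}))  # Info
```

This is why the bug only shows up with custom objectives: only that code path calls reset_parameter() with a params dict that omits the verbosity keys.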
