
[python-package] Expose ObjectiveFunction class #6586

Open
wants to merge 18 commits into base: master
Make test single initialization
Atanas Dimitrov committed Sep 2, 2024
commit dd8b6924c9c0fb94f5a7f14ce4ac63a4183c76f4
15 changes: 14 additions & 1 deletion tests/python_package_test/test_engine.py
@@ -24,7 +24,6 @@

from .utils import (
SERIALIZERS,
builtin_objective,
dummy_obj,
load_breast_cancer,
load_digits,
@@ -4411,6 +4410,20 @@ def test_quantized_training():
)
@pytest.mark.skipif(getenv("TASK", "") == "cuda", reason="Skip due to ObjectiveFunction not exposed for cuda devices.")
Collaborator
Why couldn't this also be exposed for the CUDA implementation?

Contributor Author

It segfaults in the CI tests, and I cannot build the CUDA version on macOS.

Collaborator

Where exactly does it segfault? 🤔 It seems like this should work 😅

def test_objective_function_class(use_weight, num_boost_round, custom_objective, objective_name, df, num_class):
    def builtin_objective(name, params):
        fobj = lgb.ObjectiveFunction(name, params)

        def loss(y_pred, dtrain):
            # Initialize the objective only once, on the first call.
            if fobj.num_data is None:
                fobj.init(dtrain)
            (grad, hess) = fobj.get_gradients(y_pred)
            if fobj.num_class != 1:
                grad = grad.reshape((fobj.num_class, -1)).transpose()
                hess = hess.reshape((fobj.num_class, -1)).transpose()
            return (grad, hess)

        return loss

    X, y = df
    rng = np.random.default_rng()
    weight = rng.choice([1, 2], y.shape) if use_weight else None
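The single-initialization pattern in the test above can be sketched without LightGBM itself. The sketch below uses a hypothetical `FakeObjective` stand-in (not part of the LightGBM API) to show why caching the initialized objective in the closure means `init` runs once across all boosting rounds rather than once per round:

```python
class FakeObjective:
    """Hypothetical stand-in for lgb.ObjectiveFunction; counts init calls."""

    def __init__(self):
        self.num_data = None
        self.init_calls = 0

    def init(self, dtrain):
        self.num_data = len(dtrain)
        self.init_calls += 1

    def get_gradients(self, y_pred):
        # Dummy gradients/hessians of the right length.
        return [0.0] * self.num_data, [1.0] * self.num_data


def make_loss(fobj):
    # Mirrors the test's closure: init only when num_data is still unset.
    def loss(y_pred, dtrain):
        if fobj.num_data is None:
            fobj.init(dtrain)
        return fobj.get_gradients(y_pred)

    return loss


fobj = FakeObjective()
loss = make_loss(fobj)
for _ in range(5):  # simulate 5 boosting rounds
    grad, hess = loss([0.0, 0.0, 0.0], [1, 2, 3])
print(fobj.init_calls)  # → 1: initialized once, not once per round
```

The earlier `utils.py` helper (deleted in this commit) constructed and initialized a fresh objective on every call, which this commit's closure avoids.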
15 changes: 0 additions & 15 deletions tests/python_package_test/utils.py
@@ -168,21 +168,6 @@ def multiclass_custom_objective(y_pred, ds):
return grad, hess


def builtin_objective(name, params):
    """Mimics the builtin objective functions to mock training."""

    def wrapper(y_pred, dtrain):
        fobj = lgb.ObjectiveFunction(name, params)
        fobj.init(dtrain)
        (grad, hess) = fobj.get_gradients(y_pred)
        if fobj.num_class != 1:
            grad = grad.reshape((fobj.num_class, -1)).transpose()
            hess = hess.reshape((fobj.num_class, -1)).transpose()
        return (grad, hess)

    return wrapper


def pickle_obj(obj, filepath, serializer):
    if serializer == "pickle":
        with open(filepath, "wb") as f: