
Conversation

@datvo06 datvo06 commented Oct 26, 2025

Fix #381

@datvo06 datvo06 requested a review from jfeser October 26, 2025 23:01
@jfeser jfeser (Contributor) left a comment

This looks mostly good! Two comments:

  1. llm_request should have essentially the same interface as client.responses.create (see the sketch after this list):
@defop
def llm_request(client: openai.OpenAI, *args, **kwargs) -> Any:
    """Low-level LLM request. Handlers may log/modify requests and delegate via fwd()."""
    return client.responses.create(*args, **kwargs)
  2. The notebook needs to be rerun since it has an exception.
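
For reference, a handler can then intercept llm_request and delegate to the default implementation. A minimal sketch, assuming the llm_request defop above and effectful's handler/fwd semantics; the dict-style interpretation and the import path are assumptions, not code from this PR:

from effectful.ops.semantics import fwd, handler  # assumed import path

def logged_llm_request(client, *args, **kwargs):
    # Inspect/log the outgoing request, then delegate to the
    # default implementation via fwd().
    print("llm_request kwargs:", kwargs)
    return fwd()

with handler({llm_request: logged_llm_request}):
    ...  # llm_request calls made here are logged before delegation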

@jfeser jfeser (Contributor) commented Oct 26, 2025

You don't have to add it in this PR, but I'd be happy to have an LLM logging handler that exposes a Python logger and logs llm_request and tool_call.

@datvo06 datvo06 (Author) commented Oct 26, 2025

> You don't have to add it in this PR, but I'd be happy to have an LLM logging handler that exposes a Python logger and logs llm_request and tool_call.

I see, that makes sense. I'll make a short attempt; if it turns out to be too hard, I'll revert.

@datvo06 datvo06 (Author) commented Oct 27, 2025

I added the handler that exposes a Python logger:

import logging
import sys

# 1. Create a logger
logger = logging.getLogger("effectful.llm")
logger.setLevel(logging.INFO)
h = logging.StreamHandler(sys.stdout)
h.setLevel(logging.INFO)
h.setFormatter(logging.Formatter("%(levelname)s %(name)s %(message)s %(payload)s"))
logger.addHandler(h)

# 2. Pass it to the handler
llm_logger = LLMLoggingHandler(logger=logger)

# Avoid cache for demonstration
try:
    haiku.cache_clear()
except Exception:
    pass

with handler(provider), handler(llm_logger):
    _ = haiku("fish3")
    _ = limerick("fish4")
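
One caveat with the "%(payload)s" format above: only records that actually carry a payload attribute will format cleanly, so the handler presumably attaches it through logging's standard extra mechanism. A hypothetical example of such a record:

# Hypothetical: attach the custom "payload" field the formatter expects.
logger.info("llm_request", extra={"payload": {"model": "gpt-4o-mini"}})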

@datvo06 datvo06 requested a review from jfeser October 27, 2025 00:03
Review comment on the LLMLoggingHandler constructor parameters:

*,
logger: logging.Logger | None = None,
logger_name: str = "effectful.llm",
level: int = logging.INFO,
@jfeser jfeser (Contributor) commented

This shouldn't be a configuration parameter. Instead the logger should be exposed as an attribute.
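
In other words, roughly this shape; a sketch of the suggested design, not the PR's actual code:

import logging

class LLMLoggingHandler:
    def __init__(self) -> None:
        # Sketch: expose the logger as a plain attribute rather than
        # taking logger_name/level configuration parameters.
        self.logger = logging.getLogger("effectful.llm")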

@jfeser jfeser (Contributor) commented

Still outstanding.

@datvo06 datvo06 requested a review from jfeser October 27, 2025 15:06
Review comment on the same constructor parameters:

*,
logger: logging.Logger | None = None,
logger_name: str = "effectful.llm",
level: int = logging.INFO,
@jfeser jfeser (Contributor) commented

Still outstanding.

@datvo06 datvo06 requested a review from jfeser October 27, 2025 17:26
@jfeser jfeser linked an issue Oct 27, 2025 that may be closed by this pull request: "Add logging handler for LLM api provider"
@jfeser jfeser (Contributor) left a comment

Happy to merge once you address outstanding comments.

@datvo06 datvo06 (Author) commented Oct 28, 2025

Updated: LLMLoggingHandler now takes only one optional parameter, logger.
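
With that change, further configuration happens on the exposed attribute; hypothetical usage:

llm_logger = LLMLoggingHandler()            # or LLMLoggingHandler(logger=my_logger)
llm_logger.logger.setLevel(logging.DEBUG)   # configure via the exposed attribute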

@jfeser jfeser merged commit 54efb77 into staging-llm Oct 28, 2025
5 checks passed
@jfeser jfeser deleted the dev_log_llm branch October 28, 2025 15:46
@jfeser jfeser added this to the LLM Infrastructure milestone Nov 3, 2025