
Conversation


@enyst enyst commented Nov 13, 2025

I'm OpenHands-GPT-5, proposing a focused fix to ensure LLM completion logs are routed to the correct client folders.

Summary

  • The original PR streams LLM completion logs, but the emitted event does not include usage_id. On the client, RemoteConversation keeps a usage_id -> folder map, so a missing usage_id forces a fallback to the first folder and breaks the intended per-usage_id directory layout.

What this PR changes

  • Adds usage_id to LLMCompletionLogEvent
  • In the agent server, captures llm.usage_id when setting the telemetry callback and includes it in the emitted event
  • On the client, uses event.usage_id to pick the correct log_completions_folder, with a safe fallback for robustness

Rationale

  • We have direct access to llm.usage_id in EventService where callbacks are registered, so including it in the event is simple and reliable
  • Avoids guessing from model_name (which is ambiguous when multiple LLMs share the same model string)
  • Preserves the user’s original intent to store logs per usage_id

Backward compatibility

  • LLMCompletionLogEvent was introduced in the base PR; adding a field is backward-compatible within this change set and only affects the new streaming path

Implementation notes

  • Server-side: closure capture of usage_id in _setup_llm_log_streaming via make_log_callback(usage_id) keeps Telemetry generic and unchanged
  • Client-side: _create_llm_completion_log_callback now selects the folder by usage_id first, then falls back to the first configured directory if needed
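
Both sides of the change can be sketched roughly as below. `make_log_callback` matches the name mentioned above, but `emit`, `folders_by_usage_id`, and the event-dict shape are assumptions for illustration, not the SDK's actual API.

```python
# Server side: capture usage_id by closure when the telemetry callback is
# registered, so Telemetry itself stays generic and unchanged.
def make_log_callback(usage_id, emit):
    def on_completion_logged(model_name, log_data):
        emit({"model_name": model_name,
              "log_data": log_data,
              "usage_id": usage_id})
    return on_completion_logged


# Client side: select the folder by usage_id first, then fall back to the
# first configured directory when the id is missing or unknown.
def select_folder(event, folders_by_usage_id):
    usage_id = event.get("usage_id")
    if usage_id in folders_by_usage_id:
        return folders_by_usage_id[usage_id]
    return next(iter(folders_by_usage_id.values()))
```

The closure avoids threading usage_id through Telemetry's interface, and the fallback keeps the client robust against events from servers that do not yet emit the field.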

Tests and quality

  • Pre-commit hooks (ruff fmt/lint, pycodestyle, pyright) pass on changed files
  • The change is minimal and focused; no unrelated edits

@enyst Could you please take a look? This should address the concern about identifying the correct log directory per usage_id while keeping the streaming approach intact.


…correct client folder

- Add usage_id to LLMCompletionLogEvent
- Capture llm.usage_id when registering telemetry callback in EventService
- Use event.usage_id in RemoteConversation to select log_completions_folder, with a safe fallback

Co-authored-by: openhands <openhands@all-hands.dev>
@github-actions

Coverage Report

| File | Stmts | Miss | Cover | Missing |
|------|------:|-----:|------:|---------|
| openhands-sdk/openhands/sdk/conversation/impl/remote_conversation.py | 405 | 280 | 30% | 50–62, 69–72, 92–97, 100–104, 107–111, 114–116, 118–121, 124–127, 130–131, 133–145, 148–153, 170–174, 176, 180, 182–183, 185–188, 190, 196, 198, 200–202, 204–206, 210, 212–215, 219, 224–225, 227, 230, 239–240, 243–244, 257–259, 262–263, 267, 269–270, 273, 276–278, 282, 284, 286–288, 291–293, 298–300, 302, 307, 312, 317–320, 323, 332, 340–343, 346, 351–354, 356, 361–362, 367–371, 376–380, 385–388, 391, 396–398, 402–403, 407, 411, 414, 459–464, 467–469, 471, 473–474, 486, 489, 491–493, 496, 499, 501, 504, 507–508, 511–512, 515–517, 520, 522, 524–526, 530, 532–533, 536, 539, 542, 548, 551, 553–554, 556, 560, 562–564, 569, 572–573, 576–577, 579, 581, 584, 586, 588, 591–593, 595–597, 599, 603, 608, 612, 618, 625–627, 630, 635–637, 645–646, 653, 655–659, 662–663, 672–673, 682, 690, 695–697, 699, 702, 704–705, 722, 729, 735–736, 745, 747–751, 753, 756–759 |
| openhands-sdk/openhands/sdk/event/llm_completion_log.py | 11 | 1 | 90% | 36 |
| TOTAL | 6799 | 4309 | 36% | |

@enyst enyst marked this pull request as ready for review November 13, 2025 22:32
@xingyaoww xingyaoww merged commit b3576a5 into openhands/stream-llm-completion-logs Nov 14, 2025
6 checks passed
@xingyaoww xingyaoww deleted the fix/stream-llm-logs-include-usage-id branch November 14, 2025 16:54
