Inject native LLM client into session service #27065

Draft

kitlangton wants to merge 1 commit into llm-native-runtime-openai from llm-native-inject-client

Conversation


kitlangton (Contributor) commented May 12, 2026

Summary

  • Make opencode's LLM service consume an injectable native LLMClient.Service.
  • Keep production defaults by wiring LLMClient.layer with RequestExecutor.defaultLayer in LLM.defaultLayer (see the wiring sketch below).
  • Add a native runtime test that injects a request executor, scripts a tool call, and verifies the tool receives the native call id.
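For orientation, here is a minimal sketch of the injection seam this change describes. The layer/service vocabulary suggests an Effect-style design; the tag names, request/response shapes, and layer compositions below are illustrative assumptions, not opencode's actual code.

```ts
import { Context, Effect, Layer } from "effect"

// Placeholder request/response shapes for this sketch only.
interface NativeRequest {
  readonly model: string
  readonly prompt: string
}
interface NativeResponse {
  readonly text: string
  readonly toolCallId?: string
}

// The executor is the seam a test can replace with a scripted double.
class RequestExecutor extends Context.Tag("app/RequestExecutor")<
  RequestExecutor,
  { readonly execute: (req: NativeRequest) => Effect.Effect<NativeResponse> }
>() {}

// Production default: a stand-in for the real network executor.
const requestExecutorDefaultLayer = Layer.succeed(RequestExecutor, {
  execute: (req) => Effect.succeed({ text: `response for ${req.model}` }),
})

// The native LLM client depends only on the executor it is given.
class LLMClient extends Context.Tag("app/LLMClient")<
  LLMClient,
  { readonly complete: (req: NativeRequest) => Effect.Effect<NativeResponse> }
>() {}

const llmClientLayer = Layer.effect(
  LLMClient,
  Effect.gen(function* () {
    const executor = yield* RequestExecutor
    return { complete: (req) => executor.execute(req) }
  }),
)

// Production wiring: the client layer provided with the default executor,
// mirroring how this PR composes LLMClient.layer with
// RequestExecutor.defaultLayer inside LLM.defaultLayer.
const llmDefaultLayer = Layer.provide(llmClientLayer, requestExecutorDefaultLayer)
```

The point of the split is that production code keeps using `llmDefaultLayer` unchanged, while tests substitute a scripted `RequestExecutor` layer without touching the client.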

Verification

  • cd packages/opencode && bun typecheck
  • cd packages/opencode && bun run test -- test/session/llm.test.ts --test-name-pattern "injected native|native runtime" (a sketch of this test's shape follows the list)
  • Pre-push hook: bun turbo typecheck from the repository root.
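The test the second command runs is described in the Summary as injecting a request executor that scripts a tool call. A hedged sketch of what such a test might look like, reusing the illustrative RequestExecutor, LLMClient, and llmClientLayer definitions from the wiring sketch above (the real test in test/session/llm.test.ts will differ):

```ts
import { describe, expect, test } from "bun:test"
import { Effect, Layer } from "effect"

// Scripted executor: every request is answered with a tool call that
// carries a known native call id.
const scriptedExecutorLayer = Layer.succeed(RequestExecutor, {
  execute: () => Effect.succeed({ text: "", toolCallId: "call_native_123" }),
})

describe("native runtime", () => {
  test("injected native executor forwards the native call id", async () => {
    const program = Effect.gen(function* () {
      const client = yield* LLMClient
      return yield* client.complete({ model: "native-test", prompt: "use the tool" })
    })

    const response = await Effect.runPromise(
      program.pipe(
        // Swap the production executor for the scripted one.
        Effect.provide(Layer.provide(llmClientLayer, scriptedExecutorLayer)),
      ),
    )

    // The tool handler should observe the scripted native call id.
    expect(response.toolCallId).toBe("call_native_123")
  })
})
```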

Stack

  1. Consume native LLM events in session processing #26639
  2. Add native LLM request adapter #26941
  3. Compile native LLM requests in session tests #26946
  4. Add native OpenAI runtime opt-in #26947
  5. Inject native LLM client into session service #27065 👈 current
