
feat(llm_client): unified client package + state slot + s06 bridge #43

Merged
CocoRoF merged 1 commit into main from feat/llm-client-package on Apr 21, 2026
Conversation

CocoRoF (Owner) commented on Apr 21, 2026

Replaces #39 (auto-closed when its base branch was deleted on merge). Rebased onto main.

Summary

  • New geny_executor/llm_client/ package: BaseClient ABC, ClientCapabilities, ClientRegistry lazy factory, AnthropicClient / OpenAIClient / GoogleClient / VLLMClient (see the registry sketch after this list).
  • PipelineState.llm_client slot + Pipeline.attach_runtime(llm_client=…).
  • ProviderBackedClient bridge so legacy APIProvider code keeps working.
  • Canonical APIRequest / APIResponse / ContentBlock moved to llm_client/types.py; stages/s06_api/types.py kept as a shim.
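
For reference, a minimal, self-contained sketch of the lazy-factory registry pattern described above. This is illustrative only: the class name comes from this PR, but the method names, factory signatures, and constructor arguments are assumptions, not the package's actual API.

```python
# Illustrative sketch of a lazy client registry -- not the actual geny_executor code.
from typing import Callable, Dict


class ClientRegistry:
    """Maps provider names to factory callables; provider SDKs are imported lazily."""

    _factories: Dict[str, Callable[..., object]] = {}

    @classmethod
    def register(cls, name: str, factory: Callable[..., object]) -> None:
        cls._factories[name] = factory

    @classmethod
    def create(cls, name: str, **kwargs) -> object:
        try:
            factory = cls._factories[name]
        except KeyError:
            raise ValueError(f"unknown provider: {name!r}") from None
        return factory(**kwargs)  # heavy provider imports happen only here


# Factories defer expensive imports until a client is actually requested.
def _make_openai_client(**kwargs):
    # from geny_executor.llm_client.openai import OpenAIClient  # deferred in real code
    return ("OpenAIClient", kwargs)  # stand-in object for the example


ClientRegistry.register("openai", _make_openai_client)
print(ClientRegistry.create("openai", base_url="http://localhost:8000/v1"))
```

The same registration path would serve the vllm entry, since VLLMClient reuses the OpenAI-compatible client but requires an explicit base_url.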

Test plan

  • Full suite green (1086 passed, 18 skipped at PR-3 time).

Original PR: #39. Part of cycle 20260421_4 (plan 03).

🤖 Generated with Claude Code

Scaffold geny_executor.llm_client/ as the canonical LLM client
package for the 16-stage pipeline, independent of stages/s06_api.

- BaseClient (ABC) with capability-based feature filtering and
  create_message / create_message_stream APIs.
- ClientCapabilities (frozen dataclass) with 7 capability flags +
  drops tuple; unsupported fields are silently dropped and emitted
  as llm_client.feature_unsupported events on an optional event_sink
  (see the filtering sketch after this list).
- ClientRegistry with lazy factories for anthropic / openai / google
  / vllm. VLLMClient inherits from OpenAIClient and enforces base_url.
- ProviderBackedClient bridge wraps s06_api APIProvider so existing
  pipelines keep working during the PR-3 -> PR-4 transition (bridge
  sketched after this list).
- Canonical APIRequest / APIResponse / ContentBlock types live in
  llm_client/types.py; stages/s06_api/types.py is now a re-export
  shim (import path stability).
- translators/__init__.py re-exports translate helpers from
  stages/s06_api/_translate during the bridge (PR-4 inverts this).
- PipelineState.llm_client: Optional[Any] slot added.
- Pipeline.attach_runtime(llm_client=...) kwarg + _resolve_llm_client
  auto-bridge that wraps the s06_api provider via ProviderBackedClient
  when no explicit client is attached.
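
A self-contained sketch of the capability-filtering behaviour described above. Only the flags-plus-drops idea and the llm_client.feature_unsupported event name come from this commit; the specific flag names, event payload shape, and helper function are assumptions for illustration.

```python
# Illustrative sketch of capability-based request filtering -- not the package code.
from dataclasses import dataclass
from typing import Callable, Dict, Optional


@dataclass(frozen=True)
class ClientCapabilities:
    # Two example flags; the real class carries 7 flags plus a drops tuple.
    supports_tools: bool = True
    supports_thinking: bool = False
    drops: tuple = ()  # request fields this client always strips


def filter_request(request: Dict,
                   caps: ClientCapabilities,
                   event_sink: Optional[Callable[[Dict], None]] = None) -> Dict:
    """Silently drop unsupported fields, reporting each drop to the event sink."""
    filtered = dict(request)
    unsupported = set(caps.drops)
    if not caps.supports_tools:
        unsupported.add("tools")
    if not caps.supports_thinking:
        unsupported.add("thinking")
    for field in unsupported & filtered.keys():
        filtered.pop(field)
        if event_sink is not None:
            event_sink({"event": "llm_client.feature_unsupported", "field": field})
    return filtered


events = []
caps = ClientCapabilities(supports_thinking=False, drops=("top_k",))
print(filter_request({"messages": [], "thinking": True, "top_k": 5}, caps, events.append))
print(events)  # one feature_unsupported event per dropped field
```

And a rough sketch of the bridge and auto-resolution behaviour. The class names and the fallback rule are from this commit; the method names on the legacy provider and the resolver signature are assumptions.

```python
# Illustrative sketch of the legacy-provider bridge -- shapes assumed, not actual code.
class ProviderBackedClient:
    """Adapts a legacy s06_api-style provider to the new client interface."""

    def __init__(self, provider):
        self._provider = provider

    def create_message(self, request):
        # Delegate to whatever call the legacy provider exposes (name assumed).
        return self._provider.send(request)


def resolve_llm_client(state, provider):
    """Mirror of the auto-bridge: prefer an attached client, else wrap the provider."""
    if getattr(state, "llm_client", None) is not None:
        return state.llm_client
    return ProviderBackedClient(provider)
```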

Tests:
- tests/unit/test_llm_client_base.py (7)
- tests/unit/test_llm_client_registry.py (7)
- tests/unit/test_llm_client_state.py (5)

Full suite: 1086 passed, 18 skipped. No regressions.

Co-Authored-By: Claude Opus 4.7 <noreply@anthropic.com>
CocoRoF merged commit 0bdcf13 into main on Apr 21, 2026
7 checks passed
CocoRoF deleted the feat/llm-client-package branch on Apr 21, 2026 09:25
CocoRoF mentioned this pull request on Apr 21, 2026 (3 tasks)