feat(llm_client): unified client package + state slot + s06 bridge #43
Merged
Scaffold `geny_executor.llm_client/` as the canonical LLM client package for the 16-stage pipeline, independent of `stages/s06_api`.

- `BaseClient` (ABC) with capability-based feature filtering and `create_message` / `create_message_stream` APIs.
- `ClientCapabilities` (frozen dataclass) with 7 capability flags plus a `drops` tuple; unsupported fields are silently dropped and emitted as `llm_client.feature_unsupported` events on an optional `event_sink`.
- `ClientRegistry` with lazy factories for anthropic / openai / google / vllm. `VLLMClient` inherits from `OpenAIClient` and enforces `base_url`.
- `ProviderBackedClient` bridge wraps the s06_api `APIProvider` so existing pipelines keep working during the PR-3 -> PR-4 transition.
- Canonical `APIRequest` / `APIResponse` / `ContentBlock` types live in `llm_client/types.py`; `stages/s06_api/types.py` is now a re-export shim (import-path stability).
- `translators/__init__.py` re-exports translate helpers from `stages/s06_api/_translate` during the bridge (PR-4 inverts this).
- `PipelineState.llm_client: Optional[Any]` slot added.
- `Pipeline.attach_runtime(llm_client=...)` kwarg plus a `_resolve_llm_client` auto-bridge that wraps the s06_api provider via `ProviderBackedClient` when no explicit client is attached.

Tests:

- tests/unit/test_llm_client_base.py (7)
- tests/unit/test_llm_client_registry.py (7)
- tests/unit/test_llm_client_state.py (5)

Full suite: 1086 passed, 18 skipped. No regressions.

Co-Authored-By: Claude Opus 4.7 <noreply@anthropic.com>
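A minimal sketch of the capability-based feature filtering described above. The flag names, the helper `filter_request`, and the event payload shape are illustrative assumptions; only `ClientCapabilities`, the `drops` tuple, the `llm_client.feature_unsupported` event name, and the `event_sink` hook come from the PR text.

```python
from dataclasses import dataclass
from typing import Any, Callable, Optional

@dataclass(frozen=True)
class ClientCapabilities:
    # Seven capability flags; the specific names here are assumptions.
    streaming: bool = True
    tools: bool = False
    vision: bool = False
    system_prompt: bool = True
    json_mode: bool = False
    stop_sequences: bool = True
    temperature: bool = True
    # Request fields this client silently drops rather than rejecting.
    drops: tuple[str, ...] = ()

def filter_request(
    request: dict[str, Any],
    caps: ClientCapabilities,
    event_sink: Optional[Callable[[dict[str, Any]], None]] = None,
) -> dict[str, Any]:
    """Drop unsupported fields, emitting one event per dropped field."""
    filtered: dict[str, Any] = {}
    for key, value in request.items():
        if key in caps.drops:
            if event_sink is not None:
                event_sink({"event": "llm_client.feature_unsupported", "field": key})
            continue  # silently dropped, never forwarded to the provider
        filtered[key] = value
    return filtered
```

The frozen dataclass keeps a client's capability profile immutable after construction, so the same instance can be shared safely across requests.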
Replaces #39 (auto-closed when its base branch was deleted on merge). Rebased onto main.
Summary
- `geny_executor/llm_client/` package: `BaseClient` ABC, `ClientCapabilities`, `ClientRegistry` lazy factory, `AnthropicClient` / `OpenAIClient` / `GoogleClient` / `VLLMClient`.
- `PipelineState.llm_client` slot + `Pipeline.attach_runtime(llm_client=…)`.
- `ProviderBackedClient` bridge so legacy `APIProvider` code keeps working.
- `APIRequest` / `APIResponse` / `ContentBlock` moved to `llm_client/types.py`; `stages/s06_api/types.py` kept as a shim.

Test plan
Original PR: #39. Part of cycle 20260421_4 (plan 03).
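The bridge-and-resolve behavior described in this PR could look roughly like the following sketch. The `send` method on the provider, the `_s06_provider` attribute, and the internals of `_resolve_llm_client` are assumptions; only the class and method names `ProviderBackedClient`, `attach_runtime(llm_client=...)`, and `_resolve_llm_client` appear in the PR text.

```python
from typing import Any, Optional

class ProviderBackedClient:
    """Bridge: exposes a legacy s06_api APIProvider behind the new client API."""

    def __init__(self, provider: Any) -> None:
        self._provider = provider

    def create_message(self, request: dict) -> dict:
        # Delegate to the legacy provider. A real bridge would also translate
        # between the canonical llm_client types and provider-specific ones.
        return self._provider.send(request)

class Pipeline:
    def __init__(self) -> None:
        self.llm_client: Optional[Any] = None
        self._s06_provider: Optional[Any] = None  # assumed legacy provider slot

    def attach_runtime(self, llm_client: Optional[Any] = None) -> None:
        self.llm_client = llm_client

    def _resolve_llm_client(self) -> Any:
        # Prefer an explicitly attached client; otherwise auto-bridge the
        # legacy s06_api provider so existing pipelines keep working.
        if self.llm_client is not None:
            return self.llm_client
        return ProviderBackedClient(self._s06_provider)
```

Keeping the auto-bridge inside `_resolve_llm_client` means callers never need to know whether they are talking to a new-style client or a wrapped legacy provider.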
🤖 Generated with Claude Code