Merged
Two bugs prevented OpenAI models from working through the llmspy gateway:
1. The ConfigMap providers.json had `"npm": "openai"` but the OpenAI provider
registers with `sdk = "@ai-sdk/openai"`. The mismatch caused create_provider()
to return None, so the provider was never added to g_handlers and got
auto-disabled at serve time.
2. OpenClaw sends `stream_options` in chat completion requests. llmspy forces
`stream=false` (it collects the full response and re-chunks if needed) but
didn't strip `stream_options`. OpenAI rejects the combination with
"stream_options is only allowed when stream is enabled". The init container
now patches main.py to add `chat.pop("stream_options", None)` after the
stream override, with PYTHONPATH loading the patched module.
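The first bug reduces to a failed string match. A minimal Python sketch of the lookup (names modeled on llmspy's `create_provider()`; the real implementation differs):

```python
class OpenAiProvider:
    """Stand-in for llmspy's OpenAI provider class."""
    sdk = "@ai-sdk/openai"

def create_provider(entry, provider_types):
    """Match a providers.json entry's "npm" field against each provider
    class's `sdk` attribute; return an instance on the first match."""
    for provider_type in provider_types:
        if entry.get("npm") == provider_type.sdk:
            return provider_type()
    # No match: the caller never adds a handler to g_handlers, so the
    # provider is auto-disabled at serve time.
    return None
```

With the ConfigMap's `"npm": "openai"`, the loop finds no match and the provider silently disappears; restoring `"@ai-sdk/openai"` makes the lookup succeed.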
Also bumps the llmspy image from 3.0.33-obol.2 to 3.0.34-obol.1.
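The second fix amounts to one extra line after the stream override. A sketch of the patched behavior (the function name here is illustrative, not llmspy's):

```python
def sanitize_chat_request(chat: dict) -> dict:
    """Force non-streaming upstream and drop stream-only options.

    llmspy collects the full upstream response and re-chunks it for
    streaming clients, so `stream` must be false upstream. OpenAI rejects
    `stream_options` whenever `stream` is not true, so it must be removed.
    """
    chat["stream"] = False
    chat.pop("stream_options", None)  # the line the init container patches in
    return chat
```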
Integration test improvements:
- Remove max_tokens parameter (gpt-5.2 requires max_completion_tokens)
- Add requireLLMSpyProvider() to skip tests when provider is auto-disabled
- Add error pattern detection for upstream errors wrapped in 200 responses
- Add Google and Z.AI inference tests
- Add response body logging for diagnostics
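The error pattern detection catches upstream failures that the gateway wraps in an HTTP 200. A Python sketch of the idea (the patterns listed are illustrative; the real test suite's list may differ):

```python
import re

# Illustrative patterns for upstream errors hidden inside 200 responses.
UPSTREAM_ERROR_PATTERNS = [
    re.compile(r'"error"\s*:'),
    re.compile(r"stream_options is only allowed when stream is enabled"),
    re.compile(r"max_tokens is not supported", re.IGNORECASE),
]

def looks_like_upstream_error(body: str) -> bool:
    """Return True if a 200 response body actually carries an upstream error."""
    return any(p.search(body) for p in UPSTREAM_ERROR_PATTERNS)
```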
OisinKyne approved these changes on Feb 24, 2026.
Summary
- SDK mismatch: `providers.json` had `"npm": "openai"` but `OpenAiProvider` registers with `sdk = "@ai-sdk/openai"`; the mismatch caused the provider to be silently auto-disabled at startup
- `stream_options` causing OpenAI 500: OpenClaw sends `stream_options` in requests; llmspy forces `stream=false` but didn't strip `stream_options`, causing OpenAI to reject with "stream_options is only allowed when stream is enabled". The init container now patches this at deploy time
- Image bump: `3.0.33-obol.2` to `3.0.34-obol.1`
- Integration tests: remove `max_tokens`, add a `requireLLMSpyProvider()` guard, add error pattern detection, add Google and Z.AI inference tests

Root Causes
Bug 1: npm SDK mismatch (llm.yaml line 93)
`create_provider()` in llmspy matches `provider.get("npm")` from `providers.json` against `provider_type.sdk` from the Python provider classes. Our ConfigMap override had `"npm": "openai"`, which overwrote the package's correct `"npm": "@ai-sdk/openai"` during the init container merge. No match → provider not created → auto-disabled.

Bug 2: stream_options passthrough (llm.yaml init container)
llmspy's `process_chat()` forces `chat["stream"] = False` (it collects full responses and re-chunks for streaming clients), but didn't remove `stream_options` from the request body. OpenAI strictly validates that `stream_options` requires `stream=true`. The init container now copies the llms package to a writable volume, patches `main.py` to add `chat.pop("stream_options", None)`, and `PYTHONPATH` is set to load the patched version.

Test plan
- Verified the `stream_options` fix via a direct llmspy test (HTTP 200 with `stream_options` after the patch)
- Verified `llms.json` (all 5 providers stay enabled after restart)
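The direct llmspy test can be replayed with a request body along these lines (helper name and model are illustrative; an unpatched gateway forwards `stream_options` with `stream=false` upstream and gets an error back, a patched one returns 200):

```python
import json

def build_probe_request(model="gpt-4o"):
    """Build a chat-completions body that exercises the stream_options fix."""
    return json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": "ping"}],
        # The field the patched gateway must strip before calling OpenAI.
        "stream_options": {"include_usage": True},
    }).encode()
```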