Labels: core (Anything pertaining to core functionality of the application (opencode server stuff))
Description
Summary
Requests to Z.AI API (api.z.ai) fail with ECONNRESET after ~40-100 seconds when using the zai-coding-plan provider. The same API works correctly via curl and in other tools (e.g., OpenClaw via VPS proxy).
Environment
- OpenCode: v1.2.15
- Platform: macOS (arm64)
- Provider: zai-coding-plan
- Model: glm-5
- Endpoint: https://api.z.ai/api/coding/paas/v4/chat/completions
Error Details
From ~/Library/Logs/ai.opencode.desktop/opencode-desktop_*.log:

```
ERROR service=llm providerID=zai-coding-plan modelID=glm-5 sessionID=ses_xxx small=false agent=build mode=primary
error={"error":{"code":"ECONNRESET","path":"https://api.z.ai/api/coding/paas/v4/chat/completions","errno":0}} stream error
```
Timing pattern of failures (from logs):
- Most failures occur after 39-100 seconds
- Some as long as 1000+ seconds
- 0 successful requests logged (101 ECONNRESET errors in one session)
Reproduction
1. Configure the zai-coding-plan provider with a valid API key via opencode auth login
2. Select the glm-5 model
3. Send any prompt
4. After ~40-100 seconds: ECONNRESET error
Verification that API works
Direct curl requests work fine:

```shell
curl -X POST "https://api.z.ai/api/coding/paas/v4/chat/completions" \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer <key>" \
  -d '{"model":"glm-5","messages":[{"role":"user","content":"hi"}],"stream":true}'
# Returns streaming response successfully
```

The same API also works when routed through a VPS proxy (tested with OpenClaw, which proxies through a France VPS).
Analysis
The issue appears to be in Bun's HTTP client used by the OpenCode sidecar:
- Bun's connection pooling: Bun aggressively reuses HTTP connections (keep-alive)
- Z.AI's idle timeout: the server likely closes idle connections after 30-60 seconds
- Race condition: when Bun tries to reuse a connection the server has already closed, the request fails with ECONNRESET
The sidecar process (opencode-cli serve) makes the actual API calls:

```
/Applications/OpenCode.app/Contents/MacOS/opencode-cli --print-logs --log-level WARN serve --hostname 127.0.0.1 --port XXXXX
```
Suggested Fix
- Disable keep-alive for Z.AI endpoints, or
- Add retry logic that opens a fresh connection on ECONNRESET, or
- Reduce connection pool lifetime for streaming endpoints
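The retry option could be sketched as a small wrapper that retries only on ECONNRESET, assuming the retried attempt lands on a fresh (non-pooled) connection. This is a hypothetical sketch, not OpenCode's actual code:

```typescript
type ErrnoLike = { code?: string };

// Hypothetical: retry a request a few times when it fails with
// ECONNRESET (likely a stale pooled connection), rethrow anything else.
export async function retryOnReset<T>(
  attempt: () => Promise<T>,
  maxRetries = 2,
): Promise<T> {
  for (let tries = 0; ; tries++) {
    try {
      return await attempt();
    } catch (err) {
      const code = (err as ErrnoLike).code;
      // Only a connection reset is worth retrying, and only a few times.
      if (code !== "ECONNRESET" || tries >= maxRetries) throw err;
    }
  }
}
```

Usage would look like `retryOnReset(() => fetch(url, init))`; note that for streaming responses a retry is only safe before any tokens have been forwarded to the caller.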
Workaround Attempts
- Setting the HTTPS_PROXY environment variable: doesn't help (the sidecar is a separate process)
- Bun config (~/.bunfig.toml): the OpenCode binary appears to ignore it
- VPN: the issue persists regardless of network route
Related
- Z.AI documentation for OpenCode: https://docs.z.ai/integrations/opencode