Conversation

@dzianisv commented Feb 1, 2026

Summary

When network issues occur mid-stream (TCP half-open connections, stalled LLM providers, proxy timeouts), the streaming loop can hang forever waiting for the next chunk. This fix adds an idle timeout that detects when no data has been received for a configurable period (default: 60 seconds) and triggers a retry with exponential backoff.

Problem

The for await (const value of stream.fullStream) loop in processor.ts has no idle timeout. If the network drops mid-stream, or the LLM provider stops sending data without closing the connection, the loop hangs indefinitely and the session can stay stuck for hours on a single LLM request.

Related issues: anomalyco#8383, anomalyco#2512, anomalyco#2819, anomalyco#4255
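
To make the failure mode concrete, here is a self-contained sketch of a stream that stalls after one chunk; consuming it with a plain for await never completes. This is an illustration only (stalledStream is a made-up name), not code from the repository:

// An async generator that emits one chunk and then stalls forever, mimicking
// a TCP half-open connection or a provider that stops sending data without
// closing the stream.
async function* stalledStream(): AsyncGenerator<string> {
  yield "first chunk"
  await new Promise<never>(() => {}) // never resolves
}

// Prints "first chunk", then hangs indefinitely: there is no limit on how
// long the loop will wait for the next chunk.
for await (const value of stalledStream()) {
  console.log(value)
}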

Solution

  • Add a StreamIdleTimeoutError class (in message-v2.ts) to signal stalled streams
  • Add a withIdleTimeout() async generator wrapper (in processor.ts) that races each chunk against a timeout (see the sketch after this list)
  • Wrap stream.fullStream with the idle timeout in the process loop
  • Mark StreamIdleTimeoutError as retryable so sessions recover automatically
  • Add a stream_idle_timeout config option under experimental settings (default: 60s)
  • Treat additional network error types (ETIMEDOUT, ENOTFOUND, etc.) as retryable
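
A minimal sketch of how the error class and wrapper described above can fit together. The merged implementation in message-v2.ts and processor.ts may differ in details (abort-signal handling, error metadata, option plumbing), so treat this as an outline of the technique rather than the exact code:

class StreamIdleTimeoutError extends Error {
  constructor(readonly idleMs: number) {
    super(`Stream idle for more than ${idleMs}ms`)
    this.name = "StreamIdleTimeoutError"
  }
}

async function* withIdleTimeout<T>(
  source: AsyncIterable<T>,
  idleMs: number,
): AsyncGenerator<T> {
  // A non-positive timeout disables the guard and passes chunks through untouched.
  if (idleMs <= 0) {
    yield* source
    return
  }
  const iterator = source[Symbol.asyncIterator]()
  while (true) {
    let timer: ReturnType<typeof setTimeout> | undefined
    const idle = new Promise<never>((_, reject) => {
      timer = setTimeout(() => reject(new StreamIdleTimeoutError(idleMs)), idleMs)
    })
    try {
      // Race the next chunk against the idle timer; whichever settles first wins.
      const result = await Promise.race([iterator.next(), idle])
      if (result.done) return
      yield result.value
    } finally {
      clearTimeout(timer)
    }
  }
}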

Testing

  • ✅ Unit tests for withIdleTimeout() logic (normal streams, stalled streams, slow streams, abort signals)
  • ✅ All 843 existing tests pass
  • ✅ TypeScript compilation passes for all 12 packages

Configuration

Users can adjust the timeout in their opencode.json:

{
  "experimental": {
    "stream_idle_timeout": 60000
  }
}

Set to 0 to disable the idle timeout entirely.
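
Building on the sketch above, the pieces might be wired together roughly as follows. The names config, startLLMStream, handleChunk, consumeWithRetry, the retry cap, and the backoff constants are all assumptions for illustration, not the code merged in this PR:

// Read the configured timeout; 60s by default, 0 disables the wrapper.
const idleMs = config.experimental?.stream_idle_timeout ?? 60_000

async function consumeWithRetry(maxAttempts = 5): Promise<void> {
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    const stream = await startLLMStream() // assumed helper that opens the provider stream
    try {
      for await (const value of withIdleTimeout(stream.fullStream, idleMs)) {
        handleChunk(value) // assumed helper for normal chunk processing
      }
      return // stream completed normally
    } catch (err) {
      if (!(err instanceof StreamIdleTimeoutError)) throw err
      // Retryable stall: back off exponentially (1s, 2s, 4s, ...) and try again.
      const delayMs = 1000 * 2 ** attempt
      await new Promise((resolve) => setTimeout(resolve, delayMs))
    }
  }
  throw new Error("stream kept stalling after retries")
}

Because StreamIdleTimeoutError is marked retryable, the existing session retry machinery picks the stall up automatically; the explicit loop here only illustrates the behavior.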


github-actions bot commented Feb 1, 2026

Thanks for your contribution!

This PR doesn't have a linked issue. All PRs must reference an existing issue.

Please:

  1. Open an issue describing the bug/feature (if one doesn't exist)
  2. Add Fixes #<number> or Closes #<number> to this PR description

See CONTRIBUTING.md for details.

@dzianisv merged commit 9f9df44 into dev on Feb 1, 2026
3 of 6 checks passed
@dzianisv deleted the fix/stream-idle-timeout branch on February 1, 2026 at 19:29