
TUI freezes/hangs after LLM streaming completes #15310

@coleleavitt

Description

The TUI freezes or hangs after LLM streaming completes. Several independent bugs contribute:

  1. Stream never terminates: the for await (stream.fullStream) loop in processor.ts breaks only on needsCompaction, never on the finish event. After the LLM finishes streaming, the loop keeps waiting for parts that never arrive, leaving the process hung at 0% CPU.

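A minimal sketch of the missing exit path, assuming the AI SDK's fullStream shape where each part carries a type field ('text-delta', 'finish', ...). drainStream and the 'needs-compaction' part type are illustrative stand-ins for the real processor.ts logic, not the actual code:

```typescript
// Illustrative part shape; the real SDK stream has more variants.
type StreamPart =
  | { type: "text-delta"; text: string }
  | { type: "finish" }
  | { type: "needs-compaction" };

async function drainStream(stream: AsyncIterable<StreamPart>): Promise<string> {
  let text = "";
  for await (const part of stream) {
    if (part.type === "text-delta") text += part.text;
    if (part.type === "needs-compaction") break; // existing exit path
    if (part.type === "finish") break; // the missing exit: without it the loop waits forever
  }
  return text;
}
```

Breaking out of a for await also calls the iterator's return(), so the underlying stream is closed rather than left dangling.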
  2. No timeout on client.listTools(): In mcp/index.ts, client.listTools() has no timeout wrapper. If an MCP server is unresponsive, this hangs indefinitely.

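One way to guard the call is a generic Promise.race timeout wrapper, sketched below; the helper name, the 5-second figure in the usage line, and the error message are assumptions, not values from the codebase:

```typescript
// Rejects if `promise` does not settle within `ms` milliseconds.
function withTimeout<T>(promise: Promise<T>, ms: number, label: string): Promise<T> {
  let timer: ReturnType<typeof setTimeout> | undefined;
  const timeout = new Promise<never>((_, reject) => {
    timer = setTimeout(
      () => reject(new Error(`${label} timed out after ${ms}ms`)),
      ms,
    );
  });
  // Clear the timer either way so it cannot keep the process alive.
  return Promise.race([promise, timeout]).finally(() => clearTimeout(timer));
}

// Hypothetical usage in mcp/index.ts:
// const tools = await withTimeout(client.listTools(), 5_000, "listTools");
```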
  3. Unhandled effect rejections: In db.ts, fire-and-forget effects (effect()) can reject silently, causing unhandled promise rejections that propagate unpredictably.

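The fix is to attach a rejection handler at the point the effect is launched. A sketch, where fireAndForget is a hypothetical helper (the report's effect() wrapper in db.ts would do the equivalent):

```typescript
// Launches `task` without awaiting it, but guarantees any rejection
// is routed to `onError` instead of becoming an unhandled rejection.
function fireAndForget(
  task: () => Promise<void>,
  onError: (err: unknown) => void,
): void {
  void task().catch(onError);
}
```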
  4. SDK batch window too tight: The 16ms event batch window in sdk.tsx causes excessive render churn during high-throughput streaming (10+ tokens/sec), amplifying GC pressure.

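One common mitigation for the last point is to coalesce events into a wider flush window so the UI renders once per window instead of once per token. The sketch below is a generic batcher, not the sdk.tsx code, and the 50ms default is an assumed value, not a measured one:

```typescript
// Buffers pushed events and flushes them as one batch per window.
class EventBatcher<T> {
  private buffer: T[] = [];
  private timer: ReturnType<typeof setTimeout> | null = null;

  constructor(
    private flush: (events: T[]) => void,
    private windowMs = 50, // assumed default; tune against render cost
  ) {}

  push(event: T): void {
    this.buffer.push(event);
    if (this.timer === null) {
      this.timer = setTimeout(() => {
        this.timer = null;
        const batch = this.buffer;
        this.buffer = [];
        this.flush(batch); // one render per window instead of one per event
      }, this.windowMs);
    }
  }
}
```

At 10+ tokens/sec, a 16ms window means nearly every token triggers its own flush; widening the window trades a little latency for far fewer renders and allocations.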
Steps to Reproduce

  1. Run opencode against a large project
  2. Ask the LLM a question that produces a long response
  3. Wait for streaming to complete
  4. TUI becomes unresponsive or hangs at 0% CPU

Expected Behavior

TUI should remain responsive during and after streaming, and the stream loop should exit cleanly on the finish event.

Labels

core: Anything pertaining to core functionality of the application (opencode server stuff)
perf: Indicates a performance issue or need for optimization
