Description
The TUI freezes or hangs after LLM streaming completes. Several independent bugs contribute:
- **Stream never terminates:** the `for await` loop over `stream.fullStream` in `processor.ts` doesn't break on the `finish` event; it only breaks on `needsCompaction`. After the LLM finishes streaming, the loop keeps waiting indefinitely, causing a 0% CPU hang.
- **No timeout on `client.listTools()`:** in `mcp/index.ts`, `client.listTools()` has no timeout wrapper. If an MCP server is unresponsive, this call hangs indefinitely.
- **Unhandled effect rejections:** in `db.ts`, fire-and-forget effects (`effect()`) can reject silently, causing unhandled promise rejections that propagate unpredictably.
- **SDK batch window too tight:** the 16ms event batch window in `sdk.tsx` causes excessive render churn during high-throughput streaming (10+ tokens/sec), amplifying GC pressure.
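A minimal sketch of a fix for the first bug. The event `type` names and the shape of `fullStream` are assumptions based on the issue text, not the real opencode types:

```typescript
// Hypothetical event shape; the real stream events in processor.ts may differ.
type StreamEvent =
  | { type: "text-delta"; text: string }
  | { type: "needsCompaction" }
  | { type: "finish" };

async function consumeStream(
  fullStream: AsyncIterable<StreamEvent>,
): Promise<string> {
  let out = "";
  for await (const event of fullStream) {
    if (event.type === "text-delta") out += event.text;
    if (event.type === "needsCompaction") break; // existing exit path
    if (event.type === "finish") break; // missing exit path: without this, the loop waits forever
  }
  return out;
}

// Simulated stream that emits `finish` but never closes, mimicking the
// 0% CPU hang: the source stays open after the LLM is done.
async function* hangingStream(): AsyncGenerator<StreamEvent> {
  yield { type: "text-delta", text: "hello" };
  yield { type: "finish" };
  await new Promise(() => {}); // never resolves
}
```

Because `break` calls the iterator's `return()`, the consumer exits cleanly even though the underlying source never completes.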
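For the second bug, one common pattern is racing the call against a timer. `withTimeout` is a hypothetical helper, not part of the MCP SDK, and the timeout budget would need tuning:

```typescript
// Hypothetical timeout wrapper for promises that may never settle.
function withTimeout<T>(
  promise: Promise<T>,
  ms: number,
  label: string,
): Promise<T> {
  let timer: ReturnType<typeof setTimeout> | undefined;
  const timeout = new Promise<never>((_, reject) => {
    timer = setTimeout(
      () => reject(new Error(`${label} timed out after ${ms}ms`)),
      ms,
    );
  });
  // Whichever settles first wins; always clear the timer so a pending
  // timeout doesn't keep the process alive after a fast success.
  return Promise.race([promise, timeout]).finally(() => clearTimeout(timer));
}
```

The call site in `mcp/index.ts` would then look something like `await withTimeout(client.listTools(), 5_000, "listTools")`, so an unresponsive MCP server produces an error instead of a hang.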
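For the third bug, the usual fix is to guarantee a rejection handler is attached wherever an effect is deliberately not awaited. `fireAndForget` and the error callback are stand-in names for illustration, not the `db.ts` API:

```typescript
// Sketch: route fire-and-forget failures to a handler instead of
// letting them surface as unhandled promise rejections.
function fireAndForget(
  task: () => Promise<void>,
  onError: (err: unknown) => void,
): void {
  // `void` signals the promise is intentionally not awaited; the
  // `.catch` attaches a handler before the promise can ever reject.
  void task().catch(onError);
}
```

With this pattern, a failing `effect()` is logged (or surfaced in the TUI) deterministically rather than crashing or propagating unpredictably.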
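For the fourth bug, the idea is to coalesce many stream events into one render per window. The 16ms figure comes from the issue; `createBatcher` and the flush-callback shape are assumptions about `sdk.tsx`, not its actual API:

```typescript
// Sketch: buffer events and flush them once per window, so one render
// serves many tokens instead of one render per token.
function createBatcher<T>(
  flush: (batch: T[]) => void,
  windowMs: number,
): (event: T) => void {
  let buffer: T[] = [];
  let timer: ReturnType<typeof setTimeout> | null = null;
  return (event: T) => {
    buffer.push(event);
    if (timer === null) {
      // The first event in a window starts the timer; later events in
      // the same window only join the buffer.
      timer = setTimeout(() => {
        const batch = buffer;
        buffer = [];
        timer = null;
        flush(batch);
      }, windowMs);
    }
  };
}
```

Widening the window from 16ms to, say, 33ms roughly halves renders at sustained token rates, trading about one frame of latency for less render churn and GC pressure.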
Steps to Reproduce
- Run opencode against a large project
- Ask the LLM a question that produces a long response
- Wait for streaming to complete
- TUI becomes unresponsive or hangs at 0% CPU
Expected Behavior
The TUI should remain responsive during and after streaming, and the stream loop should exit cleanly on the `finish` event.