fix(memory): add Bun.gc, stream cancellation, and unconsumed fetch drains #3416
waleedlatif1 merged 10 commits into staging
Conversation
PR Summary (Medium Risk)
Overview: Optimizes executor stream capture by replacing incremental string concatenation with chunk buffering. Also disables Next.js …
Written by Cursor Bugbot for commit 72eb892.
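The chunk-buffering change the summary describes can be sketched as follows. This is a minimal illustration, not the actual executor code; the helper name `captureStream` is hypothetical.

```typescript
// Sketch of the chunk-buffering pattern (illustrative, not the real executor API).
//
// Before: repeated `captured += chunk` reallocates the accumulated string on
// every chunk, which is O(n^2) in total bytes and churns the heap.
// After: buffer decoded chunks in an array and join exactly once at the end.
async function captureStream(stream: ReadableStream<Uint8Array>): Promise<string> {
  const decoder = new TextDecoder()
  const chunks: string[] = []
  const reader = stream.getReader()
  try {
    while (true) {
      const { done, value } = await reader.read()
      if (done) break
      // stream: true keeps multi-byte characters split across chunks intact
      chunks.push(decoder.decode(value, { stream: true }))
    }
    chunks.push(decoder.decode()) // flush any trailing partial character
    return chunks.join('')
  } finally {
    reader.releaseLock()
  }
}
```

The single `join('')` at the end allocates the final string once, which is the memory win over per-chunk concatenation.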
Greptile Summary
This PR addresses a cluster of memory and resource-leak fixes targeting the Bun/Next.js server: it adds periodic opportunistic GC and properly drains six unconsumed fetch responses.
Key changes:
Remaining concern:
Confidence Score: 4/5
Important Files Changed
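The two headline fixes, draining unconsumed fetch responses and opportunistic GC via `Bun.gc`, can be sketched as below. The helper names (`drainResponse`, `opportunisticGC`) are illustrative, not the PR's actual identifiers; the guard lets the same code run under Node, which lacks the `Bun` global.

```typescript
// Drain an unconsumed fetch Response so its underlying connection and
// buffered body are released instead of lingering until GC.
async function drainResponse(res: Response): Promise<void> {
  if (res.body && !res.bodyUsed) {
    // cancel() discards the body without buffering it into memory
    await res.body.cancel()
  }
}

// Opportunistic GC: Bun exposes Bun.gc(force). Guarded so the code is a
// no-op under Node, which has no Bun global.
function opportunisticGC(): void {
  const bun = (globalThis as any).Bun
  if (bun && typeof bun.gc === 'function') {
    bun.gc(false) // non-forced: let the runtime decide whether to sweep
  }
}
```

`body.cancel()` is preferable to `arrayBuffer()` for discarding, since it never materializes the payload.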
Sequence Diagram

```mermaid
sequenceDiagram
    participant Chat as chat.tsx
    participant Hook as use-workflow-execution.ts
    participant Stream as ReadableStream start()
    participant Exec as executeWorkflow (SSE)
    participant Server as /api/workflows/execute
    Chat->>Hook: handleRunWorkflow(chatInput)
    Hook->>Stream: new ReadableStream({ start() })
    Hook-->>Chat: { success: true, stream }
    Chat->>Chat: processStreamingResponse(stream)
    Stream->>Exec: executeWorkflow('chat')
    Exec->>Server: SSE connection
    Server-->>Exec: stream:chunk events
    Exec-->>Stream: safeEnqueue(chunk) via onStream
    Stream-->>Chat: reader.read() → chunk
    Server-->>Exec: execution:completed event
    Exec->>Hook: onExecutionCompleted (isExecutingFromChat=true → skip setIsExecuting)
    Note over Hook: setIsExecuting NOT called here
    Stream->>Stream: await Promise.all(streamReadingPromises)
    Stream->>Stream: safeEnqueue(final event)
    Stream->>Stream: controller.close()
    Stream->>Hook: finally: setIsExecuting(false)
    Chat->>Chat: reader.read() → done=true
    Chat->>Chat: finalizeMessageStream()
    Note over Chat: isStreaming=false
```
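The `safeEnqueue` guard in the diagram can be sketched as below. This is an assumed shape, not the PR's actual implementation: once the controller is closed or errored, late chunks from still-draining SSE handlers become silent no-ops instead of throwing.

```typescript
// Sketch of a guarded ReadableStream wrapper (names illustrative).
function createGuardedStream() {
  let closed = false
  let controller!: ReadableStreamDefaultController<string>
  const stream = new ReadableStream<string>({
    start(c) {
      controller = c // start() runs synchronously in the constructor
    },
  })
  const safeEnqueue = (chunk: string) => {
    if (closed) return // late chunk after close: drop silently
    try {
      controller.enqueue(chunk)
    } catch {
      closed = true // controller closed/errored underneath us
    }
  }
  const safeClose = () => {
    if (closed) return
    closed = true
    try {
      controller.close()
    } catch {}
  }
  return { stream, safeEnqueue, safeClose }
}
```

Without the guard, a chunk arriving after `controller.close()` throws a `TypeError` inside the SSE handler, which is exactly the race the diagram's ordering avoids.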
Last reviewed commit: 72eb892
The onExecutionCompleted/Error/Cancelled callbacks were setting isExecuting=false as soon as the server-side SSE stream completed. For chat executions, this triggered a useEffect in chat.tsx that cancelled the client-side stream reader before it finished consuming buffered data, causing empty or partial chat responses. These callbacks now skip setting isExecuting=false for chat executions, since the chat's own finally block handles cleanup after the stream is fully consumed.
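The fix described above can be sketched as follows; the store shape is illustrative, not the repository's actual execution store.

```typescript
// Sketch of the callback-side fix (store type is hypothetical).
type ExecutionStore = { setIsExecuting: (id: string, v: boolean) => void }

function onExecutionCompleted(
  store: ExecutionStore,
  workflowId: string,
  isExecutingFromChat: boolean
): void {
  if (!isExecutingFromChat) {
    // Non-chat runs: safe to flip the flag as soon as the SSE completes.
    store.setIsExecuting(workflowId, false)
  }
  // Chat runs: the ReadableStream start() finally block flips the flag
  // only after the client has fully drained the buffered chunks.
}
```

Keeping one cleanup point (the stream's `finally`) for the chat path removes the race between server completion and client consumption.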
…tate change
The effect reacted to isExecuting becoming false to clean up streams, but using state changes as a proxy for events is an anti-pattern per React guidelines. All cleanup cases are already handled by proper event paths: stream done (processStreamingResponse), user cancel (handleStopStreaming), component unmount (cleanup effect), and abort/error (catch block).
```typescript
if (!isExecutingFromChat) {
  setIsExecuting(activeWorkflowId, false)
  setActiveBlocks(activeWorkflowId, new Set())
}
```
setActiveBlocks still not cleared for chat executions on onExecutionCompleted
When isExecutingFromChat is true, setActiveBlocks(activeWorkflowId, new Set()) is skipped here and is deferred to the stream start() finally block (line 1042). However, there is a window between when onExecutionCompleted fires and when start() reaches its finally block (e.g., while await Promise.all(streamReadingPromises) is still running). During that window, the active blocks set remains non-empty in the execution store, which may cause the UI to continue showing block highlights after the server-side execution is done.
If clearing active blocks earlier is acceptable for the chat case, consider calling setActiveBlocks unconditionally here just like setExecutionResult is, since block highlighting doesn't depend on the client stream being fully consumed.
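The reviewer's suggestion can be sketched as below; the store shape is illustrative, not the repository's actual code.

```typescript
// Sketch of the suggested change (store type is hypothetical): clear block
// highlights unconditionally on completion, since highlighting does not
// depend on the client stream being drained, while isExecuting still
// defers to the stream's finally block for chat runs.
type Store = {
  setIsExecuting: (id: string, v: boolean) => void
  setActiveBlocks: (id: string, blocks: Set<string>) => void
}

function onExecutionCompletedSuggested(
  store: Store,
  workflowId: string,
  isExecutingFromChat: boolean
): void {
  // Unconditional: closes the highlight window the comment describes.
  store.setActiveBlocks(workflowId, new Set())
  if (!isExecutingFromChat) {
    store.setIsExecuting(workflowId, false)
  }
}
```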
Summary
Type of Change
Testing
All 4216 tests pass. Verified tsc --noEmit clean.
Checklist