
fix(chat): implement stop generation button functionality#1157

Merged
fbricon merged 1 commit into kortex-hub:main from fbricon:GH-535
Mar 26, 2026

Conversation

Contributor

@fbricon fbricon commented Mar 23, 2026

  • Wire abort signal through IPC layer to cancel AI response streams
  • Make inferenceStreamText return onDataId synchronously for immediate abort listener registration
  • Store AbortController per stream with chatId tracking
  • Abort active streams when chat is deleted
  • Add E2E test [CHAT-17] for stop generation
Demo: stop-message.mp4

Fixes #535

Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>
Signed-off-by: Fred Bricon <fbricon@gmail.com>

@fbricon fbricon requested a review from a team as a code owner March 23, 2026 17:14
@fbricon fbricon requested review from benoitf and bmahabirbu and removed request for a team March 23, 2026 17:14

coderabbitai bot commented Mar 23, 2026

No actionable comments were generated in the recent review. 🎉

ℹ️ Recent review info
⚙️ Run configuration

Configuration used: Organization UI

Review profile: CHILL

Plan: Pro

Run ID: e1b1bbf5-699b-4eae-982f-a86374fd3b32

📥 Commits

Reviewing files that changed from the base of the PR and between d6588e0 and 0b2a3d4.

📒 Files selected for processing (5)
  • packages/main/src/chat/chat-manager.ts
  • packages/preload/src/index.ts
  • packages/renderer/src/lib/chat/components/ipc-chat-transport.ts
  • tests/playwright/src/model/pages/chat-page.ts
  • tests/playwright/src/specs/provider-specs/chat-smoke.spec.ts
🚧 Files skipped from review as they are similar to previous changes (4)
  • packages/main/src/chat/chat-manager.ts
  • packages/renderer/src/lib/chat/components/ipc-chat-transport.ts
  • tests/playwright/src/specs/provider-specs/chat-smoke.spec.ts
  • packages/preload/src/index.ts

📝 Walkthrough

Walkthrough

Main process now tracks active inference streams with AbortControllers and exposes an IPC to stop them; preload returns stream callback IDs synchronously and exposes a stop API; renderer starts streams without awaiting and forwards aborts to preload; tests add a stop-button helper and a smoke test for cancelling generation.

Changes

  • Main Process Stream Management (packages/main/src/chat/chat-manager.ts): Added an activeStreams map, registered the inference:stopStream IPC handler, and added stopStream(onDataId) and stopStreamsByChat(chatId) helpers; attaches an AbortController to streamText calls, ensures cleanup (deleting the activeStreams entry) and emits inference:streamText-onEnd in finally; aborts active streams on chat deletion.
  • IPC / Preload Bridge (packages/preload/src/index.ts): inferenceStreamText now returns the callback id synchronously after registering callbacks and invokes ipcInvoke('inference:streamText', ...) without awaiting, with .catch cleanup calling the stored onError/onEnd. Added inferenceStopStream(onDataId), which forwards stop requests to main.
  • Renderer Stream Transport (packages/renderer/src/lib/chat/components/ipc-chat-transport.ts): Made the ReadableStream start callback synchronous; captures options.abortSignal; calls window.inferenceStreamText() without awaiting to get the onDataId; calls window.inferenceStopStream(onDataId) immediately if already aborted, or on the abort event (errors logged).
  • Playwright Tests (tests/playwright/src/model/pages/chat-page.ts, tests/playwright/src/specs/provider-specs/chat-smoke.spec.ts): Added ChatPage.clickStopButton() and a serial smoke test, [CHAT-17] Stop generation cancels the AI response stream, that asserts mid-generation UI state, triggers cancellation, and verifies the UI returns to ready state while preserving the user message.
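The preload pattern summarized above (return the callback id synchronously, fire the IPC call without awaiting) can be sketched as follows. This is an illustrative, self-contained stand-in, not the PR's actual code: the map and function names mirror the walkthrough, and ipcInvoke is stubbed out.

```typescript
type StreamCallbacks = {
  onData: (chunk: string) => void;
  onError: (err: string) => void;
  onEnd: () => void;
};

const onDataCallbacksStreamText = new Map<number, StreamCallbacks>();
let onDataCallbacksStreamTextId = 0;

// Stand-in for the real IPC bridge (e.g. Electron's ipcRenderer.invoke).
async function ipcInvoke(_channel: string, _id: number): Promise<void> {}

function inferenceStreamText(callbacks: StreamCallbacks): number {
  const id = ++onDataCallbacksStreamTextId;
  onDataCallbacksStreamText.set(id, callbacks);
  // Fire and forget: the IPC call is NOT awaited, so the id is returned
  // synchronously and the caller can register abort listeners immediately.
  ipcInvoke('inference:streamText', id).catch((err: unknown) => {
    const cb = onDataCallbacksStreamText.get(id);
    if (cb) {
      cb.onError(String(err));
      onDataCallbacksStreamText.delete(id); // avoid leaking the entry
      cb.onEnd(); // finalize the stream even on failure
    }
  });
  return id;
}
```

The key point is that nothing before `return id` awaits, so the renderer holds a valid handle for inferenceStopStream(onDataId) before the main process has even started the stream.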

Sequence Diagram(s)

sequenceDiagram
    actor User
    participant Renderer as Renderer Process
    participant Preload as Preload/IPC Bridge
    participant Main as Main Process
    participant API as Inference API

    User->>Renderer: Start text generation
    Renderer->>Preload: inferenceStreamText(params, callbacks)
    Preload->>Main: ipcInvoke('inference:streamText', ...)
    Main->>Main: create AbortController, store in activeStreams(onDataId)
    Main->>API: streamText(..., abortSignal)
    API-->>Main: stream chunks
    Main-->>Preload: inference:streamText-onData (chunks)
    Preload-->>Renderer: onChunk callback

    User->>Renderer: Click Stop
    Renderer->>Preload: inferenceStopStream(onDataId)
    Preload->>Main: ipcInvoke('inference:stopStream', onDataId)
    Main->>Main: abort controller (activeStreams)
    Main->>API: abort stream
    API-->>Main: stream ended
    Main-->>Preload: inference:streamText-onEnd
    Preload-->>Renderer: onEnd callback

Estimated code review effort

🎯 3 (Moderate) | ⏱️ ~20 minutes

🚥 Pre-merge checks: ✅ 5 passed
  • Title check: ✅ Passed. The title clearly and specifically summarizes the main change: implementing stop generation button functionality, which is the primary objective of the PR.
  • Description check: ✅ Passed. The description is directly related to the changeset, detailing the key implementation aspects of stop generation functionality, including abort signals, synchronous returns, AbortController storage, stream cleanup, and E2E tests.
  • Linked Issues check: ✅ Passed. The PR fully addresses issue #535 by implementing the stop generation button functionality through abort signal wiring, IPC layer changes, stream tracking with AbortController, and E2E testing for the feature.
  • Out of Scope Changes check: ✅ Passed. All changes are within scope: the abort signal implementation in ChatManager, IPC layer modifications for stream control, UI transport updates for abort handling, and E2E tests all directly support the stop generation button functionality.
  • Docstring Coverage: ✅ Passed. Docstring coverage is 100.00%, which is sufficient; the required threshold is 80.00%.

✏️ Tip: You can configure your own custom pre-merge checks in the settings.


Thanks for using CodeRabbit! It's free for OSS, and your support helps us grow. If you like it, consider giving us a shout-out.


Comment @coderabbitai help to get the list of available commands and usage tips.

@codecov

codecov bot commented Mar 23, 2026

Codecov Report

❌ Patch coverage is 2.70270% with 72 lines in your changes missing coverage. Please review.

Files with missing lines:
  • packages/main/src/chat/chat-manager.ts: 1.96% patch coverage, 50 lines missing ⚠️
  • packages/preload/src/index.ts: 7.69% patch coverage, 12 lines missing ⚠️
  • ...erer/src/lib/chat/components/ipc-chat-transport.ts: 0.00% patch coverage, 10 lines missing ⚠️

📢 Thoughts on this report? Let us know!

Contributor

@bmahabirbu bmahabirbu left a comment


LGTM. One question I had, and it'll be all set.

Comment on lines +342 to +343

  this.activeStreams.delete(params.onDataId);
  this.webContents.send('inference:streamText-onEnd', params.onDataId);
Contributor


activeStreams.delete() in stopStream and the one in the finally block can both run, so it's doing a double delete? Should be fine, but just a question.

Contributor Author


So I asked Claude: why not just rely on the code in the finally block?


The Real Issue: Event Loop Timing

The timing window is about async execution order in the event loop:

  private stopStream(onDataId: number): void {
    const stream = this.activeStreams.get(onDataId);
    if (stream) {
      stream.controller.abort();
      // No delete here - relies on finally
    }
  }

What actually happens:

  1. stopStream() called (synchronous)
  2. controller.abort() triggers abort signal (synchronous)
  3. stopStream() returns (synchronous)
  4. Event loop continues - other code can execute
  5. Eventually, the stream's reader.read() Promise rejects/resolves
  6. finally block runs

Between steps 3-6, the map still has the entry even though we've "stopped" it.

Realistic Problematic Scenario

// Renderer calls: stop stream
await window.inferenceStopStream(123);
// ^ This returns immediately after stopStream() executes

// Immediately after (same tick or next tick):
await window.deleteChat(chatId);
→ stopStreamsByChat(chatId) runs
→ iterates activeStreams
→ finds entry 123 (STILL THERE - finally hasn't run yet!)
→ aborts the already-aborted controller
→ deletes entry

// Later (when microtask queue processes):
// The finally block from the original stream runs
→ tries to delete (already deleted)
→ sends onEnd event

The issue is async ordering. The finally block runs "eventually" when the Promise settles, but synchronous code keeps executing.

Why Immediate Delete Matters

With delete in stopStream():

  private stopStream(onDataId: number): void {
    const stream = this.activeStreams.get(onDataId);
    if (stream) {
      stream.controller.abort();
      this.activeStreams.delete(onDataId);  // Synchronous cleanup
    }
  }

Now:
window.inferenceStopStream(123);
// Map entry deleted IMMEDIATELY (synchronous)

window.deleteChat(chatId);
→ stopStreamsByChat(chatId)
→ iterates activeStreams
→ entry 123 is GONE
→ nothing to process

The Real Benefits

  1. Synchronous state consistency: The map is correct immediately after stopStream() returns, not "eventually"
  2. Prevents redundant operations: stopStreamsByChat() won't re-abort already-stopped streams
  3. Predictable state: Caller can trust the map state without waiting for Promises to settle
  4. finally as safety net: Catches cases we didn't explicitly stop (natural completion, errors)
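Putting that explanation together, a minimal self-contained sketch of the registry might look like this. The StreamRegistry class and its method names are illustrative; in the actual PR this logic lives inside ChatManager.

```typescript
type ActiveStream = { controller: AbortController; chatId: string };

class StreamRegistry {
  private activeStreams = new Map<number, ActiveStream>();

  register(onDataId: number, chatId: string): AbortController {
    const controller = new AbortController();
    this.activeStreams.set(onDataId, { controller, chatId });
    return controller;
  }

  stopStream(onDataId: number): void {
    const stream = this.activeStreams.get(onDataId);
    if (stream) {
      stream.controller.abort();
      // Synchronous cleanup: the map is consistent as soon as we return,
      // so stopStreamsByChat never re-aborts an already-stopped stream.
      this.activeStreams.delete(onDataId);
    }
  }

  stopStreamsByChat(chatId: string): number {
    let stopped = 0;
    for (const [id, stream] of this.activeStreams) {
      if (stream.chatId === chatId) {
        stream.controller.abort();
        this.activeStreams.delete(id); // safe during Map iteration in JS
        stopped++;
      }
    }
    return stopped;
  }

  has(onDataId: number): boolean {
    return this.activeStreams.has(onDataId);
  }
}
```

With the immediate delete in place, a stopStream call followed in the same tick by a chat deletion finds no leftover entry; the finally block in the stream body then acts purely as a safety net for natural completion and errors.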

Contributor


That looks fine to me!


@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 1

Caution

Some comments are outside the diff and can’t be posted inline due to platform limitations.

⚠️ Outside diff range comments (1)
packages/main/src/chat/chat-manager.ts (1)

261-343: ⚠️ Potential issue | 🔴 Critical

Make cancellation observable before the async setup starts.

activeStreams.set() is not reached until Line 307, after multiple awaits. If the renderer calls inference:stopStream or deletes the chat while streamText() is still in setup, both stop paths miss the entry and the generation can still start; for a brand-new chat, the remaining setup can even recreate a chat the user just deleted. The finally at Line 341 is also too narrow, because anything that throws before getReader() skips the cleanup and never emits onEnd. Create/store the controller before the first await, and re-check signal.aborted before mutating chat state or starting the stream.
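A minimal sketch of the fix the bot suggests: create and register the controller before the first await, re-check the signal after each awaited setup step, and keep cleanup in a finally broad enough to cover early aborts and throws. The setup steps here are placeholders standing in for saveChat, getInferenceComponents, and the stream start; none of this is the actual chat-manager code.

```typescript
const activeStreams = new Map<number, AbortController>();

async function streamText(
  onDataId: number,
  setupSteps: Array<() => Promise<void>>,
): Promise<string> {
  // Controller exists and is observable BEFORE any await, so a stop
  // request arriving during async setup can still find the entry.
  const controller = new AbortController();
  activeStreams.set(onDataId, controller);
  try {
    for (const step of setupSteps) {
      await step();
      // Re-check after every awaited step: bail out before mutating
      // further chat state if the user cancelled mid-setup.
      if (controller.signal.aborted) return 'aborted';
    }
    return 'completed';
  } finally {
    // Broad finally: cleanup runs for early aborts and throws alike.
    activeStreams.delete(onDataId);
  }
}
```

A caller that aborts during the first setup step sees the function return 'aborted' without ever reaching the stream, and the map entry is still cleaned up by the finally.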

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@packages/main/src/chat/chat-manager.ts` around lines 261 - 343, streamText
creates and registers the AbortController too late and its final cleanup is too
narrow, so cancellations during the async setup or errors before getReader() are
missed; fix by creating the AbortController at the top of streamText (before any
awaits), immediately storing it in activeStreams with params.onDataId, and then
after each awaited step that may mutate chat state (e.g. before calling
this.chatQueries.saveChat, before calling this.generateTitleInBackground, and
before starting the stream from streamText(...) / model inference) re-check
controller.signal.aborted and bail out if set; also broaden the try/finally so
the activeStreams.delete(params.onDataId) and
this.webContents.send('inference:streamText-onEnd', ...) run for any early throw
or abort to guarantee cleanup.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In `@packages/preload/src/index.ts`:
- Around line 1398-1415: The catch in the inferenceStreamText handler currently
only calls callback.onError and leaves the entry in onDataCallbacksStreamText,
preventing normal termination; update the catch block for the
ipcInvoke('inference:streamText') call so that after calling callback.onError it
also removes the callback from onDataCallbacksStreamText (using
onDataCallbacksStreamText.delete(id)) and invokes callback.onEnd() to finalize
the stream; ensure you reference onDataCallbacksStreamTextId and the returned id
variable as currently used so the correct map entry is cleaned up.

🪄 Autofix (Beta)

Fix all unresolved CodeRabbit comments on this PR:

  • Push a commit to this branch (recommended)
  • Create a new PR with the fixes

ℹ️ Review info
⚙️ Run configuration

Configuration used: Organization UI

Review profile: CHILL

Plan: Pro

Run ID: 96a0065e-fabe-4c51-82c1-f572e6e6dc9c

📥 Commits

Reviewing files that changed from the base of the PR and between 42e9233 and 1d70bc2.

📒 Files selected for processing (5)
  • packages/main/src/chat/chat-manager.ts
  • packages/preload/src/index.ts
  • packages/renderer/src/lib/chat/components/ipc-chat-transport.ts
  • tests/playwright/src/model/pages/chat-page.ts
  • tests/playwright/src/specs/provider-specs/chat-smoke.spec.ts
🚧 Files skipped from review as they are similar to previous changes (3)
  • tests/playwright/src/model/pages/chat-page.ts
  • tests/playwright/src/specs/provider-specs/chat-smoke.spec.ts
  • packages/renderer/src/lib/chat/components/ipc-chat-transport.ts

@fbricon fbricon force-pushed the GH-535 branch 4 times, most recently from 031c02b to d6588e0, March 26, 2026 15:26
- Wire abort signal through IPC layer to cancel AI response streams
- Make inferenceStreamText return onDataId synchronously for immediate abort listener registration
- Create AbortController early (before awaits) to handle cancellations during async setup
- Add abort checks after each async operation (saveChat, getInferenceComponents, saveMessages)
- Store AbortController per stream with chatId tracking
- Abort active streams when chat is deleted
- Add E2E test [CHAT-17] for stop generation

Fixes kortex-hub#535

Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>
Signed-off-by: Fred Bricon <fbricon@gmail.com>
Contributor

@MarsKubeX MarsKubeX left a comment


LGTM. Tested on macOS. Thanks for the fix.

@fbricon fbricon merged commit 492bf13 into kortex-hub:main Mar 26, 2026
15 checks passed


Development

Successfully merging this pull request may close these issues.

The "Stop Generation" button is unresponsive and does not cancel the AI's response stream

3 participants