@ammar-agent commented Oct 24, 2025

Fixes the retry barrier flash that appeared briefly when sending a message, before the stream-start event arrived.

Problem

When sending a message, the retry barrier would briefly flash before disappearing:

  1. User message added to history (optimistic update)
  2. hasInterruptedStream() returns true for trailing user messages
  3. canInterrupt is still false (no active stream yet)
  4. Retry barrier renders briefly until stream starts

Solution

Track pendingStreamStart flag explicitly in the aggregator:

  • Set true when user message is received
  • Set false when stream-start event arrives
  • 30s timeout if stream never starts (safety net for truly stuck requests)
  • hasInterruptedStream() checks this flag to prevent flash

This is semantically correct: we explicitly track the window between "user message added" and "stream-start received" rather than inferring it from message timestamps.
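A minimal sketch of the flag lifecycle described above (class and method names are illustrative, not the actual cmux implementation):

```typescript
// Illustrative sketch of pendingStreamStart tracking: set on user message,
// cleared on stream-start, with a 30s safety-net timeout.
class StreamingMessageAggregator {
  private pendingStreamStart = false;
  private pendingTimer: ReturnType<typeof setTimeout> | null = null;

  handleUserMessage(): void {
    // Entering the "waiting for stream-start" window.
    this.pendingStreamStart = true;
    if (this.pendingTimer) clearTimeout(this.pendingTimer);
    // Safety net: if the stream never starts, stop suppressing the barrier.
    this.pendingTimer = setTimeout(() => {
      this.pendingStreamStart = false;
      this.pendingTimer = null;
    }, 30_000);
  }

  handleStreamStart(): void {
    // Stream arrived: the window is closed, cancel the safety net.
    this.pendingStreamStart = false;
    if (this.pendingTimer) {
      clearTimeout(this.pendingTimer);
      this.pendingTimer = null;
    }
  }

  isPendingStreamStart(): boolean {
    return this.pendingStreamStart;
  }
}
```

With this in place, hasInterruptedStream() can simply return false while the flag is set, so the barrier never renders during the normal send flow.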

Changes

  • StreamingMessageAggregator: Added pendingStreamStart flag with 30s timeout
  • WorkspaceStore: Exposed flag via workspace state
  • retryEligibility: Removed time-based hacks, added pendingStreamStart parameter
  • AIView: Removed forced re-render timer hack, use flag from state
  • Tests: Updated to test flag behavior instead of timestamps

Benefits

  • Semantically correct: explicitly models the state we care about
  • No frontend timer hacks: state updates drive the UI naturally
  • Cleaner architecture: removed janky workarounds
  • 30s timeout is meaningful ("truly stuck") vs arbitrary ("hide for 2s")

Generated with cmux

@chatgpt-codex-connector (bot) left a comment

💡 Codex Review

Here are some automated review suggestions for this pull request.


When sending a message, the retry barrier would briefly appear before
the stream-start event arrived. This happened because:

1. User message added to history immediately
2. hasInterruptedStream() returns true for trailing user messages
3. canInterrupt is still false (no active stream yet)
4. Retry barrier renders briefly until stream starts

Fixed by ignoring very recent user messages (<2s) in
hasInterruptedStream(). This prevents the flash during normal send flow
while still catching truly interrupted streams after app restarts or
slow model responses.

Updated tests to verify the time-based behavior.
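In code, this interim age check amounted to something like the sketch below; the message shape, parameter names, and the 2s constant are assumptions for illustration:

```typescript
// Interim time-based fix (later replaced): a trailing user message only
// counts as an interrupted stream once it is at least 2s old.
interface ChatMessage {
  role: "user" | "assistant";
  timestamp: number; // epoch ms
}

function hasInterruptedStream(
  messages: ChatMessage[],
  canInterrupt: boolean,
  now: number = Date.now()
): boolean {
  if (canInterrupt) return false; // an active stream is not "interrupted"
  const last = messages[messages.length - 1];
  if (!last || last.role !== "user") return false;
  // Ignore very recent user messages to avoid the flash on send.
  return now - last.timestamp >= 2_000;
}
```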

The initial fix prevented the flash but introduced a regression: if a send
truly hangs (no stream-start event arrives), the retry barrier would never
appear because React doesn't re-render just because time passes.

Fixed by adding a timer that forces a re-render 2.1s after a user message
is sent. This ensures the retry barrier appears even if the stream never
starts, while still preventing the flash during normal operation.
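Stripped of the React specifics, the timer hack amounted to scheduling a callback just past the 2s window so the component re-evaluates the barrier. The helper below is a hypothetical sketch, not the actual AIView code:

```typescript
// Schedule a forced re-render shortly after the 2s suppression window
// expires, so the barrier can appear even if no state update arrives.
function scheduleBarrierRecheck(
  lastUserMessageTs: number, // epoch ms of the trailing user message
  forceRender: () => void,   // e.g. a setState bump inside useEffect
  now: number = Date.now()
): ReturnType<typeof setTimeout> | null {
  const delay = lastUserMessageTs + 2_100 - now;
  if (delay <= 0) {
    return null; // window already passed; nothing to schedule
  }
  return setTimeout(forceRender, delay);
}
```

In the real component this would live in a useEffect whose cleanup calls clearTimeout, which is exactly the kind of timer management the final approach removes.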

React hooks must be called in the same order on every render, so they
cannot be placed after conditional returns. Moved the timer useEffect
before the early return and adjusted it to safely check workspaceState.

The original fix used time-based hacks (checking message age, forced
re-renders with timers) to prevent retry barrier flash. This was janky
and semantically incorrect.

New approach:
- Track pendingStreamStart flag in StreamingMessageAggregator
- Set true when user message is received
- Set false when stream-start arrives
- 30s timeout if stream never starts (safety net)
- hasInterruptedStream() checks this flag instead of message age

Benefits:
- Semantically correct: explicitly tracks "waiting for stream-start"
- No frontend timer hacks: state updates drive the UI naturally
- Cleaner: removed ~24 LoC of hacky code, added ~60 LoC of proper state
- The 30s timeout is meaningful (truly stuck) vs arbitrary ("hide for 2s")

Changes:
- StreamingMessageAggregator: +25 LoC (flag, timer, handlers)
- WorkspaceStore: +3 LoC (expose via state)
- retryEligibility: -14 LoC (removed time checks, added flag param)
- AIView: -27 LoC (removed timer hack)
- Tests: Updated to use flag instead of timestamps

Net: +40 LoC but much cleaner architecture.

Replace the boolean flag with a timestamp to gracefully handle stale pending states.
Instead of a 30s timeout with timer cleanup, check if timestamp is recent (<3s)
when evaluating retry eligibility.

Benefits:
- No timer management/cleanup needed
- UI naturally re-evaluates on re-render
- More graceful: shows the barrier after 3s if the stream truly hung
- Handles edge cases (app restart, etc.) better

Changes:
- StreamingMessageAggregator: Store timestamp instead of bool, remove timer
- retryEligibility: Check if timestamp < 3s old to prevent flash
- WorkspaceStore: Expose pendingStreamStartTime instead of bool
- AIView/useResumeManager: Pass timestamp to hasInterruptedStream
- Tests: Updated to verify timestamp-based logic (11 tests passing)
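A self-contained sketch of the merged timestamp approach, with hypothetical names (the real aggregator and retryEligibility module carry more state):

```typescript
// Final approach: store when the user message arrived; retry eligibility
// checks recency at evaluation time, so no timer management is needed.
class StreamingMessageAggregator {
  private pendingStreamStartTime: number | null = null;

  handleUserMessage(now: number = Date.now()): void {
    this.pendingStreamStartTime = now; // opening the wait window
  }

  handleStreamStart(): void {
    this.pendingStreamStartTime = null; // window closed normally
  }

  getPendingStreamStartTime(): number | null {
    return this.pendingStreamStartTime;
  }
}

function hasInterruptedStream(
  trailingUserMessage: boolean,
  canInterrupt: boolean,
  pendingStreamStartTime: number | null,
  now: number = Date.now()
): boolean {
  if (!trailingUserMessage || canInterrupt) return false;
  // Within 3s of send, assume stream-start is on the way: no flash.
  if (pendingStreamStartTime !== null && now - pendingStreamStartTime < 3_000) {
    return false;
  }
  // Stale timestamp (or none at all, e.g. after app restart): show barrier.
  return true;
}
```

Because the check reads the timestamp at render time, any re-render after the 3s mark naturally flips the result without a dedicated timer.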

Compute showRetryBarrier once before the early return and use it for both
keybinds and UI rendering. Eliminates a duplicate hasInterruptedStream call.

@ammario added this pull request to the merge queue Oct 25, 2025
Merged via the queue into main with commit feb4da9 Oct 25, 2025
13 checks passed
@ammario deleted the interrupt-flash branch October 25, 2025 15:58