Conversation
📝 Walkthrough

Summary by CodeRabbit: replaces local signal-based chat state with a controller-driven design. Adds a chat state machine and createChatController; ChatProvider now constructs and publishes a ChatController; multiple components were updated to use controller accessors/events for send/stream lifecycle and paywall handling.

🚥 Pre-merge checks: ✅ Passed checks (2 passed)

✏️ Tip: You can configure your own custom pre-merge checks in the settings.
Actionable comments posted: 6
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@js/app/packages/block-chat/component/Chat.tsx`:
- Around line 101-105: The createEffect block synchronizes isGenerating and
already calls invalidateUserQuota() when chat.isGenerating() becomes true, so
remove the redundant explicit invalidateUserQuota() call from the attachStream
path: locate the attachStream-related code that calls invalidateUserQuota() (the
call near the attachStream handling) and delete that call so only the
createEffect (with input.setIsGenerating and chat.isGenerating) triggers quota
invalidation; ensure createEffect, input.setIsGenerating, chat.isGenerating, and
invalidateUserQuota remain unchanged.
In `@js/app/packages/core/component/AI/state/chatState.ts`:
- Around line 57-167: Replace the large switch(event.type) block with a
ts-pattern match on event (import match from 'ts-pattern') and convert each case
('send_started', 'send_failed', 'stream_connected', 'stream_user_message',
'stream_done', 'stream_error') into corresponding .with(...) handlers that
return the same objects (preserve phase, messages, effects, optimisticMessageId
and the exact message-patching logic), then finish with .exhaustive() to ensure
compile-time exhaustiveness; keep all existing guards (phase.type checks and
console.warn calls) and the same behavior for optimistic ID handling and
stream_done/message append logic so only the switch-to-match surface is changed.
- Around line 73-82: On send_failed you need to remove the optimistic user
message that was added during send_started to avoid leaving a ghost message; in
the case handling for 'send_failed' (inside the reducer handling phase.type
'sending') update the returned phase.messages to drop the last optimistic
message (or filter out messages with the optimistic flag/temporary id) before
returning { phase: { type: 'idle', messages: [...] }, effects }, and preserve
the existing show_paywall effect logic when event.paymentError is true. Ensure
you reference the same message identification used when creating the optimistic
message (e.g., its temporary id or an optimistic boolean) so only the optimistic
message is removed.
In `@js/app/packages/core/component/AI/state/createChatController.ts`:
- Around line 82-90: The handler that reacts to `latest` needs a guard so
nothing is processed after a stream error has been raised: when you detect
`latest.type === 'error'` and dispatch the `type: 'stream_error'` action, also
mark the stream as finished (set the controller/state `isDone` flag) so
subsequent effect runs bail out early; implement this by either including
`isDone: true` in the same `stream_error` dispatch or immediately dispatching
the existing `done`/`isDone` action after dispatching `stream_error`, and add an
early return guard that checks `state.isDone` (or `state.streamError`) at the
top of the effect that processes `latest` so no further events are handled once
an error was emitted.
- Around line 112-137: The reconnection effect (createEffect) may re-attach
stale/errored streams because isDone() can be false for streams that errored;
update the loop that iterates getEntityStreams('chat', chatId) and before
calling attachStream(s) add an explicit check on the candidate stream's payload
for error indicators (e.g., inspect s's data/items or an error flag on s for
entries like type === 'error' or a non-null error field) and skip attaching if
any error item exists or a stream-level error is present; keep the existing
checks (currentStream?.isDone(), id, and messages() membership) and place the
new "has error" check just before attachStream to avoid re-attaching errored
streams.
- Around line 104-109: The current createEffect watching newStream.isDone() can
hang if no stream_end arrives; update the stream controller to detect network
failures/timeouts and force completion: add a fallback watcher (e.g., a timeout
or an observable on newStream's connection/error state) that checks for stalled
activity and, on timeout or connection error, builds a final message via
asChatMessage(newStream.data()) and dispatches a terminal action (e.g.,
dispatch({ type: 'stream_error' }) or dispatch({ type: 'stream_done', message
})) and mark the stream as completed so the controller exits the streaming
phase; ensure this logic ties into the same newStream instance and does not rely
solely on isDone() so silent disconnects are handled.
ℹ️ Review info
⚙️ Run configuration
Configuration used: Organization UI
Review profile: ASSERTIVE
Plan: Pro
Run ID: 3cc626bd-31d5-43d0-996e-5540e39b0a38
📒 Files selected for processing (7)

- js/app/packages/block-chat/component/Chat.tsx
- js/app/packages/core/component/AI/component/debug/Component.tsx
- js/app/packages/core/component/AI/component/message/ChatMessages.tsx
- js/app/packages/core/component/AI/context/ChatContext.tsx
- js/app/packages/core/component/AI/context/index.ts
- js/app/packages/core/component/AI/state/chatState.ts
- js/app/packages/core/component/AI/state/createChatController.ts
```diff
  // Sync isGenerating from controller phase
  createEffect(() => {
-   const chatStream = chat.stream();
-   if (!chatStream || chatStream.isDone()) {
-     input.setIsGenerating(false);
-     return;
-   }
-   input.setIsGenerating(true);
-   if (chatStream.data().length > 0) invalidateUserQuota();
+   input.setIsGenerating(chat.isGenerating());
+   if (chat.isGenerating()) invalidateUserQuota();
  });
```
Potential duplicate quota invalidation.

invalidateUserQuota() is called both when chat.isGenerating() becomes true (line 104) and immediately after attachStream (line 164). This could cause redundant invalidation calls. Consider removing one of them.

If the intent is to invalidate when streaming starts (line 104 handles this via isGenerating), the explicit call at line 164 appears redundant.

Proposed fix:

```diff
  chat.attachStream(result.stream);
- invalidateUserQuota();
  });
```

Also applies to: 163-164
```ts
case 'send_failed': {
  if (phase.type !== 'sending') {
    console.warn(`chat transition: send_failed from ${phase.type}`);
    return { phase, effects: [] };
  }
  const effects: SideEffect[] = event.paymentError
    ? [{ type: 'show_paywall' }]
    : [];
  return { phase: { type: 'idle' }, effects };
}
```
Consider removing the optimistic message on send failure.
When send_failed occurs, the optimistic user message added during send_started remains in the messages array. This could leave a "ghost" message visible to the user even though the send failed.
Proposed fix to remove the optimistic message on failure:

```diff
  case 'send_failed': {
    if (phase.type !== 'sending') {
      console.warn(`chat transition: send_failed from ${phase.type}`);
      return { phase, effects: [] };
    }
    const effects: SideEffect[] = event.paymentError
      ? [{ type: 'show_paywall' }]
      : [];
-   return { phase: { type: 'idle' }, effects };
+   return {
+     phase: { type: 'idle' },
+     messages: (prev) => prev.filter((m) => m.id !== phase.optimisticMessageId),
+     effects,
+   };
  }
```
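To illustrate the transition in isolation, here is a minimal, framework-free sketch of the send_failed handling with optimistic-message removal. The `Message`/`Phase` shapes and the `onSendFailed` name are simplified stand-ins for the project's actual reducer types, not its real API:

```ts
// Simplified stand-in types; the real reducer also carries effects.
type Message = { id: string; role: 'user' | 'assistant'; content: string };
type Phase =
  | { type: 'idle' }
  | { type: 'sending'; optimisticMessageId: string };

// Returns the next phase plus the message list with the optimistic
// entry removed, so a failed send leaves no ghost message behind.
function onSendFailed(
  phase: Phase,
  messages: Message[]
): { phase: Phase; messages: Message[] } {
  if (phase.type !== 'sending') return { phase, messages };
  return {
    phase: { type: 'idle' },
    messages: messages.filter((m) => m.id !== phase.optimisticMessageId),
  };
}
```

Because the removal keys on the id recorded at send_started, only the optimistic row is dropped even if an older message has identical content.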
```ts
if (latest.type === 'error') {
  const streamError =
    'stream_error' in latest ? latest.stream_error : undefined;
  dispatch({
    type: 'stream_error',
    streamError: streamError as string | undefined,
  });
  return;
}
```
Error dispatch returns early without transitioning isDone.
When a stream error is detected and stream_error is dispatched, the effect returns early. However, if additional data arrives after the error (before cleanup), the effect will re-run and potentially dispatch duplicate events.
Consider adding a guard to prevent processing after an error has been dispatched.
Proposed guard pattern:

```diff
+ let errorDispatched = false;
+
  function attachStream(newStream: ChatMessageStream) {
    setStream(newStream);
    dispatch({ type: 'stream_connected' });
    createEffect(
      on(
        () => newStream.data(),
        (data) => {
+         if (errorDispatched) return;
          const latest = data.at(-1);
          if (!latest) return;
          if (latest.type === 'error') {
+           errorDispatched = true;
            const streamError =
              'stream_error' in latest ? latest.stream_error : undefined;
            dispatch({
              type: 'stream_error',
              streamError: streamError as string | undefined,
            });
            return;
          }
          // ...
        }
      )
    );
  }
```
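The latch pattern above can be sketched without Solid's reactivity. The following stand-alone processor dispatches at most one stream_error and ignores every item after it; the `StreamItem`/`Action` shapes and `createStreamProcessor` name are illustrative assumptions, not the project's real types:

```ts
// Simplified stream item and action shapes for illustration only.
type StreamItem =
  | { type: 'data'; text: string }
  | { type: 'error'; stream_error?: string };
type Action =
  | { type: 'chunk'; text: string }
  | { type: 'stream_error'; streamError?: string };

function createStreamProcessor(dispatch: (a: Action) => void) {
  let errorDispatched = false;
  return (item: StreamItem): void => {
    if (errorDispatched) return; // latch: nothing is processed past an error
    if (item.type === 'error') {
      errorDispatched = true;
      dispatch({ type: 'stream_error', streamError: item.stream_error });
      return;
    }
    dispatch({ type: 'chunk', text: item.text });
  };
}
```

Keeping the latch in the closure (rather than in reducer state) means the guard holds even if late data arrives before the effect is cleaned up.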
```ts
// Watch stream completion
createEffect(() => {
  if (!newStream.isDone()) return;
  const message = asChatMessage(newStream.data());
  dispatch({ type: 'stream_done', message });
});
```
🧩 Analysis chain

🏁 Script executed:

```bash
# Check how stream errors are handled and whether isDone is set on error
ast-grep --pattern $'function streamIsDone($_, $_) {
$$$
}'
```

Repository: macro-inc/macro
Length of output: 41

🏁 Script executed:

```bash
# First, locate and read the createChatController.ts file
fd -t f "createChatController.ts" --exec wc -l {} \;
```

Repository: macro-inc/macro
Length of output: 128

🏁 Script executed:

```bash
# Read the entire createChatController.ts file to understand the context
fd -t f "createChatController.ts" --exec cat -n {} \;
```

Repository: macro-inc/macro
Length of output: 5516

🏁 Script executed:

```bash
# Search for the newStream type and isDone implementation
rg "isDone|newStream" -A 3 -B 3
```

Repository: macro-inc/macro
Length of output: 24946
Stream controller may hang indefinitely if network error occurs without stream_end.
The completion effect (lines 104–109) only fires when isDone() returns true, which only happens when a stream_end item is received. If the network connection drops or fails without sending stream_end, the stream remains incomplete with isDone() permanently false, leaving the controller stuck in streaming phase. The error handling at lines 82–90 only catches errors that are explicitly added to the data array; a silent connection loss is not covered. Consider adding a timeout, connection failure detection, or explicit cleanup mechanism.
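One way to implement the suggested fallback is a stall watchdog that fires a terminal dispatch when no activity arrives within a window. This is a sketch only; the `createStallWatchdog`, `bump`, and `onStall` names are hypothetical, and the timer functions are injected so the behavior can be tested deterministically:

```ts
type SetTimer = (fn: () => void, ms: number) => number;
type ClearTimer = (id: number) => void;

function createStallWatchdog(
  timeoutMs: number,
  onStall: () => void,
  setTimer: SetTimer = (fn, ms) => setTimeout(fn, ms) as unknown as number,
  clearTimer: ClearTimer = (id) => clearTimeout(id)
) {
  let timer: number | undefined;
  let done = false;
  const arm = () => {
    if (done) return;
    if (timer !== undefined) clearTimer(timer);
    timer = setTimer(() => {
      if (!done) {
        done = true;
        onStall(); // e.g. dispatch({ type: 'stream_error' }) upstream
      }
    }, timeoutMs);
  };
  arm();
  return {
    bump: arm, // call on every received chunk to reset the window
    complete: () => {
      // call when stream_end (or an explicit error) is observed
      done = true;
      if (timer !== undefined) clearTimer(timer);
    },
  };
}
```

Wiring `bump` to each `newStream.data()` change and `complete` to `isDone()` would turn a silent disconnect into a stream_error after `timeoutMs` of inactivity.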
```ts
// Reconnect active streams on page refresh / chat switch
createEffect(() => {
  const activeStreams = getEntityStreams('chat', chatId)();
  const currentStream = untrack(stream);

  for (const s of activeStreams) {
    const sid = s.id()?.stream_id;
    if (!sid) {
      console.warn('reject chat stream: no id');
      continue;
    }
    if (currentStream?.isDone() && currentStream?.id()?.stream_id === sid) {
      console.warn('reject chat stream: duplicate stream');
      continue;
    }

    const isInMessages = untrack(() => messages().some((m) => m.id === sid));
    if (isInMessages) {
      console.warn('reject chat stream: already has message');
      continue;
    }

    attachStream(s);
    break;
  }
});
```
Reconnection logic may attach stale streams.
The reconnection effect filters out streams where currentStream?.isDone() is true, but per the stream implementation, a stream that errored without stream_end will have isDone() === false. This could cause the effect to re-attach an already-errored stream.
Consider also checking if the stream's data contains an error item before attaching.
Proposed additional check:

```diff
  for (const s of activeStreams) {
    const sid = s.id()?.stream_id;
    if (!sid) {
      console.warn('reject chat stream: no id');
      continue;
    }
    if (currentStream?.isDone() && currentStream?.id()?.stream_id === sid) {
      console.warn('reject chat stream: duplicate stream');
      continue;
    }
+   // Skip streams that have already errored
+   const hasError = s.data().some((item) => item.type === 'error');
+   if (hasError) {
+     console.warn('reject chat stream: stream has error');
+     continue;
+   }
    const isInMessages = untrack(() => messages().some((m) => m.id === sid));
```
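The candidate-selection rules above (has an id, not the current stream, no error item, not already materialized as a message) can be expressed as a pure function. The `Candidate` shape and `pickReconnectStream` name below are simplified stand-ins for the real ChatMessageStream API, used only to illustrate the filtering order:

```ts
type Item = { type: string };
type Candidate = { streamId: string | undefined; items: Item[] };

function pickReconnectStream(
  candidates: Candidate[],
  currentStreamId: string | undefined,
  messageIds: Set<string>
): Candidate | undefined {
  for (const c of candidates) {
    if (!c.streamId) continue; // no id
    if (c.streamId === currentStreamId) continue; // already attached
    if (c.items.some((i) => i.type === 'error')) continue; // errored stream
    if (messageIds.has(c.streamId)) continue; // already has a message
    return c; // first viable candidate wins, matching the loop's break
  }
  return undefined;
}
```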
Force-pushed 73eab3e to 19e7d52, then 19e7d52 to d6b888a.
Actionable comments posted: 6
♻️ Duplicate comments (1)

js/app/packages/core/component/AI/state/chatState.ts (1)

72-77: ⚠️ Potential issue | 🟠 Major: Remove the optimistic message on send_failed.

send_started appends a temporary user message, but this branch only flips the phase back to idle. A failed request leaves a ghost message in the transcript, and the next retry no longer looks like a first message.

Proposed fix:

```diff
- .with([{ type: 'sending' }, { type: 'send_failed' }], ([, e]) => ({
+ .with([{ type: 'sending' }, { type: 'send_failed' }], ([p, e]) => ({
    phase: { type: 'idle' as const },
+   messages: (prev: ChatMessageWithAttachments[]) =>
+     prev.filter((m) => m.id !== p.optimisticMessageId),
    effects: e.paymentError
      ? ([{ type: 'show_paywall' }] as SideEffect[])
      : [],
  }))
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@js/app/packages/core/component/AI/state/chatState.ts` around lines 72 - 77, The send_failed branch currently flips the phase to idle but leaves the optimistic temporary user message appended by send_started, so update the handler for the transition from ['sending','send_failed'] (the .with([...], ([, e]) => ...) block) to also remove the temporary/optimistic message from the transcript; specifically, detect and drop the temp message (e.g. messages with an isTemporary flag or the temp message id pattern created in send_started) before returning the new state and effects (keeping the existing paymentError -> show_paywall logic).
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@js/app/packages/block-chat/component/Chat.tsx`:
- Around line 126-138: The onSend handler must abort early if the controller
isn't idle to avoid racing a second submit: before dispatching 'send_started'
and before calling sendChatMessage, check the controller state (the same idle
check used when handling stream_connected) and return immediately if not idle;
update the createCallback for onSend (and the similar block around lines where
sendChatMessage is invoked) to guard against non-idle controllers and do not
proceed with renaming or starting the send when the controller is busy.
In `@js/app/packages/core/component/AI/component/debug/Component.tsx`:
- Around line 235-243: The debug branch is appending messages with id: '' which
causes duplicate DOM ids in ChatMessages and breaks chat controller
reconciliation; change the chat.setMessages call (and any debug message
creation) to assign a real temporary unique id (e.g. generated via a short UUID
or timestamp+random string) when creating the message so message.id is non-empty
and unique, and ensure whatever path later replaces the temporary id updates the
message by matching that temp id; update the code around chat.setMessages and
any code that expects message.id to use that generated id instead of an empty
string.
In `@js/app/packages/core/component/AI/component/message/ChatMessages.tsx`:
- Around line 88-92: The isStream helper spams the console because it logs on
every evaluation; remove the console.log('isStream', Boolean(s)) call from the
isStream function (which uses stream()) so it simply returns false when stream()
is falsy and otherwise returns !s.isDone(), leaving no debug logging in the
render path.
In `@js/app/packages/core/component/AI/context/ChatContext.tsx`:
- Around line 112-119: The ChatProvider currently instantiates controller once
with createChatController(props.chatId, ...) which closes over a stale chatId
and causes createEffect and stream reconnections to target the wrong chat when
switching chats; update the provider to recreate the controller whenever
props.chatId changes by wrapping the creation in a reactive memo (e.g., use
createMemo around the createChatController call) or otherwise remounting the
provider for a new chatId so ChatCtx.Provider receives a fresh controller;
specifically change the instantiation of controller used as the value for
<ChatCtx.Provider> so it depends on props.chatId (and props.messages if you want
initial seed behavior) rather than being a one-time constant.
In `@js/app/packages/core/component/AI/state/chatState.ts`:
- Around line 79-85: The stream_connected handler currently drops
optimisticMessageId (when transitioning to phase 'streaming'), which forces
deduplication to rely on message content (prev.find(...)) and can match the
wrong past message; keep the optimisticMessageId in the state when handling the
transition to stream_connected (and the parallel handlers at the other block
around the 92–109 region) so the reconcile path can match incoming real
messageId to optimisticMessageId instead of only content, and update the
reconciliation logic to prefer matching by optimisticMessageId before falling
back to content de-duplication.
In `@js/app/packages/core/component/AI/state/createChatController.ts`:
- Around line 120-123: The duplicate-stream guard currently only runs when
currentStream?.isDone(), which allows attachStream(s) to reattach observers to
an active stream; change the condition to reject duplicates regardless of
completion by checking only the stream identity: replace the if that uses
currentStream?.isDone() && currentStream?.id()?.stream_id === sid with a simple
identity check currentStream?.id()?.stream_id === sid, log the duplicate and
continue, so attachStream(s) won't install a second observer on the same stream.
ℹ️ Review info
⚙️ Run configuration
Configuration used: Organization UI
Review profile: ASSERTIVE
Plan: Pro
Run ID: a7301c9f-a7e1-4a72-8ef2-3993413ea3e8
📒 Files selected for processing (9)

- js/app/packages/app/component/SoupChatInput.tsx
- js/app/packages/block-chat/component/Chat.tsx
- js/app/packages/core/component/AI/component/debug/Component.tsx
- js/app/packages/core/component/AI/component/input/ChatInput.tsx
- js/app/packages/core/component/AI/component/message/ChatMessages.tsx
- js/app/packages/core/component/AI/context/ChatContext.tsx
- js/app/packages/core/component/AI/context/index.ts
- js/app/packages/core/component/AI/state/chatState.ts
- js/app/packages/core/component/AI/state/createChatController.ts
```diff
  const onSend = createCallback(async (request: ChatSendInput) => {
    const isFirstMessage = chat.messages().length === 0;
+   const optimisticId = crypto.randomUUID();

-   chat.addMessage({
-     id: crypto.randomUUID(),
-     content: request.content,
-     role: 'user',
-     attachments: request.attachments ?? [],
+   chat.dispatch({
+     type: 'send_started',
+     optimisticMessage: {
+       id: optimisticId,
+       content: request.content,
+       role: 'user',
+       attachments: request.attachments ?? [],
+     },
    });
```
Abort onSend when the controller is not idle.
dispatch({ type: 'send_started' }) can reject the transition, but this function still renames the chat and calls sendChatMessage(...). Because the input is only marked generating after stream_connected, a fast second submit can slip through here and start a concurrent request against a non-idle controller.
Proposed fix:

```diff
  const onSend = createCallback(async (request: ChatSendInput) => {
+   if (chat.phase().type !== 'idle') return;
+
    const isFirstMessage = chat.messages().length === 0;
    const optimisticId = crypto.randomUUID();
    chat.dispatch({
```

Also applies to: 150-160
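The early-return guard above is a single-flight gate. A minimal, framework-free version is sketched below; `createSendGate`, `tryAcquire`, and `release` are hypothetical names, not the controller's API:

```ts
// A synchronous latch: only one send may be in flight at a time.
function createSendGate() {
  let busy = false;
  return {
    // Returns true and marks the gate busy, or false if already busy.
    tryAcquire(): boolean {
      if (busy) return false;
      busy = true;
      return true;
    },
    // Call on stream_done / stream_error to allow the next send.
    release(): void {
      busy = false;
    },
  };
}
```

In onSend, an early `if (!gate.tryAcquire()) return;` plus a release on the terminal events would give the same protection as checking `chat.phase().type !== 'idle'`, and it closes the window before the phase has transitioned.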
```ts
chat.setMessages((p) => [
  ...p,
  {
    attachments: input.attachments,
    content: input.content,
    role: 'user',
    id: '',
  },
]);
```
Use a real temporary message id here.
ChatMessages builds DOM ids from message.id, and the chat controller uses ids for reconciliation/dedupe. Appending every debug user message with id: '' produces duplicate id="chat-" nodes after the second send and makes later matching ambiguous.
Proposed fix:

```diff
  {
    attachments: input.attachments,
    content: input.content,
    role: 'user',
-   id: '',
+   id: crypto.randomUUID(),
  },
```
```ts
.with(
  [{ type: P.union('idle', 'sending') }, { type: 'stream_connected' }],
  () => ({
    phase: { type: 'streaming' as const },
    effects: [],
  })
)
```
Preserve the optimistic id until the echoed user message arrives.
stream_connected discards optimisticMessageId, so this handler can only dedupe by content. If the user sends the same text twice, prev.find(...) can match the older message and leave the optimistic row unreconciled with its real messageId.
Proposed fix
export type ChatPhase =
| { type: 'idle' }
| { type: 'sending'; optimisticMessageId: string }
- | { type: 'streaming' };
+ | { type: 'streaming'; optimisticMessageId?: string };
- .with(
- [{ type: P.union('idle', 'sending') }, { type: 'stream_connected' }],
- () => ({
- phase: { type: 'streaming' as const },
- effects: [],
- })
- )
+ .with([{ type: 'idle' }, { type: 'stream_connected' }], () => ({
+ phase: { type: 'streaming' as const },
+ effects: [],
+ }))
+ .with([{ type: 'sending' }, { type: 'stream_connected' }], ([p]) => ({
+ phase: {
+ type: 'streaming' as const,
+ optimisticMessageId: p.optimisticMessageId,
+ },
+ effects: [],
+ }))
.with(
[
{ type: P.union('sending', 'streaming') },
{ type: 'stream_user_message' },
],
([p, e]) => {
- const optimisticId =
- p.type === 'sending' ? p.optimisticMessageId : undefined;
return {
phase: { type: 'streaming' as const },
messages: (prev: ChatMessageWithAttachments[]) => {
- if (prev.find((m) => m.role === 'user' && m.content === e.content))
+ if ('optimisticMessageId' in p && p.optimisticMessageId) {
+ return prev.map((m) =>
+ m.id === p.optimisticMessageId
+ ? {
+ ...m,
+ id: e.messageId,
+ content: e.content,
+ attachments: e.attachments,
+ }
+ : m
+ );
+ }
+ if (prev.some((m) => m.id === e.messageId))
return prev;
return [
              ...prev,
Also applies to: 92-109
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@js/app/packages/core/component/AI/state/chatState.ts` around lines 79 - 85,
The stream_connected handler currently drops optimisticMessageId (when
transitioning to phase 'streaming'), which forces deduplication to rely on
message content (prev.find(...)) and can match the wrong past message; keep the
optimisticMessageId in the state when handling the transition to
stream_connected (and the parallel handlers at the other block around the 92–109
region) so the reconcile path can match incoming real messageId to
optimisticMessageId instead of only content, and update the reconciliation logic
to prefer matching by optimisticMessageId before falling back to content
de-duplication.
```ts
      if (currentStream?.isDone() && currentStream?.id()?.stream_id === sid) {
        console.warn('reject chat stream: duplicate stream');
        continue;
      }
```
Reject the current stream regardless of isDone().
This guard only skips duplicates after the current stream has finished. If the reconnection effect reruns while the same stream is still active, attachStream(s) runs again and installs a second observer pair on the same stream.
Proposed fix
```diff
-      if (currentStream?.isDone() && currentStream?.id()?.stream_id === sid) {
+      if (currentStream?.id()?.stream_id === sid) {
         console.warn('reject chat stream: duplicate stream');
         continue;
       }
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@js/app/packages/core/component/AI/state/createChatController.ts` around lines
120 - 123, The duplicate-stream guard currently only runs when
currentStream?.isDone(), which allows attachStream(s) to reattach observers to
an active stream; change the condition to reject duplicates regardless of
completion by checking only the stream identity: replace the if that uses
currentStream?.isDone() && currentStream?.id()?.stream_id === sid with a simple
identity check currentStream?.id()?.stream_id === sid, log the duplicate and
continue, so attachStream(s) won't install a second observer on the same stream.
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (1)
js/app/packages/core/component/AI/component/message/ChatMessages.tsx (1)
1-10: ⚠️ Potential issue | 🟡 Minor
Remove the stale `toast` import.
Stream error handling moved into the controller, so this import is now unused and Biome is failing the file.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@js/app/packages/core/component/AI/component/message/ChatMessages.tsx` around lines 1 - 10, Remove the stale unused import of toast from ChatMessages.tsx: delete the "import { toast } from '@core/component/Toast/Toast';" entry and ensure no remaining references to the toast symbol exist in this module (e.g., within functions or JSX in ChatMessages, asChatMessage, useChatContext usages). After removing the import, tidy up the surrounding import block (commas/line breaks) so the file compiles cleanly.
♻️ Duplicate comments (7)
js/app/packages/core/component/AI/component/debug/Component.tsx (1)
235-243: ⚠️ Potential issue | 🟡 Minor
Use a non-empty temporary id for debug sends.
`ChatMessages` builds DOM ids from `message.id`, and controller reconciliation also keys off ids. Reusing `id: ''` creates duplicate `chat-` nodes after the second send and makes later matching ambiguous.
Suggested patch
```diff
   {
     attachments: input.attachments,
     content: input.content,
     role: 'user',
-    id: '',
+    id: crypto.randomUUID(),
   },
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@js/app/packages/core/component/AI/component/debug/Component.tsx` around lines 235 - 243, The debug send is using an empty id which causes duplicate DOM ids and reconciliation issues in ChatMessages; change the object passed to chat.setMessages (the block that currently sets id: '') to assign a non-empty, unique temporary id (e.g. use crypto.randomUUID() or a timestamp/prefix like "debug-<ts>") so each appended message has a stable, unique message.id for DOM keys and controller matching; update the creation site in the chat.setMessages callback and ensure any downstream code that expects message.id accepts the temporary format.
js/app/packages/core/component/AI/component/message/ChatMessages.tsx (1)
89-93: ⚠️ Potential issue | 🟡 Minor
Drop the render-path console log.
`isStream()` is evaluated repeatedly during normal rendering, so this will spam the console on every chat update.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@js/app/packages/core/component/AI/component/message/ChatMessages.tsx` around lines 89 - 93, The isStream() helper is logging on every render and causing spam; remove the debug console.log inside isStream (the line logging 'isStream' and Boolean(s)) so the function simply returns false when stream() is falsy and otherwise returns !s.isDone(); update the isStream implementation (which calls stream() and uses s.isDone()) to have no side-effect logging.
js/app/packages/core/component/AI/state/chatState.ts (2)
79-85: ⚠️ Potential issue | 🟠 Major
Preserve `optimisticMessageId` into `streaming` and reconcile by id.
`stream_connected` drops the temp id, so `stream_user_message` can only dedupe on `m.content === e.content`. If the user sends the same text twice, this can match an older message and leave the optimistic row unreconciled with its real `messageId` and attachments. Carry the optimistic id into `streaming`, patch that specific message when the echo arrives, and only fall back to `messageId` dedupe afterward. This also resolves the current unused `optimisticId` warning.
Also applies to: 92-109
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@js/app/packages/core/component/AI/state/chatState.ts` around lines 79 - 85, The transition that creates the streaming state currently drops the optimistic message id, so update the .with([... { type: 'stream_connected' } ...]) handler that returns phase: { type: 'streaming' } to preserve the optimisticMessageId (or optimisticId) into the streaming state's payload; then update the handler for the stream_user_message echo to first reconcile by that optimistic id (patch that specific optimistic message row with the real messageId and attachments) and only if no optimistic id match is found fall back to deduping by messageId/content. Apply the same change to the other streaming-creation branches (the block covering the 92-109 case) to remove the unused optimisticId warning and ensure correct reconciliation by id.
72-77: ⚠️ Potential issue | 🟠 Major
Remove the optimistic row on `send_failed`.
This transition goes back to `idle` without patching `messages`, so a failed send leaves an unsent user message in the transcript. Filter `p.optimisticMessageId` out before returning.
Suggested patch
```diff
-    .with([{ type: 'sending' }, { type: 'send_failed' }], ([, e]) => ({
+    .with([{ type: 'sending' }, { type: 'send_failed' }], ([p, e]) => ({
       phase: { type: 'idle' as const },
+      messages: (prev: ChatMessageWithAttachments[]) =>
+        prev.filter((m) => m.id !== p.optimisticMessageId),
       effects: e.paymentError
         ? ([{ type: 'show_paywall' }] as SideEffect[])
         : [],
     }))
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@js/app/packages/core/component/AI/state/chatState.ts` around lines 72 - 77, The transition handler for the combined ['sending','send_failed'] branch returns to idle without removing the optimistic user message, leaving an unsent message in the transcript; update the returned state inside the .with([...], ([, e]) => ({ ... })) callback to filter out the optimistic row using p.optimisticMessageId (e.g. messages: p.messages.filter(m => m.id !== p.optimisticMessageId)) and clear/reset optimisticMessageId so the stale optimistic message is removed when send_failed occurs; keep the existing paymentError logic for effects.
js/app/packages/block-chat/component/Chat.tsx (2)
101-105: ⚠️ Potential issue | 🟡 Minor
Quota invalidation is firing multiple times per send.
The phase-sync effect already invalidates once `stream_connected` moves the controller into `streaming`, and `stream_user_message` will retrigger it because `phase` is replaced with another `{ type: 'streaming' }` object. The explicit call after `attachStream()` adds another duplicate invalidation on top.
Also applies to: 163-164
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@js/app/packages/block-chat/component/Chat.tsx` around lines 101 - 105, The createEffect that syncs isGenerating (calling input.setIsGenerating(chat.isGenerating())) is triggering invalidateUserQuota redundantly because other code paths (e.g., stream_connected and stream_user_message via attachStream) already cause quota invalidation; remove the explicit invalidateUserQuota() call from this createEffect (or add a guard so it only runs when the phase transitions to streaming the first time) so that invalidateUserQuota is only invoked by the streaming lifecycle handlers (e.g., attachStream / stream_connected / stream_user_message) and not from the general isGenerating sync.
126-153: ⚠️ Potential issue | 🟠 Major
Abort `onSend` unless the controller is idle.
`dispatch({ type: 'send_started' })` can reject the transition, but this function still renames the chat and calls `sendChatMessage()`. Because the input only flips generating after `stream_connected`, a fast second submit can still start a concurrent request here.
Suggested patch
```diff
   const onSend = createCallback(async (request: ChatSendInput) => {
+    if (chat.phase().type !== 'idle') return;
+
     const isFirstMessage = chat.messages().length === 0;
     const optimisticId = crypto.randomUUID();
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@js/app/packages/block-chat/component/Chat.tsx` around lines 126 - 153, Before doing optimistic dispatch, renaming, or calling sendChatMessage in onSend, check the chat controller is idle and abort otherwise: read the controller state (e.g., via chat.controller() or chat.controller?.state/isIdle) and return early if it is not 'idle'; also wrap the chat.dispatch({ type: 'send_started', ... }) in a try/catch (or check its return) and if the transition is rejected, stop further work (do not call renameMutation.mutate or sendChatMessage). This ensures onSend only proceeds when the controller is idle and prevents concurrent sends.
js/app/packages/core/component/AI/state/createChatController.ts (1)
120-123: ⚠️ Potential issue | 🟠 Major
Reject the current stream by id, even while it is still active.
This guard only skips duplicates after `isDone()`. If the reconnection effect reruns while the same stream is still live, `attachStream(s)` installs a second observer pair on it.
Suggested patch
```diff
-      if (currentStream?.isDone() && currentStream?.id()?.stream_id === sid) {
+      if (currentStream?.id()?.stream_id === sid) {
         console.warn('reject chat stream: duplicate stream');
         continue;
       }
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@js/app/packages/core/component/AI/state/createChatController.ts` around lines 120 - 123, The duplicate-stream guard only runs when currentStream.isDone(), allowing attachStream(s) to add a second observer while a stream is still live; change the condition to detect matching stream id regardless of isDone so duplicates are rejected immediately. In createChatController.ts update the if that references currentStream, isDone(), currentStream.id()?.stream_id and sid so it checks currentStream?.id()?.stream_id === sid first (or remove the isDone() requirement), log the duplicate and continue, preventing attachStream(s) from installing a second observer on the same stream.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Outside diff comments:
In `@js/app/packages/core/component/AI/component/message/ChatMessages.tsx`:
- Around line 1-10: Remove the stale unused import of toast from
ChatMessages.tsx: delete the "import { toast } from
'@core/component/Toast/Toast';" entry and ensure no remaining references to the
toast symbol exist in this module (e.g., within functions or JSX in
ChatMessages, asChatMessage, useChatContext usages). After removing the import,
tidy up the surrounding import block (commas/line breaks) so the file compiles
cleanly.
---
Duplicate comments:
In `@js/app/packages/block-chat/component/Chat.tsx`:
- Around line 101-105: The createEffect that syncs isGenerating (calling
input.setIsGenerating(chat.isGenerating())) is triggering invalidateUserQuota
redundantly because other code paths (e.g., stream_connected and
stream_user_message via attachStream) already cause quota invalidation; remove
the explicit invalidateUserQuota() call from this createEffect (or add a guard
so it only runs when the phase transitions to streaming the first time) so that
invalidateUserQuota is only invoked by the streaming lifecycle handlers (e.g.,
attachStream / stream_connected / stream_user_message) and not from the general
isGenerating sync.
- Around line 126-153: Before doing optimistic dispatch, renaming, or calling
sendChatMessage in onSend, check the chat controller is idle and abort
otherwise: read the controller state (e.g., via chat.controller() or
chat.controller?.state/isIdle) and return early if it is not 'idle'; also wrap
the chat.dispatch({ type: 'send_started', ... }) in a try/catch (or check its
return) and if the transition is rejected, stop further work (do not call
renameMutation.mutate or sendChatMessage). This ensures onSend only proceeds
when the controller is idle and prevents concurrent sends.
In `@js/app/packages/core/component/AI/component/debug/Component.tsx`:
- Around line 235-243: The debug send is using an empty id which causes
duplicate DOM ids and reconciliation issues in ChatMessages; change the object
passed to chat.setMessages (the block that currently sets id: '') to assign a
non-empty, unique temporary id (e.g. use crypto.randomUUID() or a
timestamp/prefix like "debug-<ts>") so each appended message has a stable,
unique message.id for DOM keys and controller matching; update the creation site
in the chat.setMessages callback and ensure any downstream code that expects
message.id accepts the temporary format.
In `@js/app/packages/core/component/AI/component/message/ChatMessages.tsx`:
- Around line 89-93: The isStream() helper is logging on every render and
causing spam; remove the debug console.log inside isStream (the line logging
'isStream' and Boolean(s)) so the function simply returns false when stream() is
falsy and otherwise returns !s.isDone(); update the isStream implementation
(which calls stream() and uses s.isDone()) to have no side-effect logging.
In `@js/app/packages/core/component/AI/state/chatState.ts`:
- Around line 79-85: The transition that creates the streaming state currently
drops the optimistic message id, so update the .with([... { type:
'stream_connected' } ...]) handler that returns phase: { type: 'streaming' } to
preserve the optimisticMessageId (or optimisticId) into the streaming state's
payload; then update the handler for the stream_user_message echo to first
reconcile by that optimistic id (patch that specific optimistic message row with
the real messageId and attachments) and only if no optimistic id match is found
fall back to deduping by messageId/content. Apply the same change to the other
streaming-creation branches (the block covering the 92-109 case) to remove the
unused optimisticId warning and ensure correct reconciliation by id.
- Around line 72-77: The transition handler for the combined
['sending','send_failed'] branch returns to idle without removing the optimistic
user message, leaving an unsent message in the transcript; update the returned
state inside the .with([...], ([, e]) => ({ ... })) callback to filter out the
optimistic row using p.optimisticMessageId (e.g. messages: p.messages.filter(m
=> m.id !== p.optimisticMessageId)) and clear/reset optimisticMessageId so the
stale optimistic message is removed when send_failed occurs; keep the existing
paymentError logic for effects.
In `@js/app/packages/core/component/AI/state/createChatController.ts`:
- Around line 120-123: The duplicate-stream guard only runs when
currentStream.isDone(), allowing attachStream(s) to add a second observer while
a stream is still live; change the condition to detect matching stream id
regardless of isDone so duplicates are rejected immediately. In
createChatController.ts update the if that references currentStream, isDone(),
currentStream.id()?.stream_id and sid so it checks
currentStream?.id()?.stream_id === sid first (or remove the isDone()
requirement), log the duplicate and continue, preventing attachStream(s) from
installing a second observer on the same stream.
ℹ️ Review info
⚙️ Run configuration
Configuration used: Path: .coderabbit.yaml
Review profile: ASSERTIVE
Plan: Pro
Run ID: 903ea1f2-df8e-4659-849c-2dfb55c6144a
📒 Files selected for processing (9)
js/app/packages/app/component/SoupChatInput.tsx
js/app/packages/block-chat/component/Chat.tsx
js/app/packages/core/component/AI/component/debug/Component.tsx
js/app/packages/core/component/AI/component/input/ChatInput.tsx
js/app/packages/core/component/AI/component/message/ChatMessages.tsx
js/app/packages/core/component/AI/context/ChatContext.tsx
js/app/packages/core/component/AI/context/index.ts
js/app/packages/core/component/AI/state/chatState.ts
js/app/packages/core/component/AI/state/createChatController.ts
force-pushed the tspattern branch from d6b888a to 67d6203
♻️ Duplicate comments (8)
js/app/packages/core/component/AI/state/chatState.ts (2)
79-85: ⚠️ Potential issue | 🟠 Major
`optimisticMessageId` is discarded on `stream_connected`, breaking reconciliation.
When transitioning from `sending` to `streaming`, the `optimisticMessageId` is lost. This forces `stream_user_message` (lines 96-97) to rely on content-based deduplication (`m.role === 'user' && m.content === e.content`), which can match the wrong message if the user sends identical text twice.
Preserve `optimisticMessageId` through the streaming phase to enable id-based reconciliation.
Proposed fix
```diff
 export type ChatPhase =
   | { type: 'idle' }
   | { type: 'sending'; optimisticMessageId: string }
-  | { type: 'streaming' };
+  | { type: 'streaming'; optimisticMessageId?: string };

     .with(
       [{ type: P.union('idle', 'sending') }, { type: 'stream_connected' }],
-      () => ({
-        phase: { type: 'streaming' as const },
+      ([p]) => ({
+        phase: {
+          type: 'streaming' as const,
+          optimisticMessageId:
+            p.type === 'sending' ? p.optimisticMessageId : undefined,
+        },
         effects: [],
       })
     )
```
Then update `stream_user_message` to reconcile by id:
```diff
     .with(
       [
         { type: P.union('sending', 'streaming') },
         { type: 'stream_user_message' },
       ],
-      ([, e]) => {
+      ([p, e]) => {
+        const optimisticId =
+          'optimisticMessageId' in p ? p.optimisticMessageId : undefined;
         return {
           phase: { type: 'streaming' as const },
           messages: (prev: ChatMessageWithAttachments[]) => {
-            if (prev.find((m) => m.role === 'user' && m.content === e.content))
-              return prev;
+            if (optimisticId) {
+              return prev.map((m) =>
+                m.id === optimisticId
+                  ? { ...m, id: e.messageId, content: e.content, attachments: e.attachments }
+                  : m
+              );
+            }
+            if (prev.some((m) => m.id === e.messageId)) return prev;
             return [
               ...prev,
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@js/app/packages/core/component/AI/state/chatState.ts` around lines 79 - 85, The transition handler for the "stream_connected" event currently replaces the state and drops optimisticMessageId, causing reconciliation to fall back to content-based matching; update the .with branch that matches [{ type: P.union('idle', 'sending') }, { type: 'stream_connected' }] so it returns phase: { type: 'streaming' } but preserves optimisticMessageId (carry over the existing optimisticMessageId value from the previous state) and keep effects array as before; then ensure stream_user_message continues to use optimisticMessageId for reconciliation (reconcile by id instead of matching m.role/m.content) so in-flight user messages are matched by id.
72-77: ⚠️ Potential issue | 🟠 Major
Optimistic message persists after send failure.
When `send_failed` occurs, the optimistic user message added during `send_started` remains in the messages array. This leaves a "ghost" message visible to the user even though the send failed. The transition should remove the optimistic message.
Proposed fix to remove optimistic message on failure
```diff
-    .with([{ type: 'sending' }, { type: 'send_failed' }], ([, e]) => ({
-      phase: { type: 'idle' as const },
-      effects: e.paymentError
-        ? ([{ type: 'show_paywall' }] as SideEffect[])
-        : [],
-    }))
+    .with([{ type: 'sending' }, { type: 'send_failed' }], ([p, e]) => ({
+      phase: { type: 'idle' as const },
+      messages: (prev: ChatMessageWithAttachments[]) =>
+        prev.filter((m) => m.id !== p.optimisticMessageId),
+      effects: e.paymentError
+        ? ([{ type: 'show_paywall' }] as SideEffect[])
+        : [],
+    }))
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@js/app/packages/core/component/AI/state/chatState.ts` around lines 72 - 77, The send_failed transition currently leaves the optimistic user message in the messages array; update the .with([...], ([, e]) => ({ ... })) handler so that on send_failed you remove the optimistic message from state.messages (either by filtering out messages with an optimistic flag, e.g., messages = messages.filter(m => !m.optimistic), or by using the event’s temporary id if available, e.g., messages = messages.filter(m => m.id !== e.optimisticMessageId)), then return the new messages array along with the existing phase and effects; target the send_failed branch inside the .with handler to perform this removal.
js/app/packages/block-chat/component/Chat.tsx (2)
101-105: ⚠️ Potential issue | 🟡 Minor
Duplicate quota invalidation calls.
`invalidateUserQuota()` is called both when `chat.isGenerating()` becomes true (line 104) and immediately after `attachStream` (line 164). This causes redundant invalidation. Since the effect at line 104 handles invalidation when streaming starts, the explicit call at line 164 appears redundant.
Proposed fix
```diff
     chat.attachStream(result.stream);
-    invalidateUserQuota();
   });
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@js/app/packages/block-chat/component/Chat.tsx` around lines 101 - 105, The effect syncing isGenerating already calls invalidateUserQuota() when chat.isGenerating() becomes true, so remove the redundant explicit call to invalidateUserQuota() after attachStream; locate the attachStream logic (the function/method where attachStream is invoked) and delete the extra invalidateUserQuota() invocation, leaving the createEffect that uses input.setIsGenerating and chat.isGenerating() as the single place that triggers quota invalidation.
126-138: ⚠️ Potential issue | 🟠 Major
Guard `onSend` against non-idle phase.
`dispatch({ type: 'send_started' })` will reject the transition if not idle, but the function continues to rename the chat and call `sendChatMessage(...)`. A fast double-submit can slip through and start a concurrent request.
Proposed fix
```diff
   const onSend = createCallback(async (request: ChatSendInput) => {
+    if (chat.phase().type !== 'idle') return;
+
     const isFirstMessage = chat.messages().length === 0;
     const optimisticId = crypto.randomUUID();

     chat.dispatch({
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@js/app/packages/block-chat/component/Chat.tsx` around lines 126 - 138, The onSend handler can start a send when the chat is not idle, allowing a fast double-submit to trigger concurrent requests; before proceeding (before generating optimisticId, renaming the chat or calling sendChatMessage), guard the handler by checking the chat's current phase or the result of chat.dispatch('send_started') and return early if the chat is not in the idle state or if dispatch rejects; specifically, in the onSend function add a pre-check against chat.phase() (or handle the thrown error from chat.dispatch({ type: 'send_started', ... })) and stop further work (no rename, no call to sendChatMessage) when the phase is not 'idle' or dispatch failed.
js/app/packages/core/component/AI/state/createChatController.ts (2)
72-99: 🧹 Nitpick | 🔵 Trivial
Consider adding a guard to prevent processing after error dispatch.
Once `stream_error` is dispatched (line 82-86), subsequent data arrivals before cleanup could cause duplicate event dispatches. A local flag or checking `phase` state could prevent this.
Proposed guard pattern
```diff
   function attachStream(newStream: ChatMessageStream) {
     setStream(newStream);
     dispatch({ type: 'stream_connected' });
+    let errorDispatched = false;
     createEffect(
       on(
         () => newStream.data(),
         (data) => {
+          if (errorDispatched) return;
           const latest = data.at(-1);
           if (!latest) return;
           if (latest.type === 'error') {
+            errorDispatched = true;
             const streamError =
               'stream_error' in latest ? latest.stream_error : undefined;
             dispatch({
               type: 'stream_error',
               streamError: streamError as string | undefined,
             });
             return;
           }
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@js/app/packages/core/component/AI/state/createChatController.ts` around lines 72 - 99, Add a guard inside the createEffect callback that stops processing further stream events after an error has been dispatched: detect the 'stream_error' branch in the on(() => newStream.data(), ...) handler and set a local flag (e.g., errored = true) or check the controller's phase state before handling new items; then short-circuit early for any subsequent invocations so dispatch({... type: 'stream_user_message' ...}) (and other processing) is skipped after stream_error has been emitted. Ensure the flag is scoped to the createEffect closure or use the existing phase state check before handling latest.type to avoid duplicate dispatches.
120-123: ⚠️ Potential issue | 🟠 Major
Duplicate stream rejection should not depend on `isDone()`.
The guard only skips duplicates after the current stream has finished. If the reconnection effect reruns while the same stream is still active, `attachStream(s)` runs again and installs duplicate observers.
Proposed fix
```diff
-      if (currentStream?.isDone() && currentStream?.id()?.stream_id === sid) {
+      if (currentStream?.id()?.stream_id === sid) {
         console.warn('reject chat stream: duplicate stream');
         continue;
       }
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@js/app/packages/core/component/AI/state/createChatController.ts` around lines 120 - 123, The duplicate-stream guard in createChatController.ts should not rely on currentStream?.isDone(); change the check inside the loop where attachStream(s) is called so it rejects any stream whose id matches the active stream regardless of isDone. Locate the conditional using currentStream and sid (the lines with currentStream?.isDone() && currentStream?.id()?.stream_id === sid) and remove the isDone() part so it simply compares currentStream?.id()?.stream_id === sid, keeping the existing warn and continue behavior to avoid installing duplicate observers when the same stream is still active.
js/app/packages/core/component/AI/component/debug/Component.tsx (1)
235-243: ⚠️ Potential issue | 🟡 Minor
Use a unique temporary message id.
`ChatMessages` builds DOM ids from `message.id`, and the chat controller uses ids for reconciliation/deduplication. Using `id: ''` produces duplicate `id="chat-"` nodes and makes matching ambiguous.
Proposed fix
```diff
 chat.setMessages((p) => [
   ...p,
   {
     attachments: input.attachments,
     content: input.content,
     role: 'user',
-    id: '',
+    id: crypto.randomUUID(),
   },
 ]);
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@js/app/packages/core/component/AI/component/debug/Component.tsx` around lines 235 - 243, chat.setMessages is appending a new user message with id: '' which causes ChatMessages DOM ids and the chat controller reconciliation to collide; update the code that adds the temp message (the call to chat.setMessages(...) in Component.tsx) to assign a unique temporary id (e.g. via crypto.randomUUID() or a Timestamp+random suffix) to message.id before pushing so each new message has a distinct id for ChatMessages and the controller to deduplicate/reconcile correctly.
js/app/packages/core/component/AI/component/message/ChatMessages.tsx (1)
88-93: ⚠️ Potential issue | 🟡 Minor
Remove debug console.log from render path.
The `console.log('isStream', Boolean(s))` in `isStream()` will spam the console during normal chat usage since this accessor is evaluated multiple times per update.
Proposed fix
```diff
 const isStream = () => {
   const s = stream();
-  console.log('isStream', Boolean(s));
   if (!s) return false;
   return !s.isDone();
 };
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@js/app/packages/core/component/AI/component/message/ChatMessages.tsx` around lines 88 - 93, The isStream accessor in ChatMessages.tsx contains a debug console.log that spams the console; remove the console.log('isStream', Boolean(s)) call from the isStream function so the render path no longer logs each evaluation, leaving the rest of the logic (calling stream() and returning !s.isDone()) intact.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Duplicate comments:
In `@js/app/packages/block-chat/component/Chat.tsx`:
- Around line 101-105: The effect syncing isGenerating already calls
invalidateUserQuota() when chat.isGenerating() becomes true, so remove the
redundant explicit call to invalidateUserQuota() after attachStream; locate the
attachStream logic (the function/method where attachStream is invoked) and
delete the extra invalidateUserQuota() invocation, leaving the createEffect that
uses input.setIsGenerating and chat.isGenerating() as the single place that
triggers quota invalidation.
- Around line 126-138: The onSend handler can start a send when the chat is not
idle, allowing a fast double-submit to trigger concurrent requests; before
proceeding (before generating optimisticId, renaming the chat or calling
sendChatMessage), guard the handler by checking the chat's current phase or the
result of chat.dispatch('send_started') and return early if the chat is not in
the idle state or if dispatch rejects; specifically, in the onSend function add
a pre-check against chat.phase() (or handle the thrown error from
chat.dispatch({ type: 'send_started', ... })) and stop further work (no rename,
no call to sendChatMessage) when the phase is not 'idle' or dispatch failed.
In `@js/app/packages/core/component/AI/component/debug/Component.tsx`:
- Around line 235-243: chat.setMessages is appending a new user message with id:
'' which causes ChatMessages DOM ids and the chat controller reconciliation to
collide; update the code that adds the temp message (the call to
chat.setMessages(...) in Component.tsx) to assign a unique temporary id (e.g.
via crypto.randomUUID() or a Timestamp+random suffix) to message.id before
pushing so each new message has a distinct id for ChatMessages and the
controller to deduplicate/reconcile correctly.
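A sketch of the unique-temp-id fix, assuming the helper name; in browser code `globalThis.crypto.randomUUID()` would be used directly — `node:crypto` is imported here only to keep the sketch self-contained:

```typescript
import { randomUUID } from 'node:crypto';

// Hypothetical helper: gives each optimistic user message a distinct id
// before it is pushed via chat.setMessages, so ChatMessages DOM ids and
// the controller's reconciliation never collide on the empty string.
function tempMessageId(): string {
  return `temp-${randomUUID()}`;
  // Alternative without crypto, per the review's suggestion:
  // return `temp-${Date.now()}-${Math.random().toString(36).slice(2, 10)}`;
}
```

A recognizable `temp-` prefix also makes it easy for the reducer to strip or replace optimistic ids once the server assigns real ones.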
In `@js/app/packages/core/component/AI/component/message/ChatMessages.tsx`:
- Around line 88-93: The isStream accessor in ChatMessages.tsx contains a debug
console.log that spams the console; remove the console.log('isStream',
Boolean(s)) call from the isStream function so the render path no longer logs
each evaluation, leaving the rest of the logic (calling stream() and returning
!s.isDone()) intact.
In `@js/app/packages/core/component/AI/state/chatState.ts`:
- Around line 79-85: The transition handler for the "stream_connected" event
currently replaces the state and drops optimisticMessageId, causing
reconciliation to fall back to content-based matching; update the .with branch
that matches [{ type: P.union('idle', 'sending') }, { type: 'stream_connected'
}] so it returns phase: { type: 'streaming' } but preserves optimisticMessageId
(carry over the existing optimisticMessageId value from the previous state) and
keep effects array as before; then ensure stream_user_message continues to use
optimisticMessageId for reconciliation (reconcile by id instead of matching
m.role/m.content) so in-flight user messages are matched by id.
- Around line 72-77: The send_failed transition currently leaves the optimistic
user message in the messages array; update the .with([...], ([, e]) => ({ ...
})) handler so that on send_failed you remove the optimistic message from
state.messages (either by filtering out messages with an optimistic flag, e.g.,
messages = messages.filter(m => !m.optimistic), or by using the event’s
temporary id if available, e.g., messages = messages.filter(m => m.id !==
e.optimisticMessageId)), then return the new messages array along with the
existing phase and effects; target the send_failed branch inside the .with
handler to perform this removal.
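The two chatState.ts findings above amount to two transition rules: `send_failed` must drop the optimistic user message, and `stream_connected` must carry `optimisticMessageId` forward so later reconciliation can match by id. A framework-free sketch — a plain `switch` stands in for the ts-pattern `match` the file actually uses, and all type shapes are assumptions:

```typescript
type Message = { id: string; role: 'user' | 'assistant'; content: string };

type ChatState = {
  phase: { type: 'idle' | 'sending' | 'streaming' };
  messages: Message[];
  optimisticMessageId?: string;
};

type ChatEvent = { type: 'send_failed' } | { type: 'stream_connected' };

// Sketch of the two corrected transitions; the other events are omitted.
function transition(state: ChatState, event: ChatEvent): ChatState {
  switch (event.type) {
    case 'send_failed':
      // Roll back: remove the in-flight optimistic user message by id
      // instead of leaving it stranded in the messages array.
      return {
        phase: { type: 'idle' },
        messages: state.messages.filter(
          (m) => m.id !== state.optimisticMessageId,
        ),
        optimisticMessageId: undefined,
      };
    case 'stream_connected':
      // Spread the old state so optimisticMessageId survives; only the
      // phase changes. stream_user_message can then reconcile by id
      // rather than falling back to role/content matching.
      return { ...state, phase: { type: 'streaming' } };
  }
}
```

Keeping the id through `stream_connected` is the cheap half of the fix; the payoff is that `stream_user_message` becomes a pure id lookup.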
In `@js/app/packages/core/component/AI/state/createChatController.ts`:
- Around line 72-99: Add a guard inside the createEffect callback that stops
processing further stream events after an error has been dispatched: detect the
'stream_error' branch in the on(() => newStream.data(), ...) handler and set a
local flag (e.g., errored = true) or check the controller's phase state before
handling new items; then short-circuit early for any subsequent invocations so
dispatch({... type: 'stream_user_message' ...}) (and other processing) is
skipped after stream_error has been emitted. Ensure the flag is scoped to the
createEffect closure or use the existing phase state check before handling
latest.type to avoid duplicate dispatches.
- Around line 120-123: The duplicate-stream guard in createChatController.ts
should not rely on currentStream?.isDone(); change the check inside the loop
where attachStream(s) is called so it rejects any stream whose id matches the
active stream regardless of isDone. Locate the conditional using currentStream
and sid (the lines with currentStream?.isDone() &&
currentStream?.id()?.stream_id === sid) and remove the isDone() part so it
simply compares currentStream?.id()?.stream_id === sid, keeping the existing
warn and continue behavior to avoid installing duplicate observers when the same
stream is still active.
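Both createChatController.ts findings reduce to small pure checks. The accessor shape (`id().stream_id`, `isDone()`) is taken from the review text; the function names and everything else here are a hypothetical sketch:

```typescript
// Minimal view of a stream handle as referenced in the review. isDone is
// kept in the type but deliberately ignored by the guard below.
type StreamHandle = {
  id: () => { stream_id: string } | undefined;
  isDone: () => boolean;
};

// True when `sid` names the stream that is already attached, regardless
// of whether that stream has finished: re-attaching would install a
// duplicate observer either way.
function isDuplicateStream(
  current: StreamHandle | undefined,
  sid: string,
): boolean {
  return current?.id()?.stream_id === sid;
}

// Sketch of the stream_error short-circuit: once an error event has been
// seen, every later item from the same stream is skipped, so the
// controller never dispatches stream_user_message after stream_error.
function makeStreamEventFilter() {
  let errored = false;
  return (eventType: string): boolean => {
    if (errored) return false; // skip everything after an error
    if (eventType === 'stream_error') errored = true;
    return true; // handle this event (including the error itself)
  };
}
```

Scoping `errored` to the closure mirrors the review's suggestion of a flag inside the `createEffect` callback; checking the controller's phase would work too, but a local flag avoids a read-write race within one effect run.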
ℹ️ Review info
⚙️ Run configuration
Configuration used: Path: .coderabbit.yaml
Review profile: ASSERTIVE
Plan: Pro
Run ID: 82561735-02dc-490f-b22f-f15e1509cff9
📒 Files selected for processing (9)
- js/app/packages/app/component/SoupChatInput.tsx
- js/app/packages/block-chat/component/Chat.tsx
- js/app/packages/core/component/AI/component/debug/Component.tsx
- js/app/packages/core/component/AI/component/input/ChatInput.tsx
- js/app/packages/core/component/AI/component/message/ChatMessages.tsx
- js/app/packages/core/component/AI/context/ChatContext.tsx
- js/app/packages/core/component/AI/context/index.ts
- js/app/packages/core/component/AI/state/chatState.ts
- js/app/packages/core/component/AI/state/createChatController.ts
Use an explicit state machine to handle chat state