
fix(ui): skip chat history reload during active sends to prevent message card delay #66997

Merged
vincentkoc merged 5 commits into openclaw:main from scotthuang:fix/control-ui-optimistic-message-race
Apr 15, 2026

Conversation

@scotthuang
Contributor

Summary

  • Problem: After upgrading to 4.12, user message cards in Control UI don't appear immediately after sending. They only become visible when the LLM starts streaming back, causing a perceived delay.
  • Why it matters: The optimistic update is a core UX pattern; losing it makes the chat feel sluggish and broken.
  • What changed: Added a guard in handleSessionMessageGatewayEvent() to skip loadChatHistory() while a chat run is active (host.chatRunId is set).
  • What did NOT change (scope boundary): External message refresh for idle sessions is preserved. No changes to chat event handling, streaming state management, or qa-lab testing infrastructure.

Change Type (select all)

  • Bug fix
  • Feature
  • Refactor required for the fix
  • Docs
  • Security hardening
  • Chore/infra

Scope (select all touched areas)

  • Gateway / orchestration
  • Skills / tool execution
  • Auth / tokens
  • Memory / storage
  • Integrations
  • API / contracts
  • UI / DX
  • CI/CD / infra

Linked Issue/PR

  • Related: commit 20266c14cb (added session.message event handling for qa-lab testing)
  • This PR fixes a bug or regression

Root Cause (if applicable)

  • Root cause: Commit 20266c14cb added a session.message gateway event handler that unconditionally calls loadChatHistory(). When a user sends a message, the optimistic update appends the user message card immediately. However, the gateway broadcasts a session.message event for the same message, triggering loadChatHistory() which resets chatStream to null. This clears the streaming state before the first LLM delta arrives, causing the optimistic message card to disappear.
  • Missing detection / guardrail: No guard to check whether a chat run is already active before reloading history on session.message events.
  • Contributing context (if known): The session.message event was added to support qa-lab external message injection testing. The race condition only manifests when the event arrives during the narrow window between the optimistic update and the first LLM streaming delta.

Regression Test Plan (if applicable)

  • Coverage level that should have caught this:
    • Unit test
    • Seam / integration test
    • End-to-end test
    • Existing coverage already sufficient
  • Target test or file: ui/src/ui/app-gateway.test.ts (if exists) or a new test for handleSessionMessageGatewayEvent
  • Scenario the test should lock in: When host.chatRunId is set (active send), a session.message event for the same session should NOT trigger loadChatHistory().
  • Why this is the smallest reliable guardrail: It directly tests the guard condition without requiring a full browser environment or WebSocket connection.
  • Existing test that already covers this (if any): None.
  • If no new test is added, why not: The function is a private internal handler in the gateway event dispatcher. Manual verification confirmed the fix. A follow-up test PR could add coverage if desired.
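
The scenario above can be locked in without a browser or WebSocket connection. The sketch below runs against a local stub of the dispatcher; a real test would import the actual handler in app-gateway.sessions.node.test.ts instead.

```typescript
// Stub dispatcher mirroring the guarded handler (illustrative, not the real code).
type GatewayHost = { chatRunId: string | null; historyLoads: number };

function dispatchSessionMessage(host: GatewayHost): void {
  if (host.chatRunId) return; // guard under test
  host.historyLoads += 1;
}

// Scenario from the plan: an active send must not trigger a history reload.
const activeHost: GatewayHost = { chatRunId: "run-42", historyLoads: 0 };
dispatchSessionMessage(activeHost);
console.assert(activeHost.historyLoads === 0, "active send must skip reload");

// Control: an idle session still refreshes history on external messages.
const idleHost: GatewayHost = { chatRunId: null, historyLoads: 0 };
dispatchSessionMessage(idleHost);
console.assert(idleHost.historyLoads === 1, "idle session must reload");
```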

User-visible / Behavior Changes

  • User message cards now appear immediately after sending (restoring pre-4.12 behavior).
  • No change to idle session behavior: external messages still trigger history refresh when no chat run is active.

Diagram (if applicable)

Before (broken):
[user sends msg] -> [optimistic update: card shown]
                  -> [session.message event arrives]
                  -> [loadChatHistory() called unconditionally]
                  -> [chatStream = null, card disappears]
                  -> [first LLM delta arrives: card reappears]

After (fixed):
[user sends msg] -> [optimistic update: card shown]
                  -> [session.message event arrives]
                  -> [host.chatRunId is set -> skip loadChatHistory()]
                  -> [card stays visible throughout streaming]

Security Impact (required)

  • New permissions/capabilities? No
  • Secrets/tokens handling changed? No
  • New/changed network calls? No
  • Command/tool execution surface changed? No
  • Data access scope changed? No

Repro + Verification

Environment

  • OS: macOS
  • Runtime/container: Node 22+, pnpm dev gateway
  • Model/provider: Any (issue is UI-side, model-independent)
  • Integration/channel (if any): Control UI webchat
  • Relevant config (redacted): Default gateway config, gateway.mode=local

Steps

  1. Open Control UI at http://127.0.0.1:18789
  2. Type and send a message in chat
  3. Observe whether the user message card appears immediately

Expected

  • User message card appears instantly after pressing send, before LLM starts streaming.

Actual

  • Before fix (since 4.12): Message card is invisible for ~0.5-2s, only appearing when the first LLM delta arrives.
  • After fix: Message card appears immediately as expected (restoring pre-4.12 behavior).

Evidence

  • Screenshot/recording
    • Manual verification on local dev gateway running from fix branch confirmed immediate message card appearance.

Human Verification (required)

  • Verified scenarios:
    • Sent multiple messages in Control UI; all message cards appeared immediately.
    • Confirmed idle session external message refresh still works (no chatRunId set).
  • Edge cases checked:
    • Rapid consecutive sends: cards appear correctly in sequence.
  • What you did not verify:
    • qa-lab test suite with external message injection during active user send (narrow edge case, acceptable per design analysis).

Review Conversations

  • I replied to or resolved every bot review conversation I addressed in this PR.
  • I left unresolved only the conversations that still need reviewer or maintainer judgment.

Compatibility / Migration

  • Backward compatible? Yes
  • Config/env changes? No
  • Migration needed? No

Trade-off Analysis

This is the most cost-effective fix for this regression: a single if-guard, no refactoring, and no risk to existing event handling.

The only theoretical downside: if a user is actively sending a message and an external message arrives in the same session at the same moment, the external message won't appear until the current chat run completes and history reloads. This concurrent-input window is extremely narrow and unlikely in normal usage. Once the run finishes, loadChatHistory() is called by the terminal event handler, so the external message still appears shortly afterward.

A "perfect" fix would require splitting loadChatHistory() into an additive append path that merges new messages without resetting streaming state — significantly more invasive for a marginal edge case.

Risks and Mitigations

  • Risk: External message during active send is not immediately visible.
    • Mitigation: Only affects the narrow window of simultaneous user send + external input. History reloads automatically on run completion, so the delay is at most the duration of one chat run. Core optimistic update UX is fully preserved.
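
For reference, the "additive append" alternative mentioned in the Trade-off Analysis could look roughly like the sketch below. `Message` and `mergeHistory` are hypothetical names for illustration, not code from this PR.

```typescript
// Hypothetical additive merge: append newly fetched messages by id instead
// of replacing state, so chatStream and the optimistic card are never reset.
type Message = { id: string; text: string };

function mergeHistory(current: Message[], fetched: Message[]): Message[] {
  const seen = new Set(current.map((m) => m.id));
  // Keep existing cards untouched; append only messages not already rendered.
  return current.concat(fetched.filter((m) => !seen.has(m.id)));
}
```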

@greptile-apps
Contributor

greptile-apps bot commented Apr 15, 2026

Greptile Summary

Adds an early-return guard in handleSessionMessageGatewayEvent to skip loadChatHistory() when host.chatRunId is set, fixing a regression where the optimistic user-message card disappeared mid-send because loadChatHistory() was unconditionally wiping chatStream to null on every session.message event. The fix is timing-safe: chatRunId is assigned synchronously in controllers/chat.ts before the async send, so it is always set before the gateway can echo back a session.message event. Deferred external messages are correctly picked up after run completion, since terminal chat events already trigger a loadChatHistory() call.

Confidence Score: 5/5

Safe to merge — single-condition guard with correct timing semantics and no behavioral regressions.

The only finding is a P2 suggestion to add a regression test; all logic is correct and the fix is well-reasoned. A missing test does not block merge.

No files require special attention.

Path: ui/src/ui/app-gateway.ts
Line: 425-432

Comment:
**Missing regression test for the guard**

The existing `app-gateway.sessions.node.test.ts` already has a `describe("handleGatewayEvent session.message", ...)` block with mock infrastructure wired up. A third test case locking in the new guard would be trivial to add and would prevent this class of regression from going undetected again. The PR description flags this as a known gap and proposes a follow-up — the test infrastructure already exists, so this is the right place for it.


Reviews (1): Last reviewed commit: "fix(ui): skip chat history reload during..."

Comment thread ui/src/ui/app-gateway.ts

@chatgpt-codex-connector bot left a comment


💡 Codex Review

Here are some automated review suggestions for this pull request.

Reviewed commit: aecf83916e

ℹ️ About Codex in GitHub

Codex has been enabled to automatically review pull requests in this repo. Reviews are triggered when you

  • Open a pull request for review
  • Mark a draft as ready
  • Comment "@codex review".

If Codex has suggestions, it will comment; otherwise it will react with 👍.

When you sign up for Codex through ChatGPT, Codex can also answer questions or update the PR, like "@codex address that feedback".

Comment thread ui/src/ui/app-gateway.ts

@chatgpt-codex-connector bot left a comment


💡 Codex Review

Here are some automated review suggestions for this pull request.

Reviewed commit: 46083ddd0b


Comment thread ui/src/ui/app-gateway.ts
scotthuang and others added 5 commits April 15, 2026 09:55
fix(ui): skip chat history reload during active sends to prevent message card delay

When a user sends a message, the Control UI performs an optimistic update to
display the message immediately. However, if a session.message gateway event
arrives during the send, it triggers an unconditional loadChatHistory() call.
This race condition resets chatStream to null, causing the optimistic message
card to disappear until the first LLM delta arrives.

The fix adds a guard to skip history reload while a chat run is active
(host.chatRunId is set). The chat event handler already manages streaming state
and appends the final assistant message, making the concurrent reload unnecessary
during active sends. External message refresh is preserved for idle sessions.

Fixes the regression introduced in commit 20266c1 (added session.message
event handling for qa-lab testing).

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
… chat run

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
@vincentkoc vincentkoc force-pushed the fix/control-ui-optimistic-message-race branch from 46083dd to cec28cf on April 15, 2026 08:55
@vincentkoc vincentkoc merged commit 7734a40 into openclaw:main Apr 15, 2026
11 checks passed
@vincentkoc
Member

Merged via squash.

Thanks @scotthuang!


@chatgpt-codex-connector bot left a comment


💡 Codex Review

Here are some automated review suggestions for this pull request.

Reviewed commit: cec28cfa90


Comment thread ui/src/ui/app-gateway.ts
Comment on lines +427 to +429
if (deferredSessionKey && payloadSessionKey && deferredSessionKey === payloadSessionKey) {
deferredReloadHost.pendingSessionMessageReloadSessionKey = null;
}


P2 Badge Preserve deferred reload state until terminal event

Fresh evidence in this revision: this new reset path clears pendingSessionMessageReloadSessionKey on any same-session chat event, including delta. When a session.message is deferred during an active run and the run later emits one or more deltas before ending in aborted/error, the deferred key is nulled before terminal handling, so shouldReplayDeferredSessionMessageReload is false and loadChatHistory() is never replayed; concurrent external messages can remain missing until an unrelated refresh.


bminicore pushed a commit to bminicore/openclaw-fork that referenced this pull request Apr 15, 2026
openclaw#66997)

Merged via squash.

Prepared head SHA: cec28cf
Co-authored-by: scotthuang <1670837+scotthuang@users.noreply.github.com>
Co-authored-by: vincentkoc <25068+vincentkoc@users.noreply.github.com>
Reviewed-by: @vincentkoc
xudaiyanzi pushed a commit to xudaiyanzi/openclaw that referenced this pull request Apr 17, 2026
openclaw#66997)

Merged via squash.

Prepared head SHA: cec28cf
Co-authored-by: scotthuang <1670837+scotthuang@users.noreply.github.com>
Co-authored-by: vincentkoc <25068+vincentkoc@users.noreply.github.com>
Reviewed-by: @vincentkoc
kvnkho pushed a commit to kvnkho/openclaw that referenced this pull request Apr 17, 2026
openclaw#66997)

Merged via squash.

Prepared head SHA: cec28cf
Co-authored-by: scotthuang <1670837+scotthuang@users.noreply.github.com>
Co-authored-by: vincentkoc <25068+vincentkoc@users.noreply.github.com>
Reviewed-by: @vincentkoc
