
feat(examples-chat): turn on A2UI envelope streaming via callback handler#262

Merged
blove merged 2 commits into main from claude/genui-streaming-partial-handler on May 12, 2026

Conversation

blove (Contributor) commented May 12, 2026

Summary

Final piece of the progressive A2UI streaming work. The A2uiPartialHandler async callback handler is attached to the generate node when gen_ui_mode='a2ui'. As the parent LLM streams tool_call_chunks for render_a2ui_surface (added by #259), the handler concatenates per-tool_call_id cumulative argument strings and dispatches a2ui-partial custom events via adispatch_custom_event. The frontend bridge (separate PR) consumes these and feeds the A2UI surface store envelope-by-envelope.

Result: the per-component fallback transition wired by #252 is now actually visible. Surface mounts on the first surfaceUpdate; components show their fallback while their dataModelUpdate envelopes stream; each component flips from fallback to real as its binding resolves.

Spec: docs/superpowers/specs/2026-05-12-genui-streaming-sub-llm-design.md.
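The accumulation described above can be sketched in isolation. This is a minimal illustration, not the handler itself: a plain callable stands in for LangChain's adispatch_custom_event, the class name is hypothetical, and the chunk dicts only mimic the `{"id", "args"}` shape of tool_call_chunks.

```python
from collections import defaultdict
from typing import Callable

class PartialArgsAccumulator:
    """Sketch: concatenate streamed argument fragments per tool_call_id and
    dispatch an a2ui-partial payload each time a cumulative string grows."""

    def __init__(self, dispatch: Callable[[str, dict], None]):
        self._dispatch = dispatch            # stand-in for adispatch_custom_event
        self._args: dict[str, str] = defaultdict(str)
        self._current_id: str | None = None  # later fragments may omit the id

    def on_tool_call_chunk(self, chunk: dict) -> None:
        # Only the first fragment of a call carries the id; stick with the
        # most recently seen id for the fragments that follow.
        if chunk.get("id"):
            self._current_id = chunk["id"]
        fragment = chunk.get("args") or ""
        if not fragment or self._current_id is None:
            return
        self._args[self._current_id] += fragment
        self._dispatch("a2ui-partial", {
            "tool_call_id": self._current_id,
            "args_so_far": self._args[self._current_id],
        })
```

The frontend bridge then re-parses args_so_far on each event and feeds any newly complete envelopes into the surface store.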

Test plan

  • pytest tests/test_a2ui_partial_handler.py — 5 tests
  • pytest tests/test_streaming_smoke.py — 1 integration test against canned stream
  • Full pytest tests/ — 41 tests, all green
  • Live smoke at /embed after the frontend bridge PR merges: sample DOM via requestAnimationFrame and confirm frames where render-default-fallback count > 0
  • CI green

blove added 2 commits May 12, 2026 15:00
Async callback handler tracking per-tool_call_id cumulative arguments
from on_chat_model_stream events. Each growth in the cumulative string
dispatches an a2ui-partial custom event carrying {tool_call_id,
args_so_far}; the frontend partial-args-bridge consumes these and feeds
envelopes into the A2UI surface store as they parse.

Attached only when gen_ui_mode='a2ui'. Sidebands the parent LLM's
tool_call_chunks for render_a2ui_surface as a2ui-partial custom events.
Together with the frontend partial-args bridge
(claude/genui-streaming-frontend-bridge) and the envelope-tool refactor
(claude/genui-streaming-envelope-tool), this realises the per-component
fallback transition wired by PR #252 — surface mounts on first
surfaceUpdate, components flip from fallback to real as dataModelUpdates
stream in.

@blove blove merged commit 95460ff into main May 12, 2026
14 checks passed
blove added a commit that referenced this pull request May 12, 2026
…at_model_stream) (#264)

PR #262 originally hooked into on_chat_model_stream, but that method
does not exist on AsyncCallbackHandler — the canonical streaming-token
callback (fired by ChatOpenAI when streaming=True) is on_llm_new_token,
which delivers each token plus an optional ChatGenerationChunk whose
message field carries the AIMessageChunk with tool_call_chunks.

Result of the bug: the handler was wired but never invoked at runtime,
so adispatch_custom_event never fired and the frontend bridge stayed
dormant. Live smoke at /embed confirmed: zero a2ui-partial events on
the wire across a 622KB / 412-chunk stream.

This hotfix:
- Renames the override to on_llm_new_token with the canonical signature
  (token: str, *, chunk, run_id, parent_run_id, tags, **kwargs).
- Reads tool_call_chunks from chunk.message (ChatGenerationChunk wraps
  the AIMessageChunk in its message field).
- Gracefully handles chunk=None (legacy LLM path with no chunk object).
- Updates tests to wrap AIMessageChunk in ChatGenerationChunk and call
  the new method name. Adds a 6th test asserting chunk=None is a no-op.
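The corrected callback shape can be sketched without pulling in LangChain. The dataclasses below are stand-ins that mimic only the fields the hotfix relies on (ChatGenerationChunk wrapping an AIMessageChunk in its message field); the handler class name and the seen list are illustrative.

```python
from dataclasses import dataclass, field
from typing import Any

@dataclass
class AIMessageChunk:       # stand-in for langchain_core's AIMessageChunk
    tool_call_chunks: list[dict] = field(default_factory=list)

@dataclass
class ChatGenerationChunk:  # stand-in: wraps the message chunk in .message
    message: AIMessageChunk

class A2uiPartialHandlerSketch:
    def __init__(self) -> None:
        self.seen: list[dict] = []

    async def on_llm_new_token(
        self, token: str, *, chunk=None, run_id=None,
        parent_run_id=None, tags=None, **kwargs: Any,
    ) -> None:
        # chunk=None on the legacy (string-LLM) path: nothing to read, no-op.
        if chunk is None:
            return
        # ChatGenerationChunk carries the AIMessageChunk in .message.
        for tc in getattr(chunk.message, "tool_call_chunks", []) or []:
            self.seen.append(tc)
```

With this shape, the chunk=None guard makes the legacy path a no-op, matching the 6th test described above.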
blove added a commit that referenced this pull request May 12, 2026
…rk (#271)

Brings the canonical smoke checklist current with 29 PRs that landed
between Phase 7 (#239) and today without checklist updates. Specifically:

Updated sections:
- chat-debug devtools — replaced bottom-drawer model with floating
  launcher + status pill + switch (PRs #249, #251)
- Control palette — palette v2 (status pill, shadcn-styled panel, PR #244)
- Generative UI / A2UI surfaces — single-bubble invariant (PR #255),
  parent-emits-envelopes architecture (PR #259), wrapped-content +
  tool_calls coexistence (PR #255), envelope reorder
- Server-side wire format — tool_calls preserved on the final AI
- Replaced 'Multi-thread' section with 'Sidenav (thread management)'
  reflecting the permanent semantic <nav> + Active/Archived sections
  (PR #253) and removing the old palette-toggled drawer model

Added sections:
- Cmd+K history search — palette open/search/select/close, archived
  result subtitle, keyboard navigation (PR #253)
- Per-row thread actions — kebab menu order per state (active, pinned,
  archived), rename + pin/unpin + archive/unarchive + delete flows
  (PRs #258, #260, #267)
- Thread titles — first-user-message derivation, idempotent writes,
  manual rename precedence (PR #242)
- Progressive A2UI streaming — per-component fallback transition
  observable during streaming window (PRs #252, #261, #262, #268, #269)
- Inline checkpoint markers — render between messages during multi-step
  runs (PR #243)
- Responsive sidenav — viewport breakpoints, auto-collapse behavior (PR #240)

Total: ~58 new check items across 6 new sections, plus rewrites to 5
existing sections. Original 333-line checklist → 391 lines / 237 check
items.
