
[DX-972] Token streaming #178

Draft
umair-ably wants to merge 5 commits into main from tokenStreaming

Conversation


@umair-ably umair-ably commented Mar 19, 2026

Summary by CodeRabbit

  • New Features

    • Added token-streaming support for ably channels publish with new flags: --token-streaming, --token-size (default 4), and --stream-duration (default 10) to publish chunked messages.
    • Added --token-streaming flag to ably channels subscribe for in-place terminal display of message appends.
  • Documentation

    • Updated README with token-streaming examples and error hint guidelines.
    • Improved error hint messages for Ably error codes.

Commit summaries:

  • Reduce the SDK log handler to only surface messages in --verbose mode — most SDK errors already propagate via promise rejections or state changes handled by fail()/setupChannelStateLogging(). Skip closing/closed connection states (CLI-initiated teardown).
  • Strip hint newlines in JSON output; migrate hint strings to single quotes for clean JSON; add an error hint for 93002 (mutableMessages).
  • Add a --token-streaming flag to publish that sends an initial message then streams the remaining text as appendMessage() calls over --stream-duration seconds. Includes a text chunker utility for word-boundary-aware splitting and a --token-size flag to control chunk granularity. Refactors publishWithRealtime to use the setup*StateLogging helpers.
  • Add a --token-streaming flag to subscribe that accumulates message.append data for the same serial, displaying the growing response in-place on TTY terminals. Refactors message display into displayNormalMessage() and handleTokenStreamMessage() methods, and renames the message 'event' field to 'name' in output helpers to align with SDK terminology.
  • Document error hint formatting conventions (single quotes, newline control) in AGENTS.md and skills. Update the README with the new flags and examples. Add text-chunker to Project-Structure.md.
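The word-boundary-aware splitting described in the commits above can be sketched as follows. This is an illustrative reimplementation, not the actual src/utils/text-chunker.ts; the chunkText(text, tokenSize) signature is assumed from the flag names:

```typescript
// Illustrative sketch of a word-boundary-aware chunker. Splits text into
// chunks of roughly tokenSize characters, preferring to cut after a space
// so words are not broken mid-way. Re-joining the chunks is lossless.
function chunkText(text: string, tokenSize: number): string[] {
  if (tokenSize < 1) throw new RangeError("tokenSize must be >= 1");
  const chunks: string[] = [];
  let i = 0;
  while (i < text.length) {
    let end = Math.min(i + tokenSize, text.length);
    // If we are cutting mid-text, back up to the last space in the window
    // so the word stays whole; keep the trailing space with this chunk.
    if (end < text.length) {
      const lastSpace = text.lastIndexOf(" ", end);
      if (lastSpace > i) end = lastSpace + 1;
    }
    chunks.push(text.slice(i, end));
    i = end;
  }
  return chunks;
}
```

Because each chunk keeps its trailing whitespace, `chunks.join("")` reproduces the original input exactly, which is what a subscriber accumulating appends relies on.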

vercel bot commented Mar 19, 2026

The latest updates on your projects.

  • Project: cli-web-cli — Deployment: Ready — Actions: Preview, Comment — Updated (UTC): Mar 19, 2026 9:20pm



coderabbitai bot commented Mar 19, 2026

Important

Review skipped

Auto incremental reviews are disabled on this repository.

Please check the settings in the CodeRabbit UI or the .coderabbit.yaml file in this repository. To trigger a single review, invoke the @coderabbitai review command.

⚙️ Run configuration

Configuration used: Repository UI

Review profile: CHILL

Plan: Pro

Run ID: 8eac65c5-d027-46af-9135-4bc3d2cbea81


Walkthrough

This PR introduces token-streaming support for publishing and subscribing to Ably messages, enabling chunked transmission of large message data. It adds a text-chunking utility, updates error hint documentation and handling, refactors message display fields to use optional name instead of required event, and includes corresponding test coverage and mock updates.

Changes

  • Documentation & Skills (.claude/skills/ably-codebase-review/SKILL.md, .claude/skills/ably-new-command/references/patterns.md, .claude/skills/ably-review/SKILL.md, AGENTS.md, docs/Project-Structure.md): Added error-hint verification steps to skill checklists; documented the getFriendlyAblyErrorHint mechanism in AGENTS.md with formatting requirements for \n line breaks and single-quoted CLI commands; updated Project-Structure to reference the new utilities.
  • README Updates (README.md): Updated the Node.js version example; added token-streaming flags (--token-streaming, --token-size, --stream-duration) to the channels publish documentation and --token-streaming to the channels subscribe documentation, with new usage examples.
  • Error Hints (src/utils/errors.ts): Updated quote style in existing hint entries; added a new hint for error code 93002 documenting the mutableMessages requirement and related ably apps rules commands.
  • Message Display (src/utils/output.ts, src/commands/channels/history.ts): Changed the MessageDisplayFields interface from required event: string to optional name?: string; updated history.ts to include name only when message.name exists.
  • Token Streaming Implementation (src/commands/channels/publish.ts, src/commands/channels/subscribe.ts): Added the --token-streaming flag and a new publishTokenStream() method with chunked publish/append logic for the publish command; added a streaming display mode that accumulates message.append data with in-place terminal updates for the subscribe command.
  • Infrastructure (src/base-command.ts, src/utils/text-chunker.ts): Simplified SDK log handling in base-command; modified JSON error output to replace newlines in hints with spaces; added a new chunkText() utility function for intelligent text splitting at word boundaries.
  • Test & Mock Support (test/helpers/mock-ably-realtime.ts, test/unit/commands/channels/history.test.ts, test/unit/commands/channels/publish.test.ts, test/unit/commands/channels/subscribe.test.ts, test/unit/utils/text-chunker.test.ts): Added an appendMessage mock to the channel interface; updated the history test for the Name vs Event label; added a comprehensive token-streaming test suite for publish and subscribe; added text-chunker behavior validation tests.

Sequence Diagrams

sequenceDiagram
    participant User as User/CLI
    participant Publish as channels:publish
    participant Chunker as chunkText()
    participant Channel as Ably Channel
    participant Output as Output Handler

    User->>Publish: ably channels publish --token-streaming
    Publish->>Publish: prepareMessageFromInput()
    Publish->>Chunker: chunkText(messageData, tokenSize)
    Chunker-->>Publish: string[]

    Publish->>Channel: channel.publish(first chunk)
    Channel-->>Publish: { versionSerial }
    Output->>Output: emit create event
    
    loop for each remaining chunk
        Publish->>Channel: channel.appendMessage(serial, chunk, delay)
        Channel-->>Publish: { versionSerial }
        Output->>Output: emit append event
    end
    
    Publish->>Output: emit stream summary
    Output-->>User: final result
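The publish-side flow in the diagram above can be sketched roughly as below. The StreamChannel interface, the { versionSerial } return shape, and publishTokenStream itself are simplified stand-ins based on the diagram, not the real Ably SDK or CLI code:

```typescript
// Minimal stand-in for the channel surface used by the diagram: an initial
// publish creates the message, then appendMessage() grows it by serial.
interface StreamChannel {
  publish(name: string, data: string): Promise<{ versionSerial: string }>;
  appendMessage(serial: string, data: string): Promise<{ versionSerial: string }>;
}

async function publishTokenStream(
  channel: StreamChannel,
  chunks: string[],
  delayMs: number, // derived from --stream-duration / number of chunks
): Promise<{ serial: string; appends: number }> {
  // First chunk creates the message; its serial anchors all later appends.
  const { versionSerial } = await channel.publish("token-stream", chunks[0]);
  let appends = 0;
  for (const chunk of chunks.slice(1)) {
    await new Promise<void>((r) => setTimeout(r, delayMs));
    await channel.appendMessage(versionSerial, chunk);
    appends++;
  }
  return { serial: versionSerial, appends };
}
```

The per-chunk delay is what spreads the appends across the configured --stream-duration window.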
sequenceDiagram
    participant User as User/CLI
    participant Subscribe as channels:subscribe
    participant Channel as Ably Channel
    participant Handler as Message Handler
    participant Display as Display/Output

    User->>Subscribe: ably channels subscribe --token-streaming
    
    Channel->>Handler: message.create (serial=1)
    Handler->>Display: output "Creating token stream"
    
    Channel->>Handler: message.append (serial=1)
    Handler->>Handler: accumulate data
    Display->>Display: update terminal in-place
    
    Channel->>Handler: message.append (serial=1)
    Handler->>Handler: accumulate + append count
    Display->>Display: refresh with appended text
    
    Channel->>Handler: message.create (serial=2)
    Handler->>Handler: finalize previous stream
    Handler->>Display: output summary "2 appends"
    Handler->>Handler: reset accumulation
    Display->>User: stream data + new message
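The subscribe-side accumulation shown above can be sketched with a small accumulator: appends for the current serial grow one buffer, and a new create finalizes the previous stream with a summary. The field names (action, serial, data) mirror the diagram and are illustrative, not a guaranteed message shape:

```typescript
interface StreamMsg { action: "create" | "append"; serial: string; data: string }

class TokenStreamAccumulator {
  private serial: string | null = null;
  private buffer = "";
  private appendCount = 0;
  // Finalized streams, recorded when a new create supersedes the old serial.
  readonly finalized: Array<{ serial: string; data: string; appends: number }> = [];

  handle(msg: StreamMsg): string {
    if (msg.action === "create") {
      if (this.serial !== null) {
        // A new stream starts: emit a summary for the previous one.
        this.finalized.push({ serial: this.serial, data: this.buffer, appends: this.appendCount });
      }
      this.serial = msg.serial;
      this.buffer = msg.data;
      this.appendCount = 0;
    } else if (msg.serial === this.serial) {
      this.buffer += msg.data;
      this.appendCount++;
    }
    return this.buffer; // what an in-place TTY redraw would print
  }
}
```

On a real TTY the returned buffer would be redrawn in place (e.g. via carriage return and clear-line) rather than printed as new lines.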

Estimated code review effort

🎯 4 (Complex) | ⏱️ ~50 minutes

Suggested reviewers

  • sacOO7

Poem

🐰 Tokens streamed in chunks so neat,
Large messages find retreat,
With \n breaks and smart word seams,
Publishing fulfills our streams,
Hints and names align their goals,
As wisdom fills our feature's bowls! 🥕✨

🚥 Pre-merge checks | ✅ 3
✅ Passed checks (3 passed)
  • Description Check: ✅ Passed. Check skipped — CodeRabbit’s high-level summary is enabled.
  • Title Check: ✅ Passed. The title 'DX-972 Token streaming' is directly related to the main feature added in this PR. The changeset implements a token-streaming mode for publish and subscribe commands with supporting infrastructure (text chunking, hint handling), matching the title's focus.
  • Docstring Coverage: ✅ Passed. No functions found in the changed files to evaluate docstring coverage; skipping the docstring coverage check.



@umair-ably umair-ably requested a review from Copilot March 19, 2026 20:41
@umair-ably (Contributor, Author) commented:

@coderabbitai review


coderabbitai bot commented Mar 19, 2026

✅ Actions performed

Review triggered.

Note: CodeRabbit is an incremental review system and does not re-review already reviewed commits. This command is applicable only when automatic reviews are paused.


Copilot AI left a comment


Pull request overview

Adds “token streaming” support to the channels publish/subscribe commands, enabling message-per-response style streaming via message.create + message.append, with supporting utilities, tests, and documentation updates.

Changes:

  • Introduced --token-streaming mode for channels:publish (stream outgoing text as initial publish + appends) and channels:subscribe (accumulate and display appends per serial).
  • Added chunkText() utility (and unit tests) to split text into token-like chunks.
  • Updated message output naming (Event → Name) and extended friendly error hint mappings (incl. mutable messages hint), with associated doc/test updates.

Reviewed changes

Copilot reviewed 18 out of 18 changed files in this pull request and generated 6 comments.

Summary per file:

  • src/commands/channels/publish.ts: Implements the token streaming publish path (--token-streaming, --token-size, --stream-duration) using an initial publish + appendMessage() calls.
  • src/commands/channels/subscribe.ts: Adds a token streaming subscribe mode that accumulates message.append data by serial and supports in-place terminal updates.
  • src/utils/text-chunker.ts: New helper to chunk text into ~N-character “tokens” with word-boundary preference.
  • src/utils/output.ts: Switches the message display field from event to optional name and updates formatted output accordingly.
  • src/utils/errors.ts: Normalizes hint quoting and adds a new friendly hint for error code 93002 (mutable messages requirement).
  • src/base-command.ts: Adjusts SDK log handler behavior (now verbose-only) and tweaks connection state logging and JSON hint newline stripping.
  • src/commands/channels/history.ts: Updates history output to use name instead of event.
  • test/unit/utils/text-chunker.test.ts: Adds unit coverage for chunkText().
  • test/unit/commands/channels/publish.test.ts: Adds unit coverage for token streaming publish behavior, JSON output, and failure cases.
  • test/unit/commands/channels/subscribe.test.ts: Adds unit coverage for token streaming subscribe accumulation/display and JSON shape changes; updates label expectations.
  • test/unit/commands/channels/history.test.ts: Updates the expected output label (Name:).
  • test/helpers/mock-ably-realtime.ts: Extends the mock channel with appendMessage() for streaming tests.
  • README.md: Documents the new flags, usage, and examples for token streaming publish/subscribe.
  • docs/Project-Structure.md: Documents the new text-chunker.ts utility and the updated errors util description.
  • AGENTS.md: Documents error hint formatting rules and JSON newline stripping behavior.
  • .claude/skills/ably-review/SKILL.md: Adds review checklist items for error hint formatting conventions.
  • .claude/skills/ably-new-command/references/patterns.md: Adds documented conventions for error hint formatting.
  • .claude/skills/ably-codebase-review/SKILL.md: Adds review checklist items for error hints and JSON newline stripping verification.



coderabbitai bot left a comment


Actionable comments posted: 3

🧹 Nitpick comments (3)
test/unit/commands/channels/publish.test.ts (1)

547-568: Avoid real multi-second waits in the unit suite.

This test hard-codes a ~2s wall-clock delay into test:unit, which will slow the suite and make timing assertions noisier under CI load. Consider stubbing the stream-delay path or injecting a scheduler so the behavior is asserted without real sleeps.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@test/unit/commands/channels/publish.test.ts` around lines 547 - 568, The test
currently performs a real multi-second wait when invoking runCommand for the
"channels:publish" flow (the token-streaming path mocked by
getMockAblyRealtime), which slows CI; replace the real sleep by stubbing or
injecting a fake scheduler into the token streaming path used by
channels:publish so the stream-duration behavior is simulated instantly.
Concretely, in the test setup mock the module/function that implements the
streaming delay (the token-stream or delay helper used by the CLI command
invoked via runCommand) to advance time instantly or use a fake timer (e.g.,
sinon.useFakeTimers or Jest timers) so you can assert that runCommand triggers
the streaming logic without waiting real seconds, then restore timers/mocks
after the test. Ensure you target the stream-delay implementation referenced by
the channels:publish command and keep getMockAblyRealtime as-is.
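One way to realize the fake-scheduler suggestion is to make the delay function injectable, so tests substitute an instant no-op sleep. streamWithDelay and Sleep are hypothetical names for illustration, not code from this PR:

```typescript
// Illustrative dependency-injection pattern: the streaming loop takes its
// sleep function as a parameter, defaulting to a real setTimeout-based delay.
type Sleep = (ms: number) => Promise<void>;
const realSleep: Sleep = (ms) => new Promise<void>((r) => setTimeout(r, ms));

async function streamWithDelay(
  chunks: string[],
  delayMs: number,
  emit: (chunk: string) => void,
  sleep: Sleep = realSleep, // tests pass a fake sleep here
): Promise<void> {
  for (const chunk of chunks) {
    emit(chunk);
    await sleep(delayMs);
  }
}
```

A unit test can then pass `async () => {}` (or record the requested delays) and assert on the emitted chunks without any wall-clock wait, which is the same effect sinon.useFakeTimers achieves by patching setTimeout globally.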
test/unit/commands/channels/subscribe.test.ts (1)

347-359: Minor: Typo in test message ID.

The message ID "msg-nosrial" appears to be a typo for "msg-noserial". This doesn't affect test functionality but could cause confusion when debugging.

✏️ Fix typo
       mockSubscribeCallback!({
         name: "test-event",
         data: "no-serial-message",
         timestamp: Date.now(),
-        id: "msg-nosrial",
+        id: "msg-noserial",
       });
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@test/unit/commands/channels/subscribe.test.ts` around lines 347 - 359, Fix
the typo in the test message ID used when invoking mockSubscribeCallback: change
the id value from "msg-nosrial" to "msg-noserial" so it matches the intended
naming and avoids confusion when debugging; update the literal in the
mockSubscribeCallback call in the test that awaits commandPromise (the block
referencing mockSubscribeCallback, commandPromise, and assertions checking
stdout).
src/commands/channels/publish.ts (1)

346-346: Consider validating that data is string-coercible for streaming.

If message.data is a complex object (e.g., from JSON input like {"data":{"nested":"value"}}), String() will produce "[object Object]", which is likely not the intended behavior for token streaming.

Since token streaming is designed for text (LLM responses, etc.), consider adding validation or documentation that clarifies --token-streaming expects text data:

🛡️ Optional: Add validation for string data
       const text = String(message.data ?? args.message);
+      if (typeof message.data === 'object' && message.data !== null) {
+        this.fail(
+          'Token streaming requires string data. Object data cannot be streamed as tokens.',
+          flags as BaseFlags,
+          'channelPublish',
+        );
+      }
       const tokenSize = flags["token-size"] as number;
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/commands/channels/publish.ts` at line 346, The current coercion const
text = String(message.data ?? args.message) can turn objects into "[object
Object]" which breaks token streaming; update the publish flow (around variables
message.data, args.message and the token-streaming handling) to validate that
message.data is text-compatible when --token-streaming is enabled: if
message.data is a string use it, if it is JSON-like decide whether to
JSON.stringify it (and document this behavior) or reject with a clear error, and
also allow Buffer/Uint8Array handling if appropriate; ensure you emit a helpful
validation error referencing token-streaming when non-text payloads are
provided.
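A minimal sketch of the suggested guard, assuming token streaming should accept strings and primitive values but reject object payloads. toStreamableText is a hypothetical helper name, not code from this PR:

```typescript
// Illustrative guard: avoid streaming "[object Object]" by rejecting
// non-text payloads up front. A real implementation might additionally
// decode Buffer/Uint8Array as UTF-8 rather than rejecting them.
function toStreamableText(data: unknown): string {
  if (typeof data === "string") return data;
  if (data !== null && typeof data === "object") {
    throw new Error("Token streaming requires string data; got an object payload.");
  }
  return String(data ?? "");
}
```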
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In @.claude/skills/ably-review/SKILL.md:
- Line 166: The current check simply greps for a double-quote character and thus
false-positives; change the logic to parse the TypeScript/JSON AST and inspect
string literal values for hint text instead of matching raw `"` characters.
Locate the quote-check routine (e.g., the function handling "hint strings" / the
"quote check" validator) and replace the grep with AST traversal (TS parser or
JSON.parse) to read each hint's string content and then assert CLI references
use single quotes; only flag cases where the parsed string actually contains a
double quote inside the hint text.

In `@src/base-command.ts`:
- Around line 922-939: The SDK logHandler currently only emits logs when
flags.verbose && level <= 2 which hides errors; change options.logHandler (the
function assigned to options.logHandler) to always surface error-level messages
(level 0–1) by calling this.logCliEvent("ablySdk", ...) for those levels
regardless of flags.verbose, and keep the existing verbose guard for
lower-priority logs (e.g., info/debug level 2) so that only non-error messages
remain gated by flags.verbose; ensure you still pass the same arguments (flags,
"ablySdk", `LogLevel-${level}`, message, logData).
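The suggested gating could look roughly like this; makeLogHandler and the plain logCliEvent callback are illustrative stand-ins for the base-command internals, not the PR's actual code:

```typescript
// Illustrative sketch: error-level SDK logs (0-1) always surface, while
// info/debug logs (level 2) stay behind the --verbose flag.
function makeLogHandler(
  verbose: boolean,
  logCliEvent: (source: string, level: string, message: string) => void,
) {
  return (level: number, message: string): void => {
    const isError = level <= 1;
    if (isError || (verbose && level <= 2)) {
      logCliEvent("ablySdk", `LogLevel-${level}`, message);
    }
  };
}
```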

In `@src/commands/channels/subscribe.ts`:
- Around line 87-89: token streaming state (tokenStreamSerial,
tokenStreamAccumulatedData, tokenStreamAppendCount) is currently global and
never initialized on an append, causing interleaved channels to stomp each other
and finalizeTokenStream() to be a no-op; change the implementation to key the
state by channel+serial (e.g., a Map<string, {serial, accumulatedData,
appendCount}>) or alternatively reject --token-streaming when more than one
channel is subscribed, ensure you create/initialize the slot both when handling
message.create and when handling message.append (set the serial when the first
append for that stream is seen), and update finalizeTokenStream() and any
append/create handlers to read/write the keyed state; apply the same keyed-state
fix to the other token-streaming block referenced (lines ~292-345) so each
channel+serial is tracked independently.
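The keyed-state fix could be sketched as a small Map-backed tracker, initializing a slot on either create or first append so interleaved channels never stomp each other. Class and method names here are illustrative, not the PR's actual code:

```typescript
interface StreamState { accumulated: string; appendCount: number }

// Illustrative tracker keyed by channel+serial, per the review suggestion.
class KeyedTokenStreams {
  private streams = new Map<string, StreamState>();

  private key(channel: string, serial: string): string {
    return `${channel}::${serial}`;
  }

  create(channel: string, serial: string, data: string): void {
    this.streams.set(this.key(channel, serial), { accumulated: data, appendCount: 0 });
  }

  append(channel: string, serial: string, data: string): StreamState {
    const k = this.key(channel, serial);
    // Initialize on first append too, so an append seen before its create
    // still starts tracking instead of being silently dropped.
    const state = this.streams.get(k) ?? { accumulated: "", appendCount: 0 };
    state.accumulated += data;
    state.appendCount++;
    this.streams.set(k, state);
    return state;
  }

  finalize(channel: string, serial: string): StreamState | undefined {
    const k = this.key(channel, serial);
    const state = this.streams.get(k);
    this.streams.delete(k);
    return state;
  }
}
```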

ℹ️ Review info
⚙️ Run configuration

Configuration used: Repository UI

Review profile: CHILL

Plan: Pro

Run ID: 8378d54f-7124-478e-83e9-b9f8d19c4571

📥 Commits

Reviewing files that changed from the base of the PR and between 5692f63 and 971c3b7.

📒 Files selected for processing (18)
  • .claude/skills/ably-codebase-review/SKILL.md
  • .claude/skills/ably-new-command/references/patterns.md
  • .claude/skills/ably-review/SKILL.md
  • AGENTS.md
  • README.md
  • docs/Project-Structure.md
  • src/base-command.ts
  • src/commands/channels/history.ts
  • src/commands/channels/publish.ts
  • src/commands/channels/subscribe.ts
  • src/utils/errors.ts
  • src/utils/output.ts
  • src/utils/text-chunker.ts
  • test/helpers/mock-ably-realtime.ts
  • test/unit/commands/channels/history.test.ts
  • test/unit/commands/channels/publish.test.ts
  • test/unit/commands/channels/subscribe.test.ts
  • test/unit/utils/text-chunker.test.ts
