
feat(client, server): support buffered mode in batch plugin #584

Merged
dinwwwh merged 9 commits into main from feat/client-server/batch-plugin-buffered-mode on Jun 3, 2025

Conversation

Member

@dinwwwh dinwwwh commented Jun 3, 2025

Summary by CodeRabbit

  • New Features

    • Added support for configurable batch response modes: "streaming" (default) and "buffered" for environments without streaming support.
    • Documentation updated with detailed explanations and configuration examples for batch modes.
  • Bug Fixes

    • Enhanced validation and error handling for batch responses across both modes.
  • Tests

    • Extended test coverage to include buffered mode scenarios and updated existing tests for streaming mode.
  • Documentation

    • Renamed "Batch Request/Response Plugin" to "Batch Requests Plugin" and updated related links and labels for consistency.
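
The difference between the two modes can be sketched with plain async iterators. This is an illustrative, self-contained sketch with hypothetical helper names, not the plugin's actual API:

```typescript
// Each batched call resolves to one response item.
interface ResponseItem {
  index: number
  body: unknown
}

// Streaming mode: forward each item as soon as it is available.
async function* streamResponses(
  items: AsyncIterable<ResponseItem>,
): AsyncGenerator<ResponseItem> {
  for await (const item of items) {
    yield item
  }
}

// Buffered mode: collect every item, then return them together.
async function bufferResponses(
  items: AsyncIterable<ResponseItem>,
): Promise<ResponseItem[]> {
  const collected: ResponseItem[] = []
  for await (const item of items) {
    collected.push(item)
  }
  return collected
}

async function* demoItems(): AsyncGenerator<ResponseItem> {
  yield { index: 0, body: 'first' }
  yield { index: 1, body: 'second' }
}
```

Streaming lets the client start handling early responses immediately; buffered mode trades that latency for a single payload, which suits platforms that cannot stream response bodies.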


vercel Bot commented Jun 3, 2025

The latest updates on your projects. Learn more about Vercel for Git ↗︎

Name: orpc · Status: ✅ Ready · Updated (UTC): Jun 3, 2025 9:12am


coderabbitai Bot commented Jun 3, 2025

Warning

Rate limit exceeded

@unnoq has exceeded the limit for the number of commits or files that can be reviewed per hour. Please wait 2 minutes and 46 seconds before requesting another review.

⌛ How to resolve this issue?

After the wait time has elapsed, a review can be triggered using the @coderabbitai review command as a PR comment. Alternatively, push new commits to this PR.

We recommend that you space out your commits to avoid hitting the rate limit.

🚦 How do rate limits work?

CodeRabbit enforces hourly rate limits for each developer per organization.

Our paid plans have higher rate limits than the trial, open-source and free plans. In all cases, we re-allow further reviews after a brief timeout.

Please see our FAQ for further information.

📥 Commits

Reviewing files that changed from the base of the PR and between 43f13b2 and 3a0c988.

📒 Files selected for processing (3)
  • packages/client/src/plugins/batch.ts (6 hunks)
  • packages/server/src/plugins/batch.test.ts (18 hunks)
  • packages/standard-server/src/batch/response.ts (3 hunks)

Walkthrough

This change introduces a new "buffered" mode for batch request/response handling, alongside the existing "streaming" mode. It updates client and server plugins, batch response utilities, and related tests to support and validate both modes. Documentation and navigation are revised to reflect the new naming and functionality.

Changes

Files/Paths Change Summary
apps/content/.vitepress/config.ts, apps/content/docs/comparison.md, apps/content/docs/plugins/batch-requests.md Updated sidebar and documentation to rename "Batch Request/Response" to "Batch Requests" and document new "buffered" batch mode.
packages/client/src/plugins/batch.ts, packages/client/src/plugins/batch.test.ts Added batch mode option (streaming/buffered) to plugin and tests; updated batch request headers and test coverage for both modes.
packages/server/src/plugins/batch.ts, packages/server/src/plugins/batch.test.ts Added support for batch mode header, updated logic to handle both modes, and expanded tests for buffered and streaming responses.
packages/standard-server/src/batch/response.ts, packages/standard-server/src/batch/response.test.ts Refactored batch response utilities to support both streaming and buffered modes; updated tests for parameterized mode coverage.

Sequence Diagram(s)

sequenceDiagram
    participant Client
    participant BatchLinkPlugin
    participant Server
    participant BatchHandlerPlugin
    participant BatchResponseUtil

    Client->>BatchLinkPlugin: Send batched requests (mode: streaming/buffered)
    BatchLinkPlugin->>Server: POST /batch with x-orpc-batch header
    Server->>BatchHandlerPlugin: Intercept batch request
    BatchHandlerPlugin->>BatchResponseUtil: toBatchResponse({ mode })
    BatchResponseUtil-->>BatchHandlerPlugin: Streaming or buffered response
    BatchHandlerPlugin->>Server: Return batch response
    Server->>BatchLinkPlugin: Batch response (streaming or buffered)
    BatchLinkPlugin->>Client: Deliver individual responses


Poem

🥕
A batch of requests, now streaming or stored,
Buffered or flowing, responses are scored.
Docs and tests polished, with headers anew,
The rabbit hops forward—batching for you!
Choose your own mode, let your clients decide,
In fields of concurrency, we joyfully stride.
🐇




codecov Bot commented Jun 3, 2025

Codecov Report

All modified and coverable lines are covered by tests ✅



pkg-pr-new Bot commented Jun 3, 2025

More templates

  • @orpc/arktype: npm i https://pkg.pr.new/@orpc/arktype@584
  • @orpc/client: npm i https://pkg.pr.new/@orpc/client@584
  • @orpc/contract: npm i https://pkg.pr.new/@orpc/contract@584
  • @orpc/hey-api: npm i https://pkg.pr.new/@orpc/hey-api@584
  • @orpc/nest: npm i https://pkg.pr.new/@orpc/nest@584
  • @orpc/openapi: npm i https://pkg.pr.new/@orpc/openapi@584
  • @orpc/openapi-client: npm i https://pkg.pr.new/@orpc/openapi-client@584
  • @orpc/react: npm i https://pkg.pr.new/@orpc/react@584
  • @orpc/react-query: npm i https://pkg.pr.new/@orpc/react-query@584
  • @orpc/server: npm i https://pkg.pr.new/@orpc/server@584
  • @orpc/shared: npm i https://pkg.pr.new/@orpc/shared@584
  • @orpc/solid-query: npm i https://pkg.pr.new/@orpc/solid-query@584
  • @orpc/standard-server: npm i https://pkg.pr.new/@orpc/standard-server@584
  • @orpc/standard-server-aws-lambda: npm i https://pkg.pr.new/@orpc/standard-server-aws-lambda@584
  • @orpc/standard-server-fetch: npm i https://pkg.pr.new/@orpc/standard-server-fetch@584
  • @orpc/standard-server-node: npm i https://pkg.pr.new/@orpc/standard-server-node@584
  • @orpc/standard-server-peer: npm i https://pkg.pr.new/@orpc/standard-server-peer@584
  • @orpc/svelte-query: npm i https://pkg.pr.new/@orpc/svelte-query@584
  • @orpc/tanstack-query: npm i https://pkg.pr.new/@orpc/tanstack-query@584
  • @orpc/valibot: npm i https://pkg.pr.new/@orpc/valibot@584
  • @orpc/vue-colada: npm i https://pkg.pr.new/@orpc/vue-colada@584
  • @orpc/vue-query: npm i https://pkg.pr.new/@orpc/vue-query@584
  • @orpc/zod: npm i https://pkg.pr.new/@orpc/zod@584

commit: 3a0c988

@dinwwwh dinwwwh marked this pull request as ready for review June 3, 2025 08:42

@coderabbitai coderabbitai Bot left a comment


Actionable comments posted: 1

🔭 Outside diff range comments (2)
packages/client/src/plugins/batch.ts (1)

142-144: 💡 Verification agent

🧩 Analysis chain

Potential compatibility issue with header cleanup.

The header cleanup logic still checks for 'x-orpc-batch': '1', but the new implementation sets this header to either 'streaming' or 'buffered'. This means requests with the new header values won't be cleaned up by this interceptor.


🏁 Script executed:

#!/bin/bash
# Description: Check if there are other places where x-orpc-batch header cleanup is needed
# Expected: Find if the header cleanup logic needs to be updated for new header values

echo "Checking for x-orpc-batch header cleanup patterns..."
rg "x-orpc-batch.*undefined" --type ts --type js -A 3 -B 3

echo -e "\nSearching for header removal logic..."
rg "headers.*x-orpc-batch" --type ts --type js -A 5 -B 2

Length of output: 2343


Update x-orpc-batch check to cover new modes

The client interceptor currently only bails out when

options.request.headers['x-orpc-batch'] !== '1'

but we now set this header to "streaming" or "buffered". As a result, those requests neither enter the batching logic nor get cleaned up.

– File: packages/client/src/plugins/batch.ts
– Around lines 142–144

Suggested diff:

- if (options.request.headers['x-orpc-batch'] !== '1') {
-   return options.next()
- }
+ const batchMode = options.request.headers['x-orpc-batch']
+ // only skip if header is entirely absent
+ if (batchMode === undefined) {
+   return options.next()
+ }

Or, if you want to explicitly enumerate all supported modes:

- if (options.request.headers['x-orpc-batch'] !== '1') {
+ const h = options.request.headers['x-orpc-batch']
+ if (!['1', 'streaming', 'buffered'].includes(h)) {
     return options.next()
   }

This change will ensure that streaming/buffered batch requests flow through the interceptor and are cleaned up correctly.

packages/standard-server/src/batch/response.ts (1)

58-72: 🛠️ Refactor suggestion

Eliminate code duplication by using the helper function.

The inline minification logic duplicates the minifyResponseItem function defined above.

Apply this diff to use the helper function:

     body: (async function* () {
       try {
         for await (const item of options.body) {
-          yield {
-            index: item.index,
-            status: item.status === options.status ? undefined : item.status,
-            headers: Object.keys(item.headers).length ? item.headers : undefined,
-            body: item.body,
-          } satisfies Partial<BatchResponseBodyItem>
+          yield minifyResponseItem(item)
         }
       }
       finally {
         await options.body.return?.()
       }
     })(),
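
Outside the diff, the deduplicated helper can be illustrated standalone. The field names are assumed from the diff above, and the batch-level default status is passed explicitly here to keep the sketch self-contained:

```typescript
interface BatchResponseBodyItem {
  index: number
  status: number
  headers: Record<string, string>
  body: unknown
}

// Drop fields that match the batch-level defaults so each
// serialized item stays as small as possible.
function minifyResponseItem(
  item: BatchResponseBodyItem,
  batchStatus: number,
): Partial<BatchResponseBodyItem> {
  return {
    index: item.index,
    status: item.status === batchStatus ? undefined : item.status,
    headers: Object.keys(item.headers).length ? item.headers : undefined,
    body: item.body,
  }
}
```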
🧹 Nitpick comments (2)
apps/content/docs/plugins/batch-requests.md (1)

59-81: Excellent addition of batch mode documentation with minor punctuation fixes needed.

The new "Batch Mode" section provides valuable guidance on choosing between streaming and buffered modes. The example code demonstrates practical usage for different environments.

Fix the punctuation issues in line 63:

-If your environment does not support streaming responses such as some serverless platforms or older browsers you can switch to `buffered` mode. In this mode, all responses are collected before being sent together.
+If your environment does not support streaming responses, such as some serverless platforms or older browsers, you can switch to `buffered` mode. In this mode, all responses are collected before being sent together.
🧰 Tools
🪛 LanguageTool

[uncategorized] ~63-~63: A comma might be missing here.
Context: ... environment does not support streaming responses such as some serverless platforms or ol...

(AI_EN_LECTOR_MISSING_PUNCTUATION_COMMA)


[uncategorized] ~63-~63: A comma might be missing here.
Context: ...h as some serverless platforms or older browsers you can switch to buffered mode. In t...

(AI_EN_LECTOR_MISSING_PUNCTUATION_COMMA)

packages/standard-server/src/batch/response.ts (1)

89-94: Remove redundant type assertion.

The as number assertion is unnecessary since the type check already ensures item.index is a number.

           yield {
-            index: item.index as number,
+            index: item.index,
             status: item.status as undefined | number ?? response.status,
             headers: item.headers as undefined | StandardHeaders ?? {},
             body: item.body,
           } satisfies BatchResponseBodyItem
📜 Review details

Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between f16d90e and 2500898.

📒 Files selected for processing (9)
  • apps/content/.vitepress/config.ts (1 hunks)
  • apps/content/docs/comparison.md (1 hunks)
  • apps/content/docs/plugins/batch-requests.md (2 hunks)
  • packages/client/src/plugins/batch.test.ts (2 hunks)
  • packages/client/src/plugins/batch.ts (5 hunks)
  • packages/server/src/plugins/batch.test.ts (15 hunks)
  • packages/server/src/plugins/batch.ts (4 hunks)
  • packages/standard-server/src/batch/response.test.ts (3 hunks)
  • packages/standard-server/src/batch/response.ts (4 hunks)
🧰 Additional context used
🧬 Code Graph Analysis (4)
packages/server/src/plugins/batch.test.ts (4)
packages/standard-server-peer/src/client.ts (1)
  • request (54-100)
packages/standard-server/src/batch/response.ts (1)
  • parseBatchResponse (76-108)
packages/standard-server/src/batch/request.ts (1)
  • toBatchRequest (12-38)
packages/server/src/builder.ts (1)
  • handler (273-280)
packages/standard-server/src/batch/response.test.ts (2)
packages/standard-server/src/batch/response.ts (3)
  • BatchResponseBodyItem (5-7)
  • toBatchResponse (18-74)
  • parseBatchResponse (76-108)
packages/shared/src/iterator.ts (1)
  • isAsyncIteratorObject (4-10)
packages/client/src/plugins/batch.ts (2)
packages/shared/src/value.ts (2)
  • Value (1-1)
  • value (3-12)
packages/client/src/adapters/standard/link.ts (1)
  • StandardLinkClientInterceptorOptions (14-16)
packages/standard-server/src/batch/response.ts (5)
packages/shared/src/index.ts (1)
  • Promisable (15-15)
packages/standard-server/src/types.ts (2)
  • StandardResponse (34-41)
  • StandardHeaders (1-3)
packages/shared/src/iterator.ts (1)
  • isAsyncIteratorObject (4-10)
packages/shared/src/object.ts (1)
  • isObject (31-39)
packages/standard-server-peer/src/server.ts (1)
  • response (81-109)
🪛 LanguageTool
apps/content/docs/plugins/batch-requests.md

[uncategorized] ~63-~63: A comma might be missing here.
Context: ... environment does not support streaming responses such as some serverless platforms or ol...

(AI_EN_LECTOR_MISSING_PUNCTUATION_COMMA)


[uncategorized] ~63-~63: A comma might be missing here.
Context: ...h as some serverless platforms or older browsers you can switch to buffered mode. In t...

(AI_EN_LECTOR_MISSING_PUNCTUATION_COMMA)

🪛 Biome (1.9.4)
packages/standard-server/src/batch/response.ts

[error] 32-52: Promise executor functions should not be async.

(lint/suspicious/noAsyncPromiseExecutor)

⏰ Context from checks skipped due to timeout of 90000ms (2)
  • GitHub Check: publish-commit
  • GitHub Check: lint
🔇 Additional comments (19)
apps/content/docs/comparison.md (1)

37-37: LGTM! Documentation link updated correctly.

The feature name change from "Batch Request/Response" to "Batch Requests" is properly reflected, and the documentation link has been updated to match the new path structure.

apps/content/.vitepress/config.ts (1)

117-117: LGTM! Sidebar navigation updated consistently.

The navigation text and link have been properly updated to reflect the feature renaming, ensuring consistency across the documentation site.

apps/content/docs/plugins/batch-requests.md (1)

2-2: LGTM! Documentation title and description updated appropriately.

The renaming from "Batch Request/Response Plugin" to "Batch Requests Plugin" is consistent, and the simplified description accurately reflects the plugin's functionality.

Also applies to: 6-6, 8-8

packages/standard-server/src/batch/response.test.ts (4)

5-5: Excellent use of parameterized testing for mode coverage.

Using describe.each to test both 'streaming' and 'buffered' modes ensures comprehensive test coverage and reduces code duplication.


10-11: Correct handling of async toBatchResponse calls.

Awaiting the toBatchResponse call is the right approach since it can return either a StandardResponse directly (streaming mode) or a Promise<StandardResponse> (buffered mode).


33-57: Valuable addition of mode-specific tests.

The new test block provides essential coverage for:

  1. Default streaming mode behavior
  2. Error propagation in buffered mode

These tests ensure the function behaves correctly in both success and error scenarios.


70-72: Comprehensive testing of invalid batch handling across modes.

Extending the invalid batch body tests to cover both modes ensures that error handling works consistently regardless of the batch mode.

packages/server/src/plugins/batch.test.ts (2)

42-42: LGTM! Header format updated correctly.

The batch header is properly updated from the legacy '1' format to the new descriptive 'streaming' mode.


168-303: Excellent test coverage for buffered mode!

The new test comprehensively covers the buffered mode functionality with proper assertions for:

  • Correct header propagation ('x-orpc-batch': 'buffered')
  • Interceptor call verification
  • Response parsing and ordering
  • Status and header validation

The test structure mirrors the existing streaming mode test, ensuring consistency.

packages/server/src/plugins/batch.ts (2)

137-140: LGTM! Mode-aware batch response generation.

The implementation correctly:

  • Makes the toBatchResponse call awaitable
  • Dynamically determines the mode based on the header value
  • Defaults to 'streaming' for any non-'buffered' value

This provides a sensible fallback behavior while supporting both modes.
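
That fallback rule can be written as a minimal standalone sketch (a hypothetical helper, not the plugin's actual code):

```typescript
type BatchMode = 'streaming' | 'buffered'

// Only an explicit 'buffered' header selects buffered mode; anything
// else (including the legacy '1') falls back to streaming.
function resolveBatchMode(header: string | undefined): BatchMode {
  return header === 'buffered' ? 'buffered' : 'streaming'
}
```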


73-76:


✅ Verification successful

Verify the header checking logic change.

The header checking logic changed from a strict equality check (=== '1') to using flattenHeader and checking for undefined. This makes the batch detection more permissive - any truthy value in the 'x-orpc-batch' header will now be considered a batch request.

While this aligns with supporting both 'streaming' and 'buffered' values, ensure this doesn't unintentionally process non-batch requests as batch requests.


🏁 Script executed:

#!/bin/bash
# Description: Search for any existing usage of x-orpc-batch header with values other than '1', 'streaming', or 'buffered'
# Expected: No unexpected values should be found

echo "Searching for x-orpc-batch header usage patterns..."
rg -i "x-orpc-batch" --type ts --type js -A 2 -B 2

echo -e "\nChecking for any hardcoded header values..."
rg "'x-orpc-batch':\s*['\"][^'\"]*['\"]" --type ts --type js

Length of output: 12038


Header detection is safe and scoped to expected values

I searched the entire repo for x-orpc-batch and found only three hard-coded values in use—'1', 'streaming', and 'buffered'. The flattenHeader utility only returns undefined when the header is absent; any present value (and we only ever set the three above in our client and tests) correctly routes through the batch logic. No other uses of this header exist, so there’s no risk of unintentional batch processing.

packages/client/src/plugins/batch.test.ts (2)

25-25: Good async handling update.

The toBatchResponse call is now properly awaited, which aligns with the server-side changes where batch response generation can be asynchronous.


118-196: Comprehensive buffered mode test coverage.

The new test properly validates:

  • Buffered mode configuration
  • Correct header setting ('x-orpc-batch': 'buffered')
  • Mock setup with mode-specific response
  • Client call verification with proper parameters

The test structure is consistent with existing patterns and provides thorough coverage.

packages/client/src/plugins/batch.ts (2)

25-30: Well-designed mode option interface.

The new mode option is properly:

  • Typed as a union of valid values ('streaming' | 'buffered')
  • Documented with clear default value
  • Designed as a Value type supporting both static and dynamic resolution

261-264: Perfect dynamic header implementation.

The mode value is properly resolved and used as the batch header value. This creates a clean mapping between the configured mode and the header sent to the server.

packages/standard-server/src/batch/response.ts (4)

1-1: Type system updates look good!

The addition of the Promisable type and updated return type correctly reflect that the function can now return either a direct response (streaming mode) or a Promise (buffered mode).

Also applies to: 18-18


11-16: Well-designed mode parameter!

The optional mode parameter with a default value ensures backward compatibility while enabling the new buffered functionality.


84-86: Excellent error handling with causes!

The use of error causes provides valuable debugging context, making it easier to diagnose issues in production.

Also applies to: 105-107


50-50: Proper resource cleanup implementation!

Consistently calling return() on async iterators in finally blocks ensures resources are properly cleaned up, preventing potential memory leaks.

Also applies to: 70-70, 98-101
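
The cleanup pattern praised here can be illustrated standalone: wrapping iteration in try/finally guarantees the source iterator's return() runs even when the consumer stops early. This is an assumed shape for illustration, not the actual source:

```typescript
let cleanedUp = false

async function* source(): AsyncGenerator<number> {
  try {
    yield 1
    yield 2
    yield 3
  }
  finally {
    // Runs on normal completion, early break, or error.
    cleanedUp = true
  }
}

async function* relay(body: AsyncGenerator<number>): AsyncGenerator<number> {
  try {
    for await (const item of body) {
      yield item
    }
  }
  finally {
    // Propagate cleanup to the underlying iterator.
    await body.return?.(undefined)
  }
}

async function takeOne(): Promise<number | undefined> {
  for await (const item of relay(source())) {
    return item // early exit: relay's finally still fires
  }
  return undefined
}
```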

Comment thread packages/standard-server/src/batch/response.ts

@coderabbitai coderabbitai Bot left a comment


Actionable comments posted: 0

🧹 Nitpick comments (1)
apps/content/docs/plugins/batch-requests.md (1)

63-63: Add missing comma for clarity in buffered description

A comma is needed after "older browsers" to improve readability.

- If your environment does not support streaming responses, such as some serverless platforms or older browsers you can switch to `buffered` mode.
+ If your environment does not support streaming responses, such as some serverless platforms or older browsers, you can switch to `buffered` mode.
🧰 Tools
🪛 LanguageTool

[uncategorized] ~63-~63: A comma might be missing here.
Context: ...h as some serverless platforms or older browsers you can switch to buffered mode. In t...

(AI_EN_LECTOR_MISSING_PUNCTUATION_COMMA)

📜 Review details

Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 2500898 and 8d13257.

📒 Files selected for processing (3)
  • apps/content/docs/plugins/batch-requests.md (2 hunks)
  • packages/standard-server/src/batch/response.test.ts (3 hunks)
  • packages/standard-server/src/batch/response.ts (4 hunks)
🚧 Files skipped from review as they are similar to previous changes (2)
  • packages/standard-server/src/batch/response.test.ts
  • packages/standard-server/src/batch/response.ts
🧰 Additional context used
🪛 LanguageTool
apps/content/docs/plugins/batch-requests.md

[uncategorized] ~63-~63: A comma might be missing here.
Context: ...h as some serverless platforms or older browsers you can switch to buffered mode. In t...

(AI_EN_LECTOR_MISSING_PUNCTUATION_COMMA)

⏰ Context from checks skipped due to timeout of 90000ms (2)
  • GitHub Check: publish-commit
  • GitHub Check: lint
🔇 Additional comments (6)
apps/content/docs/plugins/batch-requests.md (6)

2-2: Approve title update

The title has been correctly changed to "Batch Requests Plugin" to match the new naming convention.


6-6: Approve header update

The main header "# Batch Requests Plugin" aligns with the updated title and plugin naming.


8-8: Approve introductory rephrase

The introduction now concisely describes batching and removes outdated streaming notes.


59-59: Approve addition of "Batch Mode" section

Introducing a "Batch Mode" section clarifies usage of both streaming and buffered modes.


61-61: Approve default streaming description

The explanation of the default streaming mode is clear and accurate.


66-80: Approve buffered mode code snippet

The example correctly shows how to configure mode dynamically based on the runtime environment.

@dinwwwh dinwwwh merged commit f107a0e into main Jun 3, 2025
8 checks passed