feat(client, server): support buffered mode in batch plugin#584
Conversation
Walkthrough

This change introduces a new "buffered" mode for batch request/response handling, alongside the existing "streaming" mode. It updates client and server plugins, batch response utilities, and related tests to support and validate both modes. Documentation and navigation are revised to reflect the new naming and functionality.

Changes
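As a rough illustration of the distinction the walkthrough describes, a buffered batch collects every response item into a single payload before anything is sent, while a streaming batch emits items one at a time. The sketch below is illustrative only (these are not the PR's actual APIs); sync generators are used purely to keep it self-contained.

```typescript
// Illustrative sketch, not the PR's API: the shape of one batch item.
interface Item { index: number; body: string }

// Buffered: collect all items into one array, serialized only after
// every item is ready.
function buffered(items: Iterable<Item>): Item[] {
  return [...items]
}

// Streaming: emit each item as soon as it completes.
function* streaming(items: Iterable<Item>): Generator<Item> {
  for (const item of items) yield item
}

const input: Item[] = [
  { index: 0, body: 'a' },
  { index: 1, body: 'b' },
]
const bufferedBody = buffered(input)
const streamedBody = [...streaming(input)]
```

Both paths deliver the same items; the difference is purely when they become available to the consumer.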
Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant Client
    participant BatchLinkPlugin
    participant Server
    participant BatchHandlerPlugin
    participant BatchResponseUtil
    Client->>BatchLinkPlugin: Send batched requests (mode: streaming/buffered)
    BatchLinkPlugin->>Server: POST /batch with x-orpc-batch header
    Server->>BatchHandlerPlugin: Intercept batch request
    BatchHandlerPlugin->>BatchResponseUtil: toBatchResponse({ mode })
    BatchResponseUtil-->>BatchHandlerPlugin: Streaming or buffered response
    BatchHandlerPlugin->>Server: Return batch response
    Server->>BatchLinkPlugin: Batch response (streaming or buffered)
    BatchLinkPlugin->>Client: Deliver individual responses
```
Possibly related PRs
Codecov Report

All modified and coverable lines are covered by tests ✅
@orpc/arktype
@orpc/client
@orpc/contract
@orpc/hey-api
@orpc/nest
@orpc/openapi
@orpc/openapi-client
@orpc/react
@orpc/react-query
@orpc/server
@orpc/shared
@orpc/solid-query
@orpc/standard-server
@orpc/standard-server-aws-lambda
@orpc/standard-server-fetch
@orpc/standard-server-node
@orpc/standard-server-peer
@orpc/svelte-query
@orpc/tanstack-query
@orpc/valibot
@orpc/vue-colada
@orpc/vue-query
@orpc/zod
commit:
Actionable comments posted: 1
🔭 Outside diff range comments (2)
packages/client/src/plugins/batch.ts (1)
142-144: 💡 Verification agent 🧩 Analysis chain

Potential compatibility issue with header cleanup.

The header cleanup logic still checks for `'x-orpc-batch': '1'`, but the new implementation sets this header to either `'streaming'` or `'buffered'`. This means requests with the new header values won't be cleaned up by this interceptor.
🏁 Script executed:

```bash
#!/bin/bash
# Description: Check if there are other places where x-orpc-batch header cleanup is needed
# Expected: Find if the header cleanup logic needs to be updated for new header values

echo "Checking for x-orpc-batch header cleanup patterns..."
rg "x-orpc-batch.*undefined" --type ts --type js -A 3 -B 3

echo -e "\nSearching for header removal logic..."
rg "headers.*x-orpc-batch" --type ts --type js -A 5 -B 2
```

Length of output: 2343
Update `x-orpc-batch` check to cover new modes

The client interceptor currently only bails out when `options.request.headers['x-orpc-batch'] !== '1'`, but we now set this header to `"streaming"` or `"buffered"`. As a result, those requests neither enter the batching logic nor get cleaned up.

– File: packages/client/src/plugins/batch.ts
– Around lines 142–144

Suggested diff:
```diff
- if (options.request.headers['x-orpc-batch'] !== '1') {
-   return options.next()
- }
+ const batchMode = options.request.headers['x-orpc-batch']
+ // only skip if header is entirely absent
+ if (batchMode === undefined) {
+   return options.next()
+ }
```

Or, if you want to explicitly enumerate all supported modes:
```diff
- if (options.request.headers['x-orpc-batch'] !== '1') {
+ const h = options.request.headers['x-orpc-batch']
+ if (!['1', 'streaming', 'buffered'].includes(h)) {
    return options.next()
  }
```

This change will ensure that streaming/buffered batch requests flow through the interceptor and are cleaned up correctly.
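The reviewer's second suggestion can be condensed into a small predicate. The sketch below is illustrative, not code from the PR: `isBatchRequest` and `BATCH_HEADER_VALUES` are hypothetical names, but the value set ('1', 'streaming', 'buffered') matches the three header values discussed in this review.

```typescript
// Illustrative helper: treat only the known header values as batch
// requests. `isBatchRequest` is a hypothetical name, not PR code.
const BATCH_HEADER_VALUES: readonly string[] = ['1', 'streaming', 'buffered']

function isBatchRequest(headers: Record<string, string | undefined>): boolean {
  const h = headers['x-orpc-batch']
  return h !== undefined && BATCH_HEADER_VALUES.includes(h)
}
```

Enumerating the values explicitly, rather than only checking for presence, guards against future header values silently entering the batch path.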
packages/standard-server/src/batch/response.ts (1)
58-72: 🛠️ Refactor suggestion

Eliminate code duplication by using the helper function.

The inline minification logic duplicates the `minifyResponseItem` function defined above.

Apply this diff to use the helper function:

```diff
  body: (async function* () {
    try {
      for await (const item of options.body) {
-       yield {
-         index: item.index,
-         status: item.status === options.status ? undefined : item.status,
-         headers: Object.keys(item.headers).length ? item.headers : undefined,
-         body: item.body,
-       } satisfies Partial<BatchResponseBodyItem>
+       yield minifyResponseItem(item)
      }
    } finally {
      await options.body.return?.()
    }
  })(),
```
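For context, a `minifyResponseItem`-style helper drops fields that match defaults so each serialized batch item stays small. The sketch below follows the field logic visible in the diff above; `minifyItem` and the `BatchItem` interface are illustrative stand-ins, and the actual signature in the PR may differ.

```typescript
// Illustrative sketch of the minification logic from the diff above:
// omit status when it equals the batch-level default, and omit headers
// when empty, so the serialized item carries only meaningful fields.
interface BatchItem {
  index: number
  status: number
  headers: Record<string, string>
  body: unknown
}

function minifyItem(item: BatchItem, defaultStatus: number) {
  return {
    index: item.index,
    status: item.status === defaultStatus ? undefined : item.status,
    headers: Object.keys(item.headers).length ? item.headers : undefined,
    body: item.body,
  }
}

// A 200 item with empty headers minifies down to just index + body.
const minified = minifyItem({ index: 0, status: 200, headers: {}, body: 'ok' }, 200)
```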
🧹 Nitpick comments (2)
apps/content/docs/plugins/batch-requests.md (1)
59-81: Excellent addition of batch mode documentation with minor punctuation fixes needed.

The new "Batch Mode" section provides valuable guidance on choosing between streaming and buffered modes. The example code demonstrates practical usage for different environments.

Fix the punctuation issues in line 63:

```diff
-If your environment does not support streaming responses such as some serverless platforms or older browsers you can switch to `buffered` mode. In this mode, all responses are collected before being sent together.
+If your environment does not support streaming responses, such as some serverless platforms or older browsers, you can switch to `buffered` mode. In this mode, all responses are collected before being sent together.
```

🧰 Tools
🪛 LanguageTool

[uncategorized] ~63-~63: A comma might be missing here.
Context: ... environment does not support streaming responses such as some serverless platforms or ol... (AI_EN_LECTOR_MISSING_PUNCTUATION_COMMA)

[uncategorized] ~63-~63: A comma might be missing here.
Context: ...h as some serverless platforms or older browsers you can switch to `buffered` mode. In t... (AI_EN_LECTOR_MISSING_PUNCTUATION_COMMA)
packages/standard-server/src/batch/response.ts (1)
89-94: Remove redundant type assertion.

The `as number` assertion is unnecessary since the type check already ensures `item.index` is a number.

```diff
  yield {
-   index: item.index as number,
+   index: item.index,
    status: item.status as undefined | number ?? response.status,
    headers: item.headers as undefined | StandardHeaders ?? {},
    body: item.body,
  } satisfies BatchResponseBodyItem
```
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (9)
- apps/content/.vitepress/config.ts (1 hunks)
- apps/content/docs/comparison.md (1 hunks)
- apps/content/docs/plugins/batch-requests.md (2 hunks)
- packages/client/src/plugins/batch.test.ts (2 hunks)
- packages/client/src/plugins/batch.ts (5 hunks)
- packages/server/src/plugins/batch.test.ts (15 hunks)
- packages/server/src/plugins/batch.ts (4 hunks)
- packages/standard-server/src/batch/response.test.ts (3 hunks)
- packages/standard-server/src/batch/response.ts (4 hunks)
🧰 Additional context used
🧬 Code Graph Analysis (4)
packages/server/src/plugins/batch.test.ts (4)
- packages/standard-server-peer/src/client.ts (1): `request` (54-100)
- packages/standard-server/src/batch/response.ts (1): `parseBatchResponse` (76-108)
- packages/standard-server/src/batch/request.ts (1): `toBatchRequest` (12-38)
- packages/server/src/builder.ts (1): `handler` (273-280)

packages/standard-server/src/batch/response.test.ts (2)
- packages/standard-server/src/batch/response.ts (3): `BatchResponseBodyItem` (5-7), `toBatchResponse` (18-74), `parseBatchResponse` (76-108)
- packages/shared/src/iterator.ts (1): `isAsyncIteratorObject` (4-10)

packages/client/src/plugins/batch.ts (2)
- packages/shared/src/value.ts (2): `Value` (1-1), `value` (3-12)
- packages/client/src/adapters/standard/link.ts (1): `StandardLinkClientInterceptorOptions` (14-16)

packages/standard-server/src/batch/response.ts (5)
- packages/shared/src/index.ts (1): `Promisable` (15-15)
- packages/standard-server/src/types.ts (2): `StandardResponse` (34-41), `StandardHeaders` (1-3)
- packages/shared/src/iterator.ts (1): `isAsyncIteratorObject` (4-10)
- packages/shared/src/object.ts (1): `isObject` (31-39)
- packages/standard-server-peer/src/server.ts (1): `response` (81-109)
🪛 Biome (1.9.4)
packages/standard-server/src/batch/response.ts
[error] 32-52: Promise executor functions should not be async.
(lint/suspicious/noAsyncPromiseExecutor)
⏰ Context from checks skipped due to timeout of 90000ms (2)
- GitHub Check: publish-commit
- GitHub Check: lint
🔇 Additional comments (19)
apps/content/docs/comparison.md (1)
37-37: LGTM! Documentation link updated correctly.

The feature name change from "Batch Request/Response" to "Batch Requests" is properly reflected, and the documentation link has been updated to match the new path structure.
apps/content/.vitepress/config.ts (1)
117-117: LGTM! Sidebar navigation updated consistently.

The navigation text and link have been properly updated to reflect the feature renaming, ensuring consistency across the documentation site.
apps/content/docs/plugins/batch-requests.md (1)
2-2: LGTM! Documentation title and description updated appropriately.

The renaming from "Batch Request/Response Plugin" to "Batch Requests Plugin" is consistent, and the simplified description accurately reflects the plugin's functionality.
Also applies to: 6-6, 8-8
packages/standard-server/src/batch/response.test.ts (4)
5-5: Excellent use of parameterized testing for mode coverage.

Using `describe.each` to test both 'streaming' and 'buffered' modes ensures comprehensive test coverage and reduces code duplication.

10-11: Correct handling of async toBatchResponse calls.

Awaiting the `toBatchResponse` call is the right approach since it can return either a `StandardResponse` directly (streaming mode) or a `Promise<StandardResponse>` (buffered mode).
33-57: Valuable addition of mode-specific tests.

The new test block provides essential coverage for:
- Default streaming mode behavior
- Error propagation in buffered mode

These tests ensure the function behaves correctly in both success and error scenarios.

70-72: Comprehensive testing of invalid batch handling across modes.

Extending the invalid batch body tests to cover both modes ensures that error handling works consistently regardless of the batch mode.
packages/server/src/plugins/batch.test.ts (2)
42-42: LGTM! Header format updated correctly.

The batch header is properly updated from the legacy `'1'` format to the new descriptive `'streaming'` mode.

168-303: Excellent test coverage for buffered mode!

The new test comprehensively covers the buffered mode functionality with proper assertions for:
- Correct header propagation (`'x-orpc-batch': 'buffered'`)
- Interceptor call verification
- Response parsing and ordering
- Status and header validation

The test structure mirrors the existing streaming mode test, ensuring consistency.
packages/server/src/plugins/batch.ts (2)
137-140: LGTM! Mode-aware batch response generation.

The implementation correctly:
- Makes the `toBatchResponse` call awaitable
- Dynamically determines the mode based on the header value
- Defaults to `'streaming'` for any non-`'buffered'` value

This provides a sensible fallback behavior while supporting both modes.
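The fallback described above amounts to a tiny pure function. This is a sketch of that behavior as the review describes it, not the PR's actual code; `resolveServerMode` is a hypothetical name.

```typescript
// Illustrative sketch of the server-side fallback described above:
// any header value other than 'buffered' resolves to 'streaming'.
type BatchMode = 'streaming' | 'buffered'

function resolveServerMode(header: string | undefined): BatchMode {
  return header === 'buffered' ? 'buffered' : 'streaming'
}
```

Defaulting to streaming for unknown values (including the legacy `'1'`) keeps older clients working without a version check.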
73-76: Details

✅ Verification successful

Verify the header checking logic change.

The header checking logic changed from a strict equality check (`=== '1'`) to using `flattenHeader` and checking for `undefined`. This makes the batch detection more permissive: any truthy value in the `'x-orpc-batch'` header will now be considered a batch request.

While this aligns with supporting both `'streaming'` and `'buffered'` values, ensure this doesn't unintentionally process non-batch requests as batch requests.

🏁 Script executed:

```bash
#!/bin/bash
# Description: Search for any existing usage of x-orpc-batch header with values other than '1', 'streaming', or 'buffered'
# Expected: No unexpected values should be found

echo "Searching for x-orpc-batch header usage patterns..."
rg -i "x-orpc-batch" --type ts --type js -A 2 -B 2

echo -e "\nChecking for any hardcoded header values..."
rg "'x-orpc-batch':\s*['\"][^'\"]*['\"]" --type ts --type js
```

Length of output: 12038

Header detection is safe and scoped to expected values

I searched the entire repo for `x-orpc-batch` and found only three hard-coded values in use: `'1'`, `'streaming'`, and `'buffered'`. The `flattenHeader` utility only returns `undefined` when the header is absent; any present value (and we only ever set the three above in our client and tests) correctly routes through the batch logic. No other uses of this header exist, so there's no risk of unintentional batch processing.

packages/client/src/plugins/batch.test.ts (2)
25-25: Good async handling update.

The `toBatchResponse` call is now properly awaited, which aligns with the server-side changes where batch response generation can be asynchronous.

118-196: Comprehensive buffered mode test coverage.

The new test properly validates:
- Buffered mode configuration
- Correct header setting (`'x-orpc-batch': 'buffered'`)
- Mock setup with mode-specific response
- Client call verification with proper parameters

The test structure is consistent with existing patterns and provides thorough coverage.
packages/client/src/plugins/batch.ts (2)
25-30: Well-designed mode option interface.

The new `mode` option is properly:
- Typed as a union of valid values (`'streaming' | 'buffered'`)
- Documented with clear default value
- Designed as a `Value` type supporting both static and dynamic resolution

261-264: Perfect dynamic header implementation.

The mode value is properly resolved and used as the batch header value. This creates a clean mapping between the configured mode and the header sent to the server.
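The static-or-function pattern praised above can be sketched in a few lines. `Value` and `resolveValue` here are simplified stand-ins for the library's `Value`/`value` utilities referenced in this review, not their actual definitions.

```typescript
// Simplified sketch of the "Value" pattern: an option may be a constant
// or a function producing one, resolved at call time.
type Value<T> = T | (() => T)

function resolveValue<T>(v: Value<T>): T {
  return typeof v === 'function' ? (v as () => T)() : v
}

// A mode option can then be given statically or computed dynamically.
const staticMode = resolveValue<'streaming' | 'buffered'>('buffered')
const dynamicMode = resolveValue<'streaming' | 'buffered'>(() => 'streaming')
```

This lets callers pick the batch mode per environment (for example, returning 'buffered' on platforms without streaming support) without the plugin needing a separate callback option.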
packages/standard-server/src/batch/response.ts (4)
1-1: Type system updates look good!

The addition of the `Promisable` type and updated return type correctly reflect that the function can now return either a direct response (streaming mode) or a Promise (buffered mode).

Also applies to: 18-18

11-16: Well-designed mode parameter!

The optional `mode` parameter with a default value ensures backward compatibility while enabling the new buffered functionality.

84-86: Excellent error handling with causes!

The use of error causes provides valuable debugging context, making it easier to diagnose issues in production.

Also applies to: 105-107

50-50: Proper resource cleanup implementation!

Consistently calling `return()` on async iterators in `finally` blocks ensures resources are properly cleaned up, preventing potential memory leaks.

Also applies to: 70-70, 98-101
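The cleanup pattern praised above is easiest to demonstrate with synchronous generators, whose `return()` semantics mirror the async case: a `finally` around the loop guarantees the wrapped iterator is closed even when the consumer exits early. This is a simplified sketch of the idea, not the PR's async implementation.

```typescript
let sourceClosed = false

function* source(): Generator<number> {
  try {
    yield 1
    yield 2
    yield 3
  } finally {
    sourceClosed = true // runs when the iterator is closed early
  }
}

function* wrap(src: Generator<number>): Generator<number> {
  try {
    yield* src
  } finally {
    src.return(undefined) // mirrors `await options.body.return?.()` in the PR
  }
}

// Consume only the first item, then break: both finally blocks still run.
const received: number[] = []
for (const v of wrap(source())) {
  received.push(v)
  break
}
```

Without the explicit `return()` in `finally`, a wrapper that is abandoned mid-iteration could leave the underlying source open, which for async sources translates into leaked connections or memory.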
Actionable comments posted: 0
🧹 Nitpick comments (1)
apps/content/docs/plugins/batch-requests.md (1)
63-63: Add missing comma for clarity in buffered description

A comma is needed after "older browsers" to improve readability.

```diff
- If your environment does not support streaming responses, such as some serverless platforms or older browsers you can switch to `buffered` mode.
+ If your environment does not support streaming responses, such as some serverless platforms or older browsers, you can switch to `buffered` mode.
```
🪛 LanguageTool
[uncategorized] ~63-~63: A comma might be missing here.
Context: ...h as some serverless platforms or older browsers you can switch tobufferedmode. In t...(AI_EN_LECTOR_MISSING_PUNCTUATION_COMMA)
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (3)
- apps/content/docs/plugins/batch-requests.md (2 hunks)
- packages/standard-server/src/batch/response.test.ts (3 hunks)
- packages/standard-server/src/batch/response.ts (4 hunks)
🚧 Files skipped from review as they are similar to previous changes (2)
- packages/standard-server/src/batch/response.test.ts
- packages/standard-server/src/batch/response.ts
⏰ Context from checks skipped due to timeout of 90000ms (2)
- GitHub Check: publish-commit
- GitHub Check: lint
🔇 Additional comments (6)
apps/content/docs/plugins/batch-requests.md (6)
2-2: Approve title update

The title has been correctly changed to "Batch Requests Plugin" to match the new naming convention.

6-6: Approve header update

The main header "# Batch Requests Plugin" aligns with the updated title and plugin naming.

8-8: Approve introductory rephrase

The introduction now concisely describes batching and removes outdated streaming notes.

59-59: Approve addition of "Batch Mode" section

Introducing a "Batch Mode" section clarifies usage of both streaming and buffered modes.

61-61: Approve default streaming description

The explanation of the default `streaming` mode is clear and accurate.

66-80: Approve buffered mode code snippet

The example correctly shows how to configure `mode` dynamically based on the runtime environment.
Summary by CodeRabbit
New Features
Bug Fixes
Tests
Documentation