👍 Add accumulate() function for automatic RPC batching with parallel execution support #297
Conversation
Walkthrough

Introduces batch accumulation functionality for Denops that queues and automatically batches RPC calls within an execution context.

Changes
Sequence Diagram(s)

sequenceDiagram
participant User
participant accumulate
participant AccHelper as AccumulateHelper
participant nextTick
participant denops.batch as denops.batch()
User->>accumulate: accumulate(denops, executor)
accumulate->>AccHelper: new AccumulateHelper(denops)
accumulate->>User: run executor(helper)
loop while executor schedules calls
User->>AccHelper: helper.call / batch / cmd / eval / dispatch
AccHelper->>AccHelper: enqueue Call
AccHelper->>nextTick: schedule resolve
AccHelper-->>User: Promise<Result>
end
nextTick->>AccHelper: #resolvePendingCalls
AccHelper->>denops.batch: batch(enqueued calls)
denops.batch-->>AccHelper: results[] or BatchError
alt success
AccHelper->>AccHelper: map results -> resolve Promises
AccHelper-->>User: resolved values
else batch error
AccHelper->>AccHelper: construct AccumulateCancelledError or propagate BatchError
AccHelper-->>User: reject Promises
end
User->>accumulate: executor returns
accumulate->>AccHelper: close()
accumulate-->>User: final result
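The queue-then-flush cycle in the diagram can be modeled in a few self-contained lines, using `queueMicrotask` in place of the PR's `nextTick`. The names `enqueue` and `flushed` below are illustrative, not the PR's actual API:

```typescript
// Simplified model of the diagram's queue-then-flush cycle.
// `enqueue` and `flushed` are illustrative names, not the PR's API.
type Call = [fn: string, ...args: unknown[]];

const queue: Array<{ call: Call; resolve: (v: unknown) => void }> = [];
const flushed: Call[][] = []; // records each "RPC batch" that would be sent

function enqueue(call: Call): Promise<unknown> {
  return new Promise((resolve) => {
    queue.push({ call, resolve });
    queueMicrotask(() => {
      // Only the first scheduled flush sees a non-empty queue.
      if (queue.length === 0) return;
      const batch = queue.splice(0);
      flushed.push(batch.map((e) => e.call));
      // Stand-in for denops.batch(): echo the function name back.
      batch.forEach((e) => e.resolve(`result of ${e.call[0]}`));
    });
  });
}

// Two calls issued in the same microtask tick land in a single batch.
const [a, b] = await Promise.all([
  enqueue(["getline", 1]),
  enqueue(["matchstr", "some text", "\\w+"]),
]);
console.log(flushed.length, a, b); // 1 "result of getline" "result of matchstr"
```

Every call schedules a flush, but only the flush that runs first sends anything; the rest see an empty queue, so calls made in the same tick always share one batch.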
Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~60 minutes
Pre-merge checks and finishing touches

❌ Failed checks (1 warning)
✅ Passed checks (2 passed)
Add accumulate() function that automatically batches multiple RPC calls that occur at the same time during microtask processing. This enables parallel RPC execution with automatic batching and proper error handling.
6d1c3c9 to c4da175
Codecov Report

✅ All modified and coverable lines are covered by tests.

@@ Coverage Diff @@
## main #297 +/- ##
==========================================
+ Coverage 84.82% 85.73% +0.91%
==========================================
Files 64 66 +2
Lines 2879 3064 +185
Branches 281 306 +25
==========================================
+ Hits 2442 2627 +185
Misses 435 435
Partials 2 2

☔ View full report in Codecov by Sentry.
Actionable comments posted: 0
🧹 Nitpick comments (3)
batch/accumulate.ts (3)
1-1: Prefer queueMicrotask for portability (Deno-first) over node:process nextTick.

Using node:process couples to Node polyfills and has different ordering vs Promise microtasks. queueMicrotask keeps semantics simple and avoids the Node dependency.

Apply this diff:

    -import { nextTick } from "node:process";
    +// Prefer runtime-agnostic microtask scheduling
    +const schedule = queueMicrotask;
     ...
    -    nextTick(() => {
    +    schedule(() => {
           if (end === this.#calls.length) {
             this.#resolvePendingCalls();
           }
         });

Also applies to: 139-144
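For reference, the ordering this suggestion relies on is that `queueMicrotask` callbacks and promise reactions share one FIFO microtask queue (whereas Node's `process.nextTick` queue drains before promise microtasks, which is the difference the comment points at). A quick self-contained check:

```typescript
// queueMicrotask callbacks and promise reactions share one FIFO
// microtask queue, so they interleave in scheduling order.
const order: string[] = [];
queueMicrotask(() => order.push("task1"));
Promise.resolve().then(() => order.push("then"));
queueMicrotask(() => order.push("task2"));

// Wait one macrotask so all microtasks have drained.
await new Promise((resolve) => setTimeout(resolve, 0));
console.log(order); // ["task1", "then", "task2"]
```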
19-22: Consider std/async deferred() instead of Promise.withResolvers() for broader compatibility.

Some environments lag behind on Promise.withResolvers(). std/async's deferred is stable across Deno versions.

    -  readonly #disposer = Promise.withResolvers<void>();
    +  // Using std/async deferred for compatibility
    +  readonly #disposer = deferred<void>();
     ...
    -  #resolvedWaiter = Promise.withResolvers<void>();
    +  #resolvedWaiter = deferred<void>();
     ...
    -    const { resolve } = this.#resolvedWaiter;
    -    this.#resolvedWaiter = Promise.withResolvers();
    +    const { resolve } = this.#resolvedWaiter;
    +    this.#resolvedWaiter = deferred<void>();

Add import:

    +import { deferred } from "@std/async/deferred";

Also applies to: 160-162
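For context, the deferred pattern that both APIs provide is small enough to inline if neither is available. A minimal sketch (illustrative only, not the std implementation):

```typescript
// Minimal stand-in for Promise.withResolvers() / std/async's deferred():
// a promise with its resolve/reject handles exposed. Illustrative sketch,
// not the std implementation.
interface Deferred<T> {
  promise: Promise<T>;
  resolve: (value: T | PromiseLike<T>) => void;
  reject: (reason?: unknown) => void;
}

function deferred<T>(): Deferred<T> {
  let resolve!: (value: T | PromiseLike<T>) => void;
  let reject!: (reason?: unknown) => void;
  const promise = new Promise<T>((res, rej) => {
    resolve = res;
    reject = rej;
  });
  return { promise, resolve, reject };
}

// Resolve from outside the executor, as #disposer/#resolvedWaiter do.
const d = deferred<number>();
d.resolve(42);
console.log(await d.promise); // 42
```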
84-109: Batch error plumbing is correct; minor message polish (optional).

Current repr uses the first fn and total count; optionally, include the index at which the failure occurred for quicker triage.

    -    const [[fn]] = calls;
    -    const repr = `[['${fn}', ...], ... total ${calls.length} calls]`;
    +    const [[fn]] = calls;
    +    const repr = `[['${fn}', ...], ... total ${calls.length} calls] (failed at index ${errorIndex})`;
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (5)
- batch/accumulate.ts (1 hunks)
- batch/accumulate_test.ts (1 hunks)
- batch/error.ts (1 hunks)
- batch/mod.ts (2 hunks)
- deno.jsonc (1 hunks)
🧰 Additional context used
🧬 Code graph analysis (3)
batch/error.ts (1)
  batch/accumulate.ts (2)
    calls (135-154), calls (169-191)

batch/accumulate_test.ts (3)
  batch/accumulate.ts (5)
    calls (135-154), calls (169-191), accumulate (254-264), batch (84-109), TypeError (127-133)
  batch/collect.ts (1)
    collect (138-151)
  batch/error.ts (1)
    AccumulateCancelledError (22-43)

batch/accumulate.ts (1)
  batch/error.ts (1)
    AccumulateCancelledError (22-43)
🔇 Additional comments (8)
deno.jsonc (1)
47-47: Add std/async alias — LGTM. Alignment with tests (delay) and future use is fine. No action needed.
batch/error.ts (1)
1-43: Error type design looks solid. Clear payload (calls) and stable name via static initializer. Good fit for cancellation signaling.
batch/mod.ts (1)
6-8: Public surface expansion and docs — LGTM. Re-exports and example are coherent with accumulate() usage.
Also applies to: 46-49
batch/accumulate.ts (2)
61-83: Error propagation from call() reads clean. Good separation of error/cancel/unknown cases; cancellation message includes call repr and carries cause/calls.
169-191: Edge-cases handled well in #resolveCalls().
- BatchError path composes results + error + cancels correctly.
- Non-BatchError path maps to “unknown” errors with cause.
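The split the BatchError path has to get right (resolve completed calls, reject the failing one, cancel the rest) can be sketched as a pure function. All names here are hypothetical; this models the behavior the review describes, not the PR's code, and assumes the error carries the results that completed before the failure:

```typescript
// Hypothetical sketch of the BatchError edge case: given the partial
// results that completed before the failure, split the calls into
// resolved / rejected / cancelled groups. Not the PR's implementation.
function settleBatch(
  calls: string[],
  partialResults: unknown[], // results of the calls that completed
  error: Error,
): { resolved: unknown[]; rejectedAt: number; cancelled: string[] } {
  return {
    resolved: partialResults,
    // The first call without a result is the one that failed.
    rejectedAt: partialResults.length,
    // Everything after the failing call never ran and is cancelled.
    cancelled: calls.slice(partialResults.length + 1),
  };
}

const outcome = settleBatch(["getline", "matchstr", "len", "line"], [1, 2], new Error("boom"));
console.log(outcome); // { resolved: [1, 2], rejectedAt: 2, cancelled: ["line"] }
```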
batch/accumulate_test.ts (3)
320-346: Nested accumulate/batch/collect scenarios — great coverage. These steps validate interop and batching boundaries thoroughly.
Also applies to: 347-369, 370-394
425-437: Outside-scope rejection and concurrent disposal behavior — excellent. Covers disposer race, ensures no stray denops.batch calls. Nicely prevents unhandled rejections.
Also applies to: 438-466, 467-495, 496-543
1014-1021: Review comment is incorrect — the code already rejects as intended.

The review misunderstands how `resolvesNext` from `@std/testing/mock` works. When `resolvesNext` receives an Error object in the iterable, it automatically throws/rejects that Error when the stubbed function is awaited.

At lines 1014-1021 and 691-697, `resolvesNext<unknown[]>([underlyingError])` with `underlyingError = new Error("Network error")` is correct and will reject as expected. The test assertions (e.g., `assertRejects(() => actual, Error, "Network error")` at line 1032) confirm the intended rejection behavior occurs.

The pattern difference at line 730 (using `() => Promise.reject(underlyingError)` instead) exists because that test uses a non-Error value (42), which `resolvesNext` would not automatically reject.

No changes are needed.

Likely an incorrect or invalid review comment.
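The contract described above (Error instances in the iterable reject, other values resolve) can be modeled with a tiny stub. This is a sketch of `resolvesNext`'s documented behavior, not its real implementation:

```typescript
// Sketch of the resolvesNext() contract from @std/testing/mock:
// Error instances in the iterable reject; other values resolve.
// Not the real implementation.
function resolvesNextLike<T>(values: Iterable<T | Error>): () => Promise<T> {
  const iter = values[Symbol.iterator]();
  return () => {
    const { value, done } = iter.next();
    if (done) return Promise.reject(new Error("no more values"));
    return value instanceof Error
      ? Promise.reject(value) // e.g. new Error("Network error") rejects
      : Promise.resolve(value as T);
  };
}

const stub = resolvesNextLike<number>([1, new Error("Network error"), 42]);
console.log(await stub()); // 1
await stub().catch((e) => console.log(e.message)); // Network error
console.log(await stub()); // 42
```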
How should we distinguish this from the existing ones?
Is it something that replaces them, or is it meant to be used in a different context?
If it’s something that replaces the existing ones, we don’t need to delete them, but I’d like to mark them as deprecated.
Summary
- Add `accumulate()` function to the `batch` module that automatically batches multiple RPC calls
- Enable parallel execution with `Promise.all()` while maintaining automatic batching
- Add `AccumulateCancelledError` for proper error handling in parallel execution scenarios

Background
The existing `batch()` and `collect()` functions require sequential execution and manual batching. The new `accumulate()` function enables more natural async/await patterns with automatic batching during microtask processing.

This function was originally developed in a separate repository and has been published on jsr.io with proven usage in various Denops plugins. Given its maturity and utility, we're now integrating it into the standard library.
Example Usage
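The pattern can be sketched end-to-end with a toy `accumulate()` and a fake `denops` whose `batch()` simply records what it receives. All implementations below are illustrative stand-ins that mirror the PR's API shape, not the library code:

```typescript
// Toy accumulate() + fake denops: each await boundary flushes one batch,
// so this example issues exactly three RPCs. Illustrative stand-ins only.
type Call = [fn: string, ...args: unknown[]];

class FakeDenops {
  batches: Call[][] = [];
  async batch(...calls: Call[]): Promise<unknown[]> {
    this.batches.push(calls);
    return calls.map(([fn, ...args]) => {
      if (fn === "getline") return `line ${args[0]}`;
      if (fn === "matchstr") return String(args[0]).split(" ")[0];
      if (fn === "len") return String(args[0]).length;
      return null;
    });
  }
}

async function accumulate<T>(
  denops: FakeDenops,
  executor: (helper: { call(fn: string, ...args: unknown[]): Promise<unknown> }) => Promise<T>,
): Promise<T> {
  let queue: Array<{ call: Call; resolve: (v: unknown) => void }> = [];
  const helper = {
    call(fn: string, ...args: unknown[]): Promise<unknown> {
      return new Promise((resolve) => {
        queue.push({ call: [fn, ...args], resolve });
        queueMicrotask(async () => {
          if (queue.length === 0) return; // already flushed this tick
          const pending = queue;
          queue = [];
          const results = await denops.batch(...pending.map((e) => e.call));
          pending.forEach((e, i) => e.resolve(results[i]));
        });
      });
    },
  };
  return await executor(helper);
}

const denops = new FakeDenops();
const lengths = await accumulate(denops, async (helper) => {
  const lines = await Promise.all(
    [1, 2, 3].map((n) => helper.call("getline", n)), // batch 1: getline x3
  );
  const words = await Promise.all(
    lines.map((l) => helper.call("matchstr", l, "\\w+")), // batch 2: matchstr x3
  );
  return await Promise.all(
    words.map((w) => helper.call("len", w)), // batch 3: len x3
  );
});
console.log(denops.batches.length); // 3
```

Each stage's calls are created synchronously inside `Promise.all(...)`, so they share a microtask tick and collapse into one RPC per stage.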
In this example, the following 3 RPCs are called:

- `getline` calls in one RPC
- `matchstr` calls in one RPC
- `len` calls in one RPC

Testing
Breaking Changes
None. This is a purely additive change.
Related
Additional Context
- `gather()` was suggested by AI but rejected due to its use in older versions
- `accumulate()` is already established in some plugin libraries

Summary by CodeRabbit
New Features
Tests
Chores