
Paginate writeToStreamMulti to respect server's chunk limit#1626

Merged
VaguelySerious merged 3 commits into main from peter/max-chunks-per-flush
Apr 6, 2026

Conversation

@VaguelySerious
Member

@VaguelySerious VaguelySerious commented Apr 6, 2026

Summary

  • Adds MAX_CHUNKS_PER_REQUEST = 1000 in world-vercel, matching the server's MAX_CHUNKS_PER_BATCH
  • writeToStreamMulti now paginates into multiple HTTP requests when the chunk count exceeds this limit
  • Keeps the fix in the transport layer (world-vercel), where the server constraint applies, rather than in core
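The pagination described above can be sketched as follows. This is a minimal illustration, not the actual world-vercel implementation: the `Chunk` type and the `sendBatch` callback are stand-ins for the real HTTP request path, and only `MAX_CHUNKS_PER_REQUEST = 1000` comes from the PR itself.

```typescript
// Matches the server's MAX_CHUNKS_PER_BATCH, per the PR description.
const MAX_CHUNKS_PER_REQUEST = 1000;

type Chunk = Uint8Array; // stand-in for the real chunk type

// Issue one request per sub-batch of at most MAX_CHUNKS_PER_REQUEST chunks,
// so a single oversized write never produces a 400 from the server.
async function writeChunksPaginated(
  chunks: Chunk[],
  sendBatch: (batch: Chunk[]) => Promise<void>,
): Promise<void> {
  for (let i = 0; i < chunks.length; i += MAX_CHUNKS_PER_REQUEST) {
    await sendBatch(chunks.slice(i, i + MAX_CHUNKS_PER_REQUEST));
  }
}
```

For example, 2500 chunks would go out as three requests of 1000, 1000, and 500 chunks, which matches the "pagination into 2 requests / 3 requests" cases in the test plan.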

Test plan

  • New tests: single request under limit, pagination into 2 requests, pagination into 3 requests
  • All existing encodeMultiChunks tests pass (13/13)
  • Core WorkflowServerWritableStream tests unaffected (18/18)
  • Biome lint clean
  • E2E tests in CI

🤖 Generated with Claude Code

Prevent 400 errors when users write more than 1000 chunks within a
flush interval. The server rejects batches exceeding MAX_CHUNKS_PER_BATCH
(1000 chunks).

Two safety mechanisms:
- scheduleFlush triggers immediate flush when buffer hits the limit
- flush splits oversized buffers into sub-batches of MAX_CHUNKS_PER_FLUSH
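A rough sketch of how those two mechanisms fit together, assuming a buffered writer with a timer-based flush. The `BufferedWriter` class, `send` callback, and `FLUSH_INTERVAL_MS` value are illustrative assumptions; only the constant limit and the two behaviors (immediate flush at the limit, sub-batch splitting in flush) come from the commit message.

```typescript
const MAX_CHUNKS_PER_FLUSH = 1000; // server's batch limit
const FLUSH_INTERVAL_MS = 50; // assumed flush cadence, not from the PR

class BufferedWriter {
  private buffer: Uint8Array[] = [];
  private timer: ReturnType<typeof setTimeout> | null = null;

  constructor(private send: (batch: Uint8Array[]) => Promise<void>) {}

  write(chunk: Uint8Array): void {
    this.buffer.push(chunk);
    // Mechanism 1: flush immediately once the buffer hits the limit,
    // instead of waiting out the flush interval.
    if (this.buffer.length >= MAX_CHUNKS_PER_FLUSH) {
      void this.flush();
    } else {
      this.scheduleFlush();
    }
  }

  private scheduleFlush(): void {
    if (this.timer === null) {
      this.timer = setTimeout(() => void this.flush(), FLUSH_INTERVAL_MS);
    }
  }

  async flush(): Promise<void> {
    if (this.timer !== null) {
      clearTimeout(this.timer);
      this.timer = null;
    }
    const pending = this.buffer;
    this.buffer = [];
    // Mechanism 2: split an oversized buffer into sub-batches the
    // server will accept, one request each.
    for (let i = 0; i < pending.length; i += MAX_CHUNKS_PER_FLUSH) {
      await this.send(pending.slice(i, i + MAX_CHUNKS_PER_FLUSH));
    }
  }
}
```

Together the two mechanisms are belt-and-braces: the early flush keeps the buffer from normally exceeding the limit, and the split in flush guarantees no single request exceeds it even if the buffer does.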

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
@VaguelySerious VaguelySerious requested a review from a team as a code owner April 6, 2026 22:28
@vercel
Contributor

vercel bot commented Apr 6, 2026

The latest updates on your projects. Learn more about Vercel for GitHub.

Project Deployment Actions Updated (UTC)
example-nextjs-workflow-turbopack Ready Ready Preview, Comment Apr 6, 2026 10:55pm
example-nextjs-workflow-webpack Ready Ready Preview, Comment Apr 6, 2026 10:55pm
example-workflow Ready Ready Preview, Comment Apr 6, 2026 10:55pm
workbench-astro-workflow Ready Ready Preview, Comment Apr 6, 2026 10:55pm
workbench-express-workflow Ready Ready Preview, Comment Apr 6, 2026 10:55pm
workbench-fastify-workflow Ready Ready Preview, Comment Apr 6, 2026 10:55pm
workbench-hono-workflow Ready Ready Preview, Comment Apr 6, 2026 10:55pm
workbench-nitro-workflow Ready Ready Preview, Comment Apr 6, 2026 10:55pm
workbench-nuxt-workflow Ready Ready Preview, Comment Apr 6, 2026 10:55pm
workbench-sveltekit-workflow Ready Ready Preview, Comment Apr 6, 2026 10:55pm
workbench-vite-workflow Ready Ready Preview, Comment Apr 6, 2026 10:55pm
workflow-docs Ready Ready Preview, Comment, Open in v0 Apr 6, 2026 10:55pm
workflow-swc-playground Ready Ready Preview, Comment Apr 6, 2026 10:55pm

@changeset-bot

changeset-bot bot commented Apr 6, 2026

🦋 Changeset detected

Latest commit: 054e325

The changes in this PR will be included in the next version bump.

This PR includes changesets to release 17 packages
Name Type
@workflow/world-vercel Patch
@workflow/cli Patch
@workflow/core Patch
workflow Patch
@workflow/world-testing Patch
@workflow/builders Patch
@workflow/next Patch
@workflow/nitro Patch
@workflow/vitest Patch
@workflow/web-shared Patch
@workflow/ai Patch
@workflow/astro Patch
@workflow/nest Patch
@workflow/rollup Patch
@workflow/sveltekit Patch
@workflow/vite Patch
@workflow/nuxt Patch


@github-actions
Contributor

github-actions bot commented Apr 6, 2026

📊 Benchmark Results

📈 Comparing against baseline from main branch. Green 🟢 = faster, Red 🔺 = slower.

workflow with no steps

💻 Local Development

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
💻 Local 🥇 Express 0.042s (-1.6%) 1.006s (~) 0.963s 10 1.00x
💻 Local Nitro 0.044s (+3.3%) 1.005s (~) 0.961s 10 1.05x
💻 Local Next.js (Turbopack) 0.049s 1.005s 0.957s 10 1.16x
🐘 Postgres Next.js (Turbopack) 0.054s 1.009s 0.955s 10 1.28x
🌐 Redis Next.js (Turbopack) 0.054s 1.006s 0.951s 10 1.29x
🐘 Postgres Nitro 0.057s (-2.2%) 1.009s (~) 0.951s 10 1.36x
🐘 Postgres Express 0.061s (+0.7%) 1.010s (~) 0.949s 10 1.44x
🌐 MongoDB Next.js (Turbopack) 0.105s 1.007s 0.902s 10 2.51x

▲ Production (Vercel)

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Nitro 0.197s (-17.1% 🟢) 1.825s (-20.0% 🟢) 1.628s 10 1.00x
▲ Vercel Next.js (Turbopack) 0.245s (-50.1% 🟢) 2.053s (-20.4% 🟢) 1.808s 10 1.24x
▲ Vercel Express ⚠️ missing - - - -

🔍 Observability: Nitro | Next.js (Turbopack)

workflow with 1 step

💻 Local Development

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
💻 Local 🥇 Next.js (Turbopack) 1.111s 2.006s 0.895s 10 1.00x
💻 Local Nitro 1.127s (~) 2.005s (~) 0.879s 10 1.01x
💻 Local Express 1.132s (~) 2.007s (~) 0.874s 10 1.02x
🌐 Redis Next.js (Turbopack) 1.132s 2.006s 0.874s 10 1.02x
🐘 Postgres Express 1.133s (-1.6%) 2.008s (~) 0.875s 10 1.02x
🐘 Postgres Next.js (Turbopack) 1.134s 2.007s 0.874s 10 1.02x
🐘 Postgres Nitro 1.145s (+0.6%) 2.010s (~) 0.865s 10 1.03x
🌐 MongoDB Next.js (Turbopack) 1.311s 2.008s 0.697s 10 1.18x

▲ Production (Vercel)

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Nitro 1.777s (-35.3% 🟢) 3.690s (-19.9% 🟢) 1.913s 10 1.00x
▲ Vercel Next.js (Turbopack) 1.984s (+7.0% 🔺) 3.206s (-14.1% 🟢) 1.223s 10 1.12x
▲ Vercel Express ⚠️ missing - - - -

🔍 Observability: Nitro | Next.js (Turbopack)

workflow with 10 sequential steps

💻 Local Development

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
💻 Local 🥇 Next.js (Turbopack) 10.796s 11.023s 0.227s 3 1.00x
🌐 Redis Next.js (Turbopack) 10.805s 11.023s 0.218s 3 1.00x
🐘 Postgres Next.js (Turbopack) 10.805s 11.018s 0.212s 3 1.00x
🐘 Postgres Nitro 10.871s (~) 11.019s (~) 0.148s 3 1.01x
🐘 Postgres Express 10.900s (~) 11.015s (~) 0.115s 3 1.01x
💻 Local Nitro 10.909s (~) 11.023s (~) 0.114s 3 1.01x
💻 Local Express 10.931s (~) 11.023s (~) 0.093s 3 1.01x
🌐 MongoDB Next.js (Turbopack) 12.278s 13.023s 0.745s 3 1.14x

▲ Production (Vercel)

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Nitro 16.225s (-10.7% 🟢) 18.031s (-8.9% 🟢) 1.806s 2 1.00x
▲ Vercel Next.js (Turbopack) 16.451s (-22.4% 🟢) 17.798s (-23.5% 🟢) 1.347s 2 1.01x
▲ Vercel Express ⚠️ missing - - - -

🔍 Observability: Nitro | Next.js (Turbopack)

workflow with 25 sequential steps

💻 Local Development

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
🌐 Redis 🥇 Next.js (Turbopack) 14.282s 15.031s 0.749s 4 1.00x
🐘 Postgres Next.js (Turbopack) 14.402s 15.022s 0.621s 4 1.01x
🐘 Postgres Express 14.459s (-0.9%) 15.021s (~) 0.562s 4 1.01x
🐘 Postgres Nitro 14.626s (~) 15.021s (~) 0.395s 4 1.02x
💻 Local Next.js (Turbopack) 14.710s 15.030s 0.320s 4 1.03x
💻 Local Nitro 14.923s (~) 15.029s (~) 0.106s 4 1.04x
💻 Local Express 14.950s (~) 15.029s (~) 0.078s 4 1.05x
🌐 MongoDB Next.js (Turbopack) 17.859s 18.027s 0.168s 4 1.25x

▲ Production (Vercel)

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Nitro 31.393s (+0.9%) 33.453s (+1.7%) 2.060s 2 1.00x
▲ Vercel Next.js (Turbopack) 35.493s (+6.0% 🔺) 36.715s (+3.1%) 1.222s 2 1.13x
▲ Vercel Express ⚠️ missing - - - -

🔍 Observability: Nitro | Next.js (Turbopack)

workflow with 50 sequential steps

💻 Local Development

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
🌐 Redis 🥇 Next.js (Turbopack) 13.593s 14.027s 0.433s 7 1.00x
🐘 Postgres Next.js (Turbopack) 13.618s 14.022s 0.404s 7 1.00x
🐘 Postgres Express 13.776s (-2.9%) 14.019s (-5.8% 🟢) 0.243s 7 1.01x
🐘 Postgres Nitro 14.170s (+1.2%) 15.026s (+6.1% 🔺) 0.855s 6 1.04x
💻 Local Next.js (Turbopack) 16.093s 16.364s 0.270s 6 1.18x
💻 Local Express 16.562s (-1.4%) 17.030s (~) 0.468s 6 1.22x
💻 Local Nitro 16.760s (+1.2%) 17.031s (~) 0.271s 6 1.23x
🌐 MongoDB Next.js (Turbopack) 20.427s 21.029s 0.602s 5 1.50x

▲ Production (Vercel)

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Nitro 57.584s (+6.7% 🔺) 59.558s (+6.8% 🔺) 1.974s 2 1.00x
▲ Vercel Next.js (Turbopack) 65.016s (+17.1% 🔺) 66.582s (+16.1% 🔺) 1.566s 2 1.13x
▲ Vercel Express ⚠️ missing - - - -

🔍 Observability: Nitro | Next.js (Turbopack)

Promise.all with 10 concurrent steps

💻 Local Development

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
🐘 Postgres 🥇 Next.js (Turbopack) 1.242s 2.009s 0.767s 15 1.00x
🐘 Postgres Express 1.263s (-1.8%) 2.010s (~) 0.747s 15 1.02x
🐘 Postgres Nitro 1.270s (+0.9%) 2.010s (~) 0.741s 15 1.02x
🌐 Redis Next.js (Turbopack) 1.292s 2.006s 0.715s 15 1.04x
💻 Local Express 1.510s (-1.6%) 2.006s (~) 0.497s 15 1.22x
💻 Local Nitro 1.524s (-1.9%) 2.006s (~) 0.482s 15 1.23x
💻 Local Next.js (Turbopack) 1.636s 2.073s 0.437s 15 1.32x
🌐 MongoDB Next.js (Turbopack) 2.793s 3.309s 0.516s 10 2.25x

▲ Production (Vercel)

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Next.js (Turbopack) 2.428s (-13.9% 🟢) 3.854s (-14.2% 🟢) 1.425s 8 1.00x
▲ Vercel Nitro 2.683s (+5.0%) 4.296s (+2.3%) 1.613s 7 1.11x
▲ Vercel Express ⚠️ missing - - - -

🔍 Observability: Next.js (Turbopack) | Nitro

Promise.all with 25 concurrent steps

💻 Local Development

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
🐘 Postgres 🥇 Express 2.318s (~) 3.008s (~) 0.689s 10 1.00x
🐘 Postgres Nitro 2.325s (~) 3.010s (~) 0.685s 10 1.00x
🐘 Postgres Next.js (Turbopack) 2.391s 3.008s 0.617s 10 1.03x
🌐 Redis Next.js (Turbopack) 2.576s 3.007s 0.431s 10 1.11x
💻 Local Nitro 2.897s (-7.3% 🟢) 3.108s (-20.0% 🟢) 0.211s 10 1.25x
💻 Local Next.js (Turbopack) 2.951s 3.565s 0.614s 9 1.27x
💻 Local Express 3.058s (-2.3%) 3.675s (-5.4% 🟢) 0.618s 9 1.32x
🌐 MongoDB Next.js (Turbopack) 9.630s 10.015s 0.385s 4 4.15x

▲ Production (Vercel)

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Next.js (Turbopack) 2.691s (-18.5% 🟢) 4.063s (-24.9% 🟢) 1.372s 8 1.00x
▲ Vercel Nitro 2.698s (+3.2%) 4.490s (+6.0% 🔺) 1.792s 7 1.00x
▲ Vercel Express ⚠️ missing - - - -

🔍 Observability: Next.js (Turbopack) | Nitro

Promise.all with 50 concurrent steps

💻 Local Development

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
🐘 Postgres 🥇 Nitro 3.446s (-1.2%) 4.012s (~) 0.566s 8 1.00x
🐘 Postgres Express 3.493s (~) 4.010s (~) 0.517s 8 1.01x
🐘 Postgres Next.js (Turbopack) 3.658s 4.011s 0.353s 8 1.06x
🌐 Redis Next.js (Turbopack) 4.170s 4.868s 0.698s 7 1.21x
💻 Local Next.js (Turbopack) 7.460s 7.766s 0.307s 4 2.16x
💻 Local Express 8.155s (-2.2%) 9.025s (~) 0.870s 4 2.37x
💻 Local Nitro 8.270s (+1.4%) 9.020s (+2.9%) 0.751s 4 2.40x
🌐 MongoDB Next.js (Turbopack) 20.095s 20.532s 0.437s 2 5.83x

▲ Production (Vercel)

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Nitro 2.673s (-19.8% 🟢) 4.112s (-18.0% 🟢) 1.439s 8 1.00x
▲ Vercel Next.js (Turbopack) 3.716s (-17.1% 🟢) 5.114s (-16.7% 🟢) 1.398s 6 1.39x
▲ Vercel Express ⚠️ missing - - - -

🔍 Observability: Nitro | Next.js (Turbopack)

Promise.race with 10 concurrent steps

💻 Local Development

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
🐘 Postgres 🥇 Next.js (Turbopack) 1.210s 2.008s 0.798s 15 1.00x
🐘 Postgres Express 1.263s (~) 2.007s (~) 0.744s 15 1.04x
🐘 Postgres Nitro 1.264s (+1.7%) 2.008s (~) 0.744s 15 1.04x
🌐 Redis Next.js (Turbopack) 1.308s 2.006s 0.698s 15 1.08x
💻 Local Next.js (Turbopack) 1.532s 2.005s 0.473s 15 1.27x
💻 Local Express 1.562s (+1.7%) 2.006s (~) 0.444s 15 1.29x
💻 Local Nitro 1.577s (+2.4%) 2.073s (+3.3%) 0.496s 15 1.30x
🌐 MongoDB Next.js (Turbopack) 3.112s 3.884s 0.772s 8 2.57x

▲ Production (Vercel)

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Next.js (Turbopack) 2.007s (-5.1% 🟢) 3.205s (-21.1% 🟢) 1.198s 10 1.00x
▲ Vercel Nitro 2.109s (-11.7% 🟢) 3.804s (-5.1% 🟢) 1.695s 8 1.05x
▲ Vercel Express ⚠️ missing - - - -

🔍 Observability: Next.js (Turbopack) | Nitro

Promise.race with 25 concurrent steps

💻 Local Development

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
🐘 Postgres 🥇 Nitro 2.309s (-0.5%) 3.009s (~) 0.701s 10 1.00x
🐘 Postgres Express 2.322s (-1.4%) 3.009s (~) 0.686s 10 1.01x
🐘 Postgres Next.js (Turbopack) 2.379s 3.009s 0.630s 10 1.03x
🌐 Redis Next.js (Turbopack) 2.547s 3.007s 0.460s 10 1.10x
💻 Local Next.js (Turbopack) 2.869s 3.565s 0.696s 9 1.24x
💻 Local Express 3.028s (+2.3%) 3.762s (~) 0.734s 8 1.31x
💻 Local Nitro 3.119s (+5.0%) 3.885s (+5.7% 🔺) 0.766s 8 1.35x
🌐 MongoDB Next.js (Turbopack) 9.589s 9.767s 0.178s 4 4.15x

▲ Production (Vercel)

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Next.js (Turbopack) 2.306s (-14.1% 🟢) 3.526s (-20.1% 🟢) 1.219s 9 1.00x
▲ Vercel Nitro 2.340s (-10.9% 🟢) 3.968s (-2.8%) 1.628s 8 1.01x
▲ Vercel Express ⚠️ missing - - - -

🔍 Observability: Next.js (Turbopack) | Nitro

Promise.race with 50 concurrent steps

💻 Local Development

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
🐘 Postgres 🥇 Nitro 3.466s (~) 4.010s (~) 0.543s 8 1.00x
🐘 Postgres Express 3.499s (~) 4.011s (~) 0.512s 8 1.01x
🐘 Postgres Next.js (Turbopack) 3.620s 4.012s 0.391s 8 1.04x
🌐 Redis Next.js (Turbopack) 4.164s 5.011s 0.847s 6 1.20x
💻 Local Next.js (Turbopack) 8.329s 9.018s 0.689s 4 2.40x
💻 Local Nitro 8.743s (-1.0%) 9.022s (-5.2% 🟢) 0.279s 4 2.52x
💻 Local Express 8.810s (+2.8%) 9.026s (~) 0.217s 4 2.54x
🌐 MongoDB Next.js (Turbopack) 20.407s 21.030s 0.622s 2 5.89x

▲ Production (Vercel)

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Nitro 3.096s (+12.4% 🔺) 4.548s (-1.1%) 1.452s 8 1.00x
▲ Vercel Next.js (Turbopack) 4.238s (-17.3% 🟢) 5.474s (-19.1% 🟢) 1.236s 6 1.37x
▲ Vercel Express ⚠️ missing - - - -

🔍 Observability: Nitro | Next.js (Turbopack)

workflow with 10 sequential data payload steps (10KB)

💻 Local Development

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
🌐 Redis 🥇 Next.js (Turbopack) 0.733s 1.021s 0.288s 59 1.00x
🐘 Postgres Next.js (Turbopack) 0.760s 1.006s 0.246s 60 1.04x
🐘 Postgres Express 0.830s (-3.7%) 1.022s (~) 0.192s 59 1.13x
🐘 Postgres Nitro 0.840s (+0.7%) 1.023s (+1.6%) 0.183s 59 1.15x
💻 Local Next.js (Turbopack) 0.863s 1.022s 0.159s 59 1.18x
💻 Local Express 0.986s (+1.2%) 1.229s (+14.3% 🔺) 0.243s 49 1.35x
💻 Local Nitro 1.079s (+11.1% 🔺) 1.205s (+12.0% 🔺) 0.125s 50 1.47x
🌐 MongoDB Next.js (Turbopack) 2.138s 3.008s 0.870s 20 2.92x

▲ Production (Vercel)

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Nitro 8.671s (-1.3%) 10.227s (-5.0% 🟢) 1.556s 6 1.00x
▲ Vercel Next.js (Turbopack) 8.905s (-5.6% 🟢) 10.249s (-10.6% 🟢) 1.344s 6 1.03x
▲ Vercel Express ⚠️ missing - - - -

🔍 Observability: Nitro | Next.js (Turbopack)

workflow with 25 sequential data payload steps (10KB)

💻 Local Development

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
🌐 Redis 🥇 Next.js (Turbopack) 1.728s 2.006s 0.278s 45 1.00x
🐘 Postgres Next.js (Turbopack) 1.879s 2.075s 0.196s 44 1.09x
🐘 Postgres Express 1.998s (-4.6%) 2.375s (-18.5% 🟢) 0.377s 38 1.16x
🐘 Postgres Nitro 2.025s (+2.9%) 2.686s (+19.0% 🔺) 0.661s 34 1.17x
💻 Local Next.js (Turbopack) 2.740s 3.041s 0.301s 30 1.59x
💻 Local Express 2.985s (-0.8%) 3.342s (-5.8% 🟢) 0.357s 27 1.73x
💻 Local Nitro 3.000s (-6.7% 🟢) 3.470s (-7.7% 🟢) 0.470s 26 1.74x
🌐 MongoDB Next.js (Turbopack) 5.296s 6.012s 0.716s 15 3.06x

▲ Production (Vercel)

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Nitro 26.182s (-1.6%) 27.892s (-2.0%) 1.710s 4 1.00x
▲ Vercel Next.js (Turbopack) 28.549s (~) 29.696s (-3.1%) 1.146s 4 1.09x
▲ Vercel Express ⚠️ missing - - - -

🔍 Observability: Nitro | Next.js (Turbopack)

workflow with 50 sequential data payload steps (10KB)

💻 Local Development

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
🌐 Redis 🥇 Next.js (Turbopack) 3.457s 4.008s 0.551s 30 1.00x
🐘 Postgres Next.js (Turbopack) 3.784s 4.043s 0.260s 30 1.09x
🐘 Postgres Express 3.921s (-6.9% 🟢) 4.251s (-14.5% 🟢) 0.331s 29 1.13x
🐘 Postgres Nitro 4.146s (+3.6%) 5.010s (+15.7% 🔺) 0.864s 24 1.20x
💻 Local Next.js (Turbopack) 8.654s 9.017s 0.364s 14 2.50x
💻 Local Express 9.057s (~) 9.555s (~) 0.498s 13 2.62x
💻 Local Nitro 9.088s (+0.6%) 9.632s (+0.8%) 0.545s 13 2.63x
🌐 MongoDB Next.js (Turbopack) 10.388s 11.019s 0.631s 11 3.00x

▲ Production (Vercel)

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Nitro 72.740s (-8.9% 🟢) 74.687s (-8.5% 🟢) 1.947s 2 1.00x
▲ Vercel Next.js (Turbopack) 75.544s (+2.1%) 76.839s (+1.3%) 1.295s 2 1.04x
▲ Vercel Express ⚠️ missing - - - -

🔍 Observability: Nitro | Next.js (Turbopack)

workflow with 10 concurrent data payload steps (10KB)

💻 Local Development

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
🐘 Postgres 🥇 Next.js (Turbopack) 0.253s 1.007s 0.753s 60 1.00x
🐘 Postgres Express 0.275s (-4.4%) 1.006s (~) 0.731s 60 1.09x
🐘 Postgres Nitro 0.276s (-0.6%) 1.007s (~) 0.730s 60 1.09x
🌐 Redis Next.js (Turbopack) 0.323s 1.021s 0.698s 59 1.27x
💻 Local Next.js (Turbopack) 0.569s 1.004s 0.435s 60 2.25x
💻 Local Express 0.570s (-0.5%) 1.004s (~) 0.434s 60 2.25x
💻 Local Nitro 0.596s (-3.0%) 1.004s (-1.7%) 0.408s 60 2.35x
🌐 MongoDB Next.js (Turbopack) 3.103s 3.946s 0.843s 16 12.25x

▲ Production (Vercel)

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Nitro 1.318s (-27.1% 🟢) 2.837s (-13.9% 🟢) 1.518s 22 1.00x
▲ Vercel Next.js (Turbopack) 2.124s (+49.2% 🔺) 3.541s (+24.3% 🔺) 1.417s 17 1.61x
▲ Vercel Express ⚠️ missing - - - -

🔍 Observability: Nitro | Next.js (Turbopack)

workflow with 25 concurrent data payload steps (10KB)

💻 Local Development

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
🐘 Postgres 🥇 Next.js (Turbopack) 0.471s 1.007s 0.535s 90 1.00x
🐘 Postgres Express 0.486s (-1.2%) 1.006s (~) 0.520s 90 1.03x
🐘 Postgres Nitro 0.490s (~) 1.006s (~) 0.516s 90 1.04x
🌐 Redis Next.js (Turbopack) 1.171s 2.006s 0.835s 45 2.48x
💻 Local Nitro 2.443s (-3.2%) 3.008s (-1.1%) 0.565s 30 5.18x
💻 Local Express 2.476s (-1.7%) 3.009s (~) 0.532s 30 5.25x
💻 Local Next.js (Turbopack) 2.612s 3.009s 0.398s 30 5.54x
🌐 MongoDB Next.js (Turbopack) 9.634s 10.014s 0.380s 9 20.43x

▲ Production (Vercel)

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Nitro 2.729s (+1.8%) 4.417s (-2.7%) 1.688s 21 1.00x
▲ Vercel Next.js (Turbopack) 3.259s (+9.6% 🔺) 4.741s (+2.2%) 1.481s 21 1.19x
▲ Vercel Express ⚠️ missing - - - -

🔍 Observability: Nitro | Next.js (Turbopack)

workflow with 50 concurrent data payload steps (10KB)

💻 Local Development

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
🐘 Postgres 🥇 Next.js (Turbopack) 0.738s 1.006s 0.268s 120 1.00x
🐘 Postgres Express 0.762s (-5.5% 🟢) 1.006s (~) 0.244s 120 1.03x
🐘 Postgres Nitro 0.782s (+0.7%) 1.007s (~) 0.225s 120 1.06x
🌐 Redis Next.js (Turbopack) 2.742s 3.033s 0.290s 40 3.71x
💻 Local Nitro 10.877s (-0.9%) 11.575s (-0.8%) 0.698s 11 14.73x
💻 Local Express 10.893s (-3.6%) 11.481s (-3.1%) 0.588s 11 14.75x
💻 Local Next.js (Turbopack) 10.911s 11.480s 0.569s 11 14.78x
🌐 MongoDB Next.js (Turbopack) 19.911s 20.359s 0.448s 6 26.97x

▲ Production (Vercel)

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Next.js (Turbopack) 6.785s (+6.3% 🔺) 8.099s (+0.9%) 1.314s 15 1.00x
▲ Vercel Nitro 7.859s (+19.5% 🔺) 9.682s (+18.5% 🔺) 1.823s 13 1.16x
▲ Vercel Express ⚠️ missing - - - -

🔍 Observability: Next.js (Turbopack) | Nitro

Stream Benchmarks (includes TTFB metrics)
workflow with stream

💻 Local Development

World Framework Workflow Time TTFB Slurp Wall Time Overhead Samples vs Fastest
💻 Local 🥇 Next.js (Turbopack) 0.183s 1.003s 0.012s 1.018s 0.835s 10 1.00x
🌐 Redis Next.js (Turbopack) 0.183s 1.001s 0.001s 1.007s 0.824s 10 1.00x
🐘 Postgres Next.js (Turbopack) 0.191s 1.001s 0.001s 1.010s 0.819s 10 1.04x
🐘 Postgres Express 0.196s (-10.0% 🟢) 0.995s (~) 0.002s (+23.1% 🔺) 1.009s (~) 0.813s 10 1.07x
💻 Local Express 0.203s (-3.3%) 1.004s (~) 0.011s (-1.7%) 1.017s (~) 0.815s 10 1.11x
💻 Local Nitro 0.203s (-4.2%) 1.004s (~) 0.011s (-4.2%) 1.017s (~) 0.814s 10 1.11x
🐘 Postgres Nitro 0.215s (+6.0% 🔺) 0.995s (~) 0.001s (-7.7% 🟢) 1.010s (~) 0.795s 10 1.18x
🌐 MongoDB Next.js (Turbopack) 0.514s 0.939s 0.001s 1.008s 0.494s 10 2.81x

▲ Production (Vercel)

World Framework Workflow Time TTFB Slurp Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Nitro 1.325s (-19.8% 🟢) 2.335s (-24.9% 🟢) 0.780s (+84.0% 🔺) 3.575s (-9.7% 🟢) 2.250s 10 1.00x
▲ Vercel Next.js (Turbopack) 1.587s (-3.4%) 2.754s (-5.0%) 0.624s (-4.5%) 3.710s (-8.3% 🟢) 2.122s 10 1.20x
▲ Vercel Express ⚠️ missing - - - - -

🔍 Observability: Nitro | Next.js (Turbopack)

stream pipeline with 5 transform steps (1MB)

💻 Local Development

World Framework Workflow Time TTFB Slurp Wall Time Overhead Samples vs Fastest
🌐 Redis 🥇 Next.js (Turbopack) 0.506s 1.002s 0.004s 1.012s 0.506s 60 1.00x
🐘 Postgres Express 0.588s (-3.0%) 1.004s (~) 0.004s (+4.1%) 1.022s (~) 0.435s 59 1.16x
🐘 Postgres Next.js (Turbopack) 0.593s 1.009s 0.005s 1.022s 0.429s 59 1.17x
🐘 Postgres Nitro 0.604s (-2.4%) 1.005s (~) 0.004s (+44.0% 🔺) 1.021s (~) 0.417s 59 1.19x
💻 Local Next.js (Turbopack) 0.661s 1.011s 0.009s 1.024s 0.362s 59 1.31x
💻 Local Nitro 0.722s (+1.2%) 1.012s (~) 0.009s (+7.9% 🔺) 1.023s (~) 0.301s 59 1.43x
💻 Local Express 0.815s (-13.5% 🟢) 1.012s (~) 0.009s (+4.3%) 1.115s (-9.2% 🟢) 0.300s 54 1.61x
🌐 MongoDB Next.js (Turbopack) 1.325s 1.955s 0.003s 2.012s 0.687s 30 2.62x

▲ Production (Vercel)

World Framework Workflow Time TTFB Slurp Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Nitro 3.904s (-3.1%) 5.652s (+4.8%) 0.250s (-32.3% 🟢) 6.347s (+3.1%) 2.442s 10 1.00x
▲ Vercel Next.js (Turbopack) 82.347s (+1675.1% 🔺) 82.272s (+1197.2% 🔺) 0.932s (+277.0% 🔺) 84.597s (+1091.5% 🔺) 2.249s 2 21.09x
▲ Vercel Express ⚠️ missing - - - - -

🔍 Observability: Nitro | Next.js (Turbopack)

10 parallel streams (1MB each)

💻 Local Development

World Framework Workflow Time TTFB Slurp Wall Time Overhead Samples vs Fastest
🐘 Postgres 🥇 Next.js (Turbopack) 0.913s 1.112s 0.000s 1.118s 0.205s 54 1.00x
🌐 Redis Next.js (Turbopack) 0.925s 1.072s 0.000s 1.076s 0.150s 56 1.01x
🐘 Postgres Nitro 0.935s (-1.5%) 1.103s (-7.8% 🟢) 0.000s (-9.1% 🟢) 1.117s (-7.6% 🟢) 0.182s 55 1.02x
🐘 Postgres Express 0.965s (+1.3%) 1.170s (~) 0.000s (+104.0% 🔺) 1.205s (+1.7%) 0.240s 50 1.06x
💻 Local Express 1.233s (-14.0% 🟢) 2.020s (~) 0.000s (+28.3% 🔺) 2.022s (-8.1% 🟢) 0.789s 30 1.35x
💻 Local Nitro 1.233s (-0.9%) 2.021s (~) 0.000s (-14.3% 🟢) 2.023s (~) 0.790s 30 1.35x
💻 Local Next.js (Turbopack) 1.287s 2.021s 0.000s 2.024s 0.737s 30 1.41x
🌐 MongoDB Next.js (Turbopack) 2.361s 2.944s 0.000s 3.008s 0.647s 20 2.58x

▲ Production (Vercel)

World Framework Workflow Time TTFB Slurp Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Nitro 2.715s (+5.2% 🔺) 4.167s (+11.2% 🔺) 0.000s (NaN%) 4.616s (+10.6% 🔺) 1.901s 14 1.00x
▲ Vercel Next.js (Turbopack) 3.117s (-2.7%) 4.253s (-8.8% 🟢) 0.000s (-14.3% 🟢) 4.642s (-9.0% 🟢) 1.525s 14 1.15x
▲ Vercel Express ⚠️ missing - - - - -

🔍 Observability: Nitro | Next.js (Turbopack)

fan-out fan-in 10 streams (1MB each)

💻 Local Development

World Framework Workflow Time TTFB Slurp Wall Time Overhead Samples vs Fastest
🌐 Redis 🥇 Next.js (Turbopack) 1.704s 2.034s 0.000s 2.039s 0.336s 30 1.00x
🐘 Postgres Express 1.733s (-1.9%) 2.100s (+1.3%) 0.000s (+55.4% 🔺) 2.152s (+2.1%) 0.419s 28 1.02x
🐘 Postgres Nitro 1.737s (-1.9%) 2.100s (~) 0.000s (NaN%) 2.112s (-1.4%) 0.375s 29 1.02x
🐘 Postgres Next.js (Turbopack) 1.793s 2.104s 0.000s 2.122s 0.329s 29 1.05x
💻 Local Nitro 3.399s (-14.0% 🟢) 4.032s (-3.5%) 0.000s (-44.0% 🟢) 4.034s (-11.1% 🟢) 0.635s 15 2.00x
💻 Local Next.js (Turbopack) 3.712s 4.100s 0.001s 4.105s 0.394s 15 2.18x
💻 Local Express 3.880s (+7.8% 🔺) 4.179s (+2.0%) 0.001s (+42.9% 🔺) 4.539s (+10.7% 🔺) 0.659s 14 2.28x
🌐 MongoDB Next.js (Turbopack) 4.386s 4.954s 0.000s 5.010s 0.624s 12 2.57x

▲ Production (Vercel)

World Framework Workflow Time TTFB Slurp Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Nitro 4.043s (+11.1% 🔺) 5.577s (+16.1% 🔺) 0.006s (+6663.6% 🔺) 6.025s (+14.6% 🔺) 1.982s 11 1.00x
▲ Vercel Next.js (Turbopack) 4.664s (+15.3% 🔺) 5.738s (~) 0.000s (-100.0% 🟢) 6.057s (-1.5%) 1.393s 10 1.15x
▲ Vercel Express ⚠️ missing - - - - -

🔍 Observability: Nitro | Next.js (Turbopack)

Summary

Fastest Framework by World

Winner determined by most benchmark wins

World 🥇 Fastest Framework Wins
💻 Local Next.js (Turbopack) 14/21
🐘 Postgres Next.js (Turbopack) 14/21
▲ Vercel Nitro 16/21
Fastest World by Framework

Winner determined by most benchmark wins

Framework 🥇 Fastest World Wins
Express 🐘 Postgres 19/21
Next.js (Turbopack) 🐘 Postgres 9/21
Nitro 🐘 Postgres 16/21
Column Definitions
  • Workflow Time: Runtime reported by workflow (completedAt - createdAt) - primary metric
  • TTFB: Time to First Byte - time from workflow start until first stream byte received (stream benchmarks only)
  • Slurp: Time from first byte to complete stream consumption (stream benchmarks only)
  • Wall Time: Total testbench time (trigger workflow + poll for result)
  • Overhead: Testbench overhead (Wall Time - Workflow Time)
  • Samples: Number of benchmark iterations run
  • vs Fastest: How much slower compared to the fastest configuration for this benchmark

Worlds:

  • 💻 Local: In-memory filesystem world (local development)
  • 🐘 Postgres: PostgreSQL database world (local development)
  • ▲ Vercel: Vercel production/preview deployment
  • 🌐 Turso: Community world (local development)
  • 🌐 MongoDB: Community world (local development)
  • 🌐 Redis: Community world (local development)
  • 🌐 Jazz: Community world (local development)

📋 View full workflow run


Some benchmark jobs failed:

  • Local: success
  • Postgres: success
  • Vercel: failure

Check the workflow run for details.

@github-actions
Contributor

github-actions bot commented Apr 6, 2026

🧪 E2E Test Results

Some tests failed

Summary

Passed Failed Skipped Total
✅ ▲ Vercel Production 879 0 67 946
❌ 💻 Local Development 853 1 178 1032
✅ 📦 Local Production 854 0 178 1032
✅ 🐘 Local Postgres 854 0 178 1032
✅ 🪟 Windows 78 0 8 86
❌ 🌍 Community Worlds 134 64 24 222
✅ 📋 Other 216 0 42 258
Total 3868 65 675 4608

❌ Failed Tests

💻 Local Development (1 failed)

sveltekit-stable (1 failed):

  • hookCleanupTestWorkflow - hook token reuse after workflow completion | wrun_01KNJGC7TAT1T5SPRDF0EVAXP8
🌍 Community Worlds (64 failed)

mongodb (4 failed):

  • hookWorkflow is not resumable via public webhook endpoint | wrun_01KNJG4XPX62KWQTETA70S80GR
  • webhookWorkflow | wrun_01KNJG5AHGPKR5SCKCY3E6Y7GY
  • concurrent hook token conflict - two workflows cannot use the same hook token simultaneously | wrun_01KNJGCWB3XA7XX6DZSWGV900M
  • resilient start: addTenWorkflow completes when run_created returns 500 | wrun_01KNJGKR0GH2M98DQ5F6GJ1Q2E

redis (3 failed):

  • hookWorkflow is not resumable via public webhook endpoint | wrun_01KNJG4XPX62KWQTETA70S80GR
  • concurrent hook token conflict - two workflows cannot use the same hook token simultaneously | wrun_01KNJGCWB3XA7XX6DZSWGV900M
  • resilient start: addTenWorkflow completes when run_created returns 500 | wrun_01KNJGKR0GH2M98DQ5F6GJ1Q2E

turso (57 failed):

  • addTenWorkflow | wrun_01KNJG3RKPE4W22S0GFPNBJ45C
  • addTenWorkflow | wrun_01KNJG3RKPE4W22S0GFPNBJ45C
  • wellKnownAgentWorkflow (.well-known/agent) | wrun_01KNJG4YHFR9QEERDS23AT5DNQ
  • should work with react rendering in step
  • promiseAllWorkflow | wrun_01KNJG3ZAF2V6RC5F90GTG544V
  • promiseRaceWorkflow | wrun_01KNJG43RYMXWEQ8W34HBBVPB5
  • promiseAnyWorkflow | wrun_01KNJG4666ND5AJAEQA95KKZXZ
  • importedStepOnlyWorkflow | wrun_01KNJG5ANDDSFN0TYKAYHV5RDF
  • hookWorkflow | wrun_01KNJG4J701YK8NPH50690KAPM
  • hookWorkflow is not resumable via public webhook endpoint | wrun_01KNJG4XPX62KWQTETA70S80GR
  • webhookWorkflow | wrun_01KNJG5AHGPKR5SCKCY3E6Y7GY
  • sleepingWorkflow | wrun_01KNJG5GZ8HEW4W17ZR7RAWHB7
  • parallelSleepWorkflow | wrun_01KNJG5WJTY5B0B8HARRZPWKD6
  • nullByteWorkflow | wrun_01KNJG615BH7YCY8BV4T6QEK9P
  • workflowAndStepMetadataWorkflow | wrun_01KNJG63GZSNZQSN4DWZPRQ5C4
  • fetchWorkflow | wrun_01KNJG8QR04CZ8HWXA8B5R5BH8
  • promiseRaceStressTestWorkflow | wrun_01KNJG8T6CR6A3GB0R9FH8MFZ2
  • error handling error propagation workflow errors nested function calls preserve message and stack trace
  • error handling error propagation workflow errors cross-file imports preserve message and stack trace
  • error handling error propagation step errors basic step error preserves message and stack trace
  • error handling error propagation step errors cross-file step error preserves message and function names in stack
  • error handling retry behavior regular Error retries until success
  • error handling retry behavior FatalError fails immediately without retries
  • error handling retry behavior RetryableError respects custom retryAfter delay
  • error handling retry behavior maxRetries=0 disables retries
  • error handling catchability FatalError can be caught and detected with FatalError.is()
  • error handling not registered WorkflowNotRegisteredError fails the run when workflow does not exist
  • error handling not registered StepNotRegisteredError fails the step but workflow can catch it
  • error handling not registered StepNotRegisteredError fails the run when not caught in workflow
  • hookCleanupTestWorkflow - hook token reuse after workflow completion | wrun_01KNJGC7TAT1T5SPRDF0EVAXP8
  • concurrent hook token conflict - two workflows cannot use the same hook token simultaneously | wrun_01KNJGCWB3XA7XX6DZSWGV900M
  • hookDisposeTestWorkflow - hook token reuse after explicit disposal while workflow still running | wrun_01KNJGDJHMQ59JP2943VG4X3EC
  • stepFunctionPassingWorkflow - step function references can be passed as arguments (without closure vars) | wrun_01KNJGE80KJGT485WP8VK476SA
  • stepFunctionWithClosureWorkflow - step function with closure variables passed as argument | wrun_01KNJGEHFD4QJM74ZC3Y9HTAVA
  • closureVariableWorkflow - nested step functions with closure variables | wrun_01KNJGEQ1TNFBX3XMGJCQSB02X
  • spawnWorkflowFromStepWorkflow - spawning a child workflow using start() inside a step | wrun_01KNJGES7TH7E0Z4W9RGEGG5SZ
  • health check (queue-based) - workflow and step endpoints respond to health check messages
  • pathsAliasWorkflow - TypeScript path aliases resolve correctly | wrun_01KNJGF8TQV4MNXPTGB7RHMBC2
  • Calculator.calculate - static workflow method using static step methods from another class | wrun_01KNJGFEPRNZWRTQKDQ541R6AA
  • AllInOneService.processNumber - static workflow method using sibling static step methods | wrun_01KNJGFM4H748QCF8ET5KEDED0
  • ChainableService.processWithThis - static step methods using this to reference the class | wrun_01KNJGG8Z75TZRYFS8WTPYSBPF
  • thisSerializationWorkflow - step function invoked with .call() and .apply() | wrun_01KNJGGFT18PBRDHJKPQP1KVSY
  • customSerializationWorkflow - custom class serialization with WORKFLOW_SERIALIZE/WORKFLOW_DESERIALIZE | wrun_01KNJGGPD9340WVV5X0FTYDMJX
  • instanceMethodStepWorkflow - instance methods with "use step" directive | wrun_01KNJGGX524X03TWC7GBTN1T9H
  • crossContextSerdeWorkflow - classes defined in step code are deserializable in workflow context | wrun_01KNJGH7AKW4DNJ1QBR72AYCGE
  • stepFunctionAsStartArgWorkflow - step function reference passed as start() argument | wrun_01KNJGHF55NV4GSCB4CR5PW7XB
  • cancelRun - cancelling a running workflow | wrun_01KNJGHR2F9DF0K2Z89ZG6VXQ4
  • cancelRun via CLI - cancelling a running workflow | wrun_01KNJGJ1PJ9T7S9E2VM4KX6622
  • pages router addTenWorkflow via pages router
  • pages router promiseAllWorkflow via pages router
  • pages router sleepingWorkflow via pages router
  • hookWithSleepWorkflow - hook payloads delivered correctly with concurrent sleep | wrun_01KNJGJDSCJGWW94ZF1C1EHDAB
  • sleepInLoopWorkflow - sleep inside loop with steps actually delays each iteration | wrun_01KNJGK2N33TC3M4G6VSHCS18N
  • sleepWithSequentialStepsWorkflow - sequential steps work with concurrent sleep (control) | wrun_01KNJGKCX8S4FVA79NSMA5CY0F
  • importMetaUrlWorkflow - import.meta.url is available in step bundles | wrun_01KNJGKKQTDFGCEXX8KXTWQGQB
  • metadataFromHelperWorkflow - getWorkflowMetadata/getStepMetadata work from module-level helper (#1577) | wrun_01KNJGKNXMZZ0PH3C04KHAXVFP
  • resilient start: addTenWorkflow completes when run_created returns 500 | wrun_01KNJGKR0GH2M98DQ5F6GJ1Q2E

Details by Category

✅ ▲ Vercel Production
App Passed Failed Skipped
✅ astro 79 0 7
✅ example 79 0 7
✅ express 79 0 7
✅ fastify 79 0 7
✅ hono 79 0 7
✅ nextjs-turbopack 84 0 2
✅ nextjs-webpack 84 0 2
✅ nitro 79 0 7
✅ nuxt 79 0 7
✅ sveltekit 79 0 7
✅ vite 79 0 7
❌ 💻 Local Development
App Passed Failed Skipped
✅ astro-stable 72 0 14
✅ express-stable 72 0 14
✅ fastify-stable 72 0 14
✅ hono-stable 72 0 14
✅ nextjs-turbopack-canary 61 0 25
✅ nextjs-turbopack-stable 78 0 8
✅ nextjs-webpack-canary 61 0 25
✅ nextjs-webpack-stable 78 0 8
✅ nitro-stable 72 0 14
✅ nuxt-stable 72 0 14
❌ sveltekit-stable 71 1 14
✅ vite-stable 72 0 14
✅ 📦 Local Production
App Passed Failed Skipped
✅ astro-stable 72 0 14
✅ express-stable 72 0 14
✅ fastify-stable 72 0 14
✅ hono-stable 72 0 14
✅ nextjs-turbopack-canary 61 0 25
✅ nextjs-turbopack-stable 78 0 8
✅ nextjs-webpack-canary 61 0 25
✅ nextjs-webpack-stable 78 0 8
✅ nitro-stable 72 0 14
✅ nuxt-stable 72 0 14
✅ sveltekit-stable 72 0 14
✅ vite-stable 72 0 14
✅ 🐘 Local Postgres
App Passed Failed Skipped
✅ astro-stable 72 0 14
✅ express-stable 72 0 14
✅ fastify-stable 72 0 14
✅ hono-stable 72 0 14
✅ nextjs-turbopack-canary 61 0 25
✅ nextjs-turbopack-stable 78 0 8
✅ nextjs-webpack-canary 61 0 25
✅ nextjs-webpack-stable 78 0 8
✅ nitro-stable 72 0 14
✅ nuxt-stable 72 0 14
✅ sveltekit-stable 72 0 14
✅ vite-stable 72 0 14
✅ 🪟 Windows
App Passed Failed Skipped
✅ nextjs-turbopack 78 0 8
❌ 🌍 Community Worlds
App Passed Failed Skipped
✅ mongodb-dev 5 0 0
❌ mongodb 57 4 8
✅ redis-dev 5 0 0
❌ redis 58 3 8
✅ turso-dev 5 0 0
❌ turso 4 57 8
✅ 📋 Other
App Passed Failed Skipped
✅ e2e-local-dev-nest-stable 72 0 14
✅ e2e-local-postgres-nest-stable 72 0 14
✅ e2e-local-prod-nest-stable 72 0 14

📋 View full workflow run


Some E2E test jobs failed:

  • Vercel Prod: success
  • Local Dev: failure
  • Local Prod: success
  • Local Postgres: success
  • Windows: success

Check the workflow run for details.

Member

@TooTallNate left a comment


Would it make more sense to implement this in world-vercel's writeToStreamMulti() function instead?

Move batch-splitting from core into world-vercel where the 1000-chunk
server limit actually applies. writeToStreamMulti now sends in pages
of MAX_CHUNKS_PER_REQUEST (1000) to prevent 400 errors.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Member

@TooTallNate left a comment


Review

Clean, well-scoped fix for a real problem. The pagination logic is correct — sequential await preserves chunk ordering, and chunks.slice(i, i + MAX_CHUNKS_PER_REQUEST) correctly splits into pages. The fix is correctly placed in the transport layer where the server constraint lives.

One thing worth understanding before merging: this changes the atomicity characteristics of writeToStreamMulti for the world-vercel backend. See inline comment.

What looks good

  • Correct pagination: Sequential await in the loop preserves ordering. slice correctly pages the array.
  • Good placement: The limit is a world-vercel transport concern, not a core concern.
  • Empty-chunks early return (if (chunks.length === 0) return) is a minor behavioral change from the old code (which would send an empty PUT), but clearly an improvement.
  • Changeset is properly scoped to @workflow/world-vercel only.
  • Tests cover the key boundary cases (at limit, one over, and multi-page).
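The pagination behavior described above can be sketched in a few lines. This is a minimal illustration, not the actual world-vercel implementation: `writeToStreamMultiPaginated` and `sendBatch` are hypothetical names, and the real function also encodes chunks and issues the HTTP PUT requests.

```typescript
const MAX_CHUNKS_PER_REQUEST = 1000; // matches the server's MAX_CHUNKS_PER_BATCH

// Sketch of the pagination loop: sequential awaits preserve chunk ordering,
// and slice() splits the input into pages of at most MAX_CHUNKS_PER_REQUEST.
async function writeToStreamMultiPaginated(
  chunks: Uint8Array[],
  sendBatch: (page: Uint8Array[]) => Promise<void>
): Promise<void> {
  if (chunks.length === 0) return; // nothing to send; skip the empty PUT
  for (let i = 0; i < chunks.length; i += MAX_CHUNKS_PER_REQUEST) {
    await sendBatch(chunks.slice(i, i + MAX_CHUNKS_PER_REQUEST));
  }
}
```

With 2005 chunks this yields three sequential batches of 1000, 1000, and 5.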

Nits (non-blocking)

See inline comments.

headers: httpConfig.headers,
}
);
await response.text();
Member


Partial failure concern (non-blocking, worth understanding)

The caller (WorkflowServerWritableStream.flush() in packages/core/src/serialization.ts:475) treats writeToStreamMulti as all-or-nothing: if the promise rejects, the buffer is retained; if it resolves, the buffer is cleared.

With pagination, if fetch throws a network error on page 2 of 3:

  • Page 1's chunks are already persisted on the server
  • The error propagates, so the caller retains the entire buffer (all 3 pages worth)
  • On the next flush attempt, page 1's chunks are re-sent and duplicated

This is a reasonable trade-off vs. the alternative (400 error on all >1000 chunk flushes), and it's a very unlikely scenario (sequential requests to the same endpoint within milliseconds). But it's worth documenting that atomicity is relaxed for large batches.
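The retain-on-reject behavior that causes the duplication can be illustrated with a toy sketch. The names here (`flushOnce`, `writeMulti`) are hypothetical; the real logic lives in core's WorkflowServerWritableStream and is more involved.

```typescript
// Sketch: the caller treats the multi-page write as all-or-nothing.
// If the promise rejects, the ENTIRE buffer is retained, even though
// earlier pages may already be persisted server-side.
async function flushOnce(
  buffer: string[],
  writeMulti: (chunks: string[]) => Promise<void>
): Promise<string[]> {
  try {
    await writeMulti(buffer);
    return []; // success: buffer cleared
  } catch {
    return buffer; // failure: full buffer retained, re-sent on next flush
  }
}
```

If `writeMulti` persists page 1 and then throws on page 2, the next flush re-sends page 1's chunks, producing the duplicates described above.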

Separately, a pre-existing concern: none of the write methods (writeToStream, writeToStreamMulti, closeStream) check response.ok. The await response.text() just consumes the body. With pagination, this means a failed page (e.g. server 500) is silently swallowed and subsequent pages still get sent. Adding a guard like:

if (!response.ok) {
  const text = await response.text();
  throw new Error(`Stream write failed: HTTP ${response.status}: ${text}`);
}

would at least fail fast, which would be consistent with readFromStream and listStreamsByRunId (which do check response.ok).

Member Author


Added response.ok checks to all three write methods (writeToStream, writeToStreamMulti, closeStream) — now consistent with the read methods. Also added a comment documenting the relaxed atomicity for multi-page batches.

MAX_CHUNKS_PER_REQUEST,
} from './streamer.js';

vi.mock('./utils.js', () => ({
Member


Nit: This module-level vi.mock now applies to all tests in the file, including the pre-existing encodeMultiChunks tests that don't use getHttpConfig. Not harmful (those tests are pure functions), but it means future tests added to this file inherit the mock silently. Consider scoping it inside the writeToStreamMulti pagination describe block instead.

Member Author


Moved the vi.mock call to right above the pagination describe block with a comment explaining that vitest hoists it regardless of placement, so it cannot be truly scoped — but the intent is now clear and the encodeMultiChunks tests are unaffected (pure functions).

await streamer.writeToStreamMulti?.('s', 'run-1', chunks);

// 3 requests: MAX, MAX, 5
expect(bodies).toHaveLength(3);
Member


Nit: This test verifies the correct number of requests (3) but doesn't verify the chunk counts per page. Decoding each body and asserting the chunk counts (1000, 1000, 5) would make this more robust — otherwise the test would pass even if the split was wrong (e.g. 1500, 500, 5).

Member Author


Updated the test to decode each request body and assert chunk counts per page: [MAX_CHUNKS_PER_REQUEST, MAX_CHUNKS_PER_REQUEST, 5].
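The shape of that per-page assertion can be sketched with a toy stand-in. Here JSON arrays replace the real encodeMultiChunks wire format, which this sketch does not reproduce; `bodies` plays the role of the request bodies captured by the mocked fetch.

```typescript
const MAX = 1000; // stand-in for MAX_CHUNKS_PER_REQUEST

// Simulate 2005 chunks paginated into per-request bodies (toy encoding).
const allChunks = Array.from({ length: 2 * MAX + 5 }, (_, i) => `chunk-${i}`);
const bodies: string[] = [];
for (let i = 0; i < allChunks.length; i += MAX) {
  bodies.push(JSON.stringify(allChunks.slice(i, i + MAX)));
}

// Decode each body and check the per-page counts, not just the page count:
// this would catch a wrong split like [1500, 500, 5].
const counts = bodies.map((body) => (JSON.parse(body) as string[]).length);
```

The test then asserts `counts` equals `[MAX, MAX, 5]` rather than only `bodies.length === 3`.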

Member

@TooTallNate left a comment


Approving — the pagination logic is correct, the fix is well-scoped, and the non-blocking nits from my earlier review still apply but aren't blockers.

- Add response.ok guards to writeToStream, writeToStreamMulti, and
  closeStream — consistent with readFromStream/listStreamsByRunId
- Document relaxed atomicity for multi-page batches
- Move vi.mock next to pagination tests with comment about hoisting
- Verify per-page chunk counts (MAX, MAX, 5) not just request count

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
@VaguelySerious enabled auto-merge (squash) April 6, 2026 22:54
@VaguelySerious disabled auto-merge April 6, 2026 23:11
@VaguelySerious merged commit 5b9eb40 into main Apr 6, 2026
100 of 103 checks passed
@VaguelySerious deleted the peter/max-chunks-per-flush branch April 6, 2026 23:11