add benchmarking #460
Conversation
Additional Suggestions:
- The GET handler is missing workflow metadata headers required by the benchmarking system to calculate execution times.
📝 Patch Details
diff --git a/workbench/sveltekit/src/routes/api/trigger/+server.ts b/workbench/sveltekit/src/routes/api/trigger/+server.ts
index f1017f6..77fab65 100644
--- a/workbench/sveltekit/src/routes/api/trigger/+server.ts
+++ b/workbench/sveltekit/src/routes/api/trigger/+server.ts
@@ -99,13 +99,25 @@ export const GET: RequestHandler = async ({ request }) => {
const run = getRun(runId);
const returnValue = await run.returnValue;
console.log('Return value:', returnValue);
+
+ // Include run metadata in headers
+ const [createdAt, startedAt, completedAt] = await Promise.all([
+ run.createdAt,
+ run.startedAt,
+ run.completedAt,
+ ]);
+ const headers: HeadersInit =
+ returnValue instanceof ReadableStream
+ ? { 'Content-Type': 'application/octet-stream' }
+ : {};
+
+ headers['X-Workflow-Run-Created-At'] = createdAt?.toISOString() || '';
+ headers['X-Workflow-Run-Started-At'] = startedAt?.toISOString() || '';
+ headers['X-Workflow-Run-Completed-At'] = completedAt?.toISOString() || '';
+
return returnValue instanceof ReadableStream
- ? new Response(returnValue, {
- headers: {
- 'Content-Type': 'application/octet-stream',
- },
- })
- : Response.json(returnValue);
+ ? new Response(returnValue, { headers })
+ : Response.json(returnValue, { headers });
} catch (error) {
if (error instanceof Error) {
if (WorkflowRunNotCompletedError.is(error)) {
Analysis
Missing workflow metadata headers in SvelteKit GET handler breaks benchmark timing calculations
What fails: SvelteKit's GET handler in /workbench/sveltekit/src/routes/api/trigger/+server.ts does not extract or set workflow metadata headers (X-Workflow-Run-Created-At, X-Workflow-Run-Started-At, X-Workflow-Run-Completed-At), preventing the benchmark system from calculating execution times.
How to reproduce:
- Run the benchmark suite against the SvelteKit workbench: pnpm bench
- Observe the resulting bench-timings-*.json file - the executionTimeMs field will be missing/undefined for SvelteKit workflows
Result: Benchmark records have null createdAt/completedAt timestamps and no executionTimeMs calculated. The benchmark code in packages/core/e2e/bench.bench.ts (lines 79-85) extracts these headers and uses them to calculate execution time (lines 163-167), but without the headers, executionTimeMs remains undefined.
Expected: All framework implementations (Express, Hono, Next.js-turbopack, Nitro-v3) set these headers in their GET handler responses, allowing the benchmark to properly calculate executionTimeMs = completedAt - createdAt.
Fix implemented: Added the same header extraction and response code to SvelteKit's GET handler, extracting run.createdAt, run.startedAt, and run.completedAt timestamps and setting them as ISO string headers in the response, matching the pattern used in all other framework implementations.
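To make the consumer side concrete, here is a minimal sketch of how a benchmark client could derive executionTimeMs from the three X-Workflow-Run-* headers the patch adds. The header names come from the patch above; the parsing code is illustrative, not the actual bench.bench.ts implementation.

```typescript
// Sketch: derive timing data from the X-Workflow-Run-* response headers.
// Missing or empty headers yield null, which leaves executionTimeMs undefined,
// matching the failure mode described in the analysis above.
function parseTimingHeaders(headers: Headers): {
  createdAt: Date | null;
  startedAt: Date | null;
  completedAt: Date | null;
  executionTimeMs: number | undefined;
} {
  const parse = (name: string): Date | null => {
    const value = headers.get(name);
    return value ? new Date(value) : null;
  };
  const createdAt = parse('X-Workflow-Run-Created-At');
  const startedAt = parse('X-Workflow-Run-Started-At');
  const completedAt = parse('X-Workflow-Run-Completed-At');
  const executionTimeMs =
    createdAt && completedAt
      ? completedAt.getTime() - createdAt.getTime()
      : undefined;
  return { createdAt, startedAt, completedAt, executionTimeMs };
}
```

With the patched handlers in place, every framework's GET response carries enough information for this calculation; without them, all three dates parse to null and executionTimeMs stays undefined.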
📝 Patch Details
diff --git a/workbench/vite/routes/api/trigger.get.ts b/workbench/vite/routes/api/trigger.get.ts
index a7ef468..4487105 100644
--- a/workbench/vite/routes/api/trigger.get.ts
+++ b/workbench/vite/routes/api/trigger.get.ts
@@ -38,13 +38,25 @@ export default async ({ url }: { req: Request; url: URL }) => {
const run = getRun(runId);
const returnValue = await run.returnValue;
console.log('Return value:', returnValue);
+
+ // Include run metadata in headers
+ const [createdAt, startedAt, completedAt] = await Promise.all([
+ run.createdAt,
+ run.startedAt,
+ run.completedAt,
+ ]);
+ const headers: HeadersInit =
+ returnValue instanceof ReadableStream
+ ? { 'Content-Type': 'application/octet-stream' }
+ : {};
+
+ headers['X-Workflow-Run-Created-At'] = createdAt?.toISOString() || '';
+ headers['X-Workflow-Run-Started-At'] = startedAt?.toISOString() || '';
+ headers['X-Workflow-Run-Completed-At'] = completedAt?.toISOString() || '';
+
return returnValue instanceof ReadableStream
- ? new Response(returnValue, {
- headers: {
- 'Content-Type': 'application/octet-stream',
- },
- })
- : Response.json(returnValue);
+ ? new Response(returnValue, { headers })
+ : Response.json(returnValue, { headers });
} catch (error) {
if (error instanceof Error) {
if (WorkflowRunNotCompletedError.is(error)) {
Analysis
Missing workflow timing metadata headers in Vite trigger endpoint
What fails: GET handler in workbench/vite/routes/api/trigger.get.ts returns responses without the X-Workflow-Run-Created-At, X-Workflow-Run-Started-At, and X-Workflow-Run-Completed-At headers required by the benchmarking system.
How to reproduce:
# Build and run Vite workbench, then (note -i so response headers are printed):
curl -si "http://localhost:5173/api/trigger?runId=<completed-run-id>" \
  -H "Accept: application/json" | grep -i "X-Workflow"
# Returns no headers with timing metadata

Result: Response headers are missing timing metadata. The benchmark code in packages/core/e2e/bench.bench.ts (lines 79-85) attempts to extract these headers to calculate execution times, but gets null values instead, leaving executionTimeMs undefined.
Expected: All frameworks (Express, Hono, Next.js Turbopack, Nitro v3) include these headers in their GET response. The Vite implementation should match this pattern to enable benchmark timing calculations.
Reference implementations:
- workbench/express/src/index.ts - uses res.setHeader() to set timing headers
- workbench/hono/src/index.ts - includes timing headers in response
- workbench/nextjs-turbopack/app/api/trigger/route.ts - sets timing headers in response
- workbench/nitro-v3/routes/api/trigger.get.ts - sets timing headers in response
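The reference implementations all follow the same pattern: build the three headers from (possibly undefined) run timestamps, falling back to empty strings. A hedged sketch of that shared helper is below; the RunTimestamps shape is assumed from the diffs in this PR, and the Express usage in the trailing comment mirrors the res.setHeader() pattern mentioned above rather than quoting the actual workbench code.

```typescript
// Shape assumed from the run objects used in the diffs above.
interface RunTimestamps {
  createdAt?: Date;
  startedAt?: Date;
  completedAt?: Date;
}

// Build the timing headers shared by every framework's GET handler.
// Undefined timestamps become empty strings, matching the `?.toISOString() || ''`
// fallback in the patches.
function buildTimingHeaders(run: RunTimestamps): Record<string, string> {
  return {
    'X-Workflow-Run-Created-At': run.createdAt?.toISOString() ?? '',
    'X-Workflow-Run-Started-At': run.startedAt?.toISOString() ?? '',
    'X-Workflow-Run-Completed-At': run.completedAt?.toISOString() ?? '',
  };
}

// In an Express handler this could be applied as, for example:
//   for (const [name, value] of Object.entries(buildTimingHeaders(run))) {
//     res.setHeader(name, value);
//   }
```

Centralizing the header construction like this would also keep the header names consistent across workbenches, which is exactly the invariant the benchmark depends on.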
📝 Patch Details
diff --git a/workbench/nextjs-webpack/app/api/trigger/route.ts b/workbench/nextjs-webpack/app/api/trigger/route.ts
index d456735..e1e0e0a 100644
--- a/workbench/nextjs-webpack/app/api/trigger/route.ts
+++ b/workbench/nextjs-webpack/app/api/trigger/route.ts
@@ -97,13 +97,25 @@ export async function GET(req: Request) {
const run = getRun(runId);
const returnValue = await run.returnValue;
console.log('Return value:', returnValue);
+
+ // Include run metadata in headers
+ const [createdAt, startedAt, completedAt] = await Promise.all([
+ run.createdAt,
+ run.startedAt,
+ run.completedAt,
+ ]);
+ const headers: HeadersInit =
+ returnValue instanceof ReadableStream
+ ? { 'Content-Type': 'application/octet-stream' }
+ : {};
+
+ headers['X-Workflow-Run-Created-At'] = createdAt?.toISOString() || '';
+ headers['X-Workflow-Run-Started-At'] = startedAt?.toISOString() || '';
+ headers['X-Workflow-Run-Completed-At'] = completedAt?.toISOString() || '';
+
return returnValue instanceof ReadableStream
- ? new Response(returnValue, {
- headers: {
- 'Content-Type': 'application/octet-stream',
- },
- })
- : Response.json(returnValue);
+ ? new Response(returnValue, { headers })
+ : Response.json(returnValue, { headers });
} catch (error) {
if (error instanceof Error) {
if (WorkflowRunNotCompletedError.is(error)) {
Analysis
Missing workflow timing metadata headers in nextjs-webpack trigger endpoint
What fails: The GET handler in workbench/nextjs-webpack/app/api/trigger/route.ts returns workflow results without the X-Workflow-Run-Created-At, X-Workflow-Run-Started-At, and X-Workflow-Run-Completed-At headers required by the benchmarking system, causing executionTimeMs calculations to fail with null/undefined timing values.
How to reproduce:
- Run benchmarks using the nextjs-webpack workbench: DEPLOYMENT_URL=http://localhost:3000 npm run bench
- The benchmark code extracts timing headers from the GET response at packages/core/e2e/bench.bench.ts lines 79-85
- These headers are missing from nextjs-webpack but present in nextjs-turbopack
Result: The benchmark records timing data with all header values as null:

{
  "createdAt": null,
  "startedAt": null,
  "completedAt": null,
  "executionTimeMs": undefined
}

Expected: Response should include timing metadata headers like the nextjs-turbopack implementation does, allowing the benchmark code (lines 163-167 of bench.bench.ts) to calculate executionTimeMs = completedAt.getTime() - createdAt.getTime().
Fix implemented: Added the missing header-extraction and response-building code to match the nextjs-turbopack implementation, extracting run.createdAt, run.startedAt, and run.completedAt and including them in the response headers with ISO string formatting.
📊 Benchmark Comparison
Cross-matrix comparison of workflow performance across frameworks and backends.
[Benchmark result tables not captured here. Sections posted: workflow with no steps; workflow with 1 step; workflow with 10 sequential steps; workflow with 10 parallel steps; Summary: Fastest Framework by Backend; Summary: Fastest Backend by Framework; Column Definitions; Backends.]
Add simple Vitest-based performance benchmarking for local world, postgres world, and vercel world, along with a GitHub Action to track it.
Every PR will now run benchmarks across a few frameworks and all worlds. Once complete, a summary is posted with the results: https://github.com/vercel/workflow/actions/runs/19782846073/attempts/1#summary-56685707991