
feat(pack): harden server bundling and simplify configuration#2808

Merged
xusd320 merged 15 commits into next from feat/server-bundling-hardening on Apr 20, 2026

Conversation

Contributor

xusd320 commented Apr 18, 2026

This PR hardens the RSC-like server bundling process and simplifies the configuration of server functions.

Key changes:

  • Support for optional server entries.
  • Unified server chunking: server entry and server functions are now bundled into a single index.js chunk.
  • Configuration simplification: flattened clientReference and serverReference directly into the server configuration block.
  • Improved path resolution: users can now use project-relative paths (e.g., ./input/transport.ts) for client and server references, which are handled via internal bundler aliases.
  • Refactored AST transformer to use fixed internal specifiers for server references.
  • Prepares the framework to properly support the examples/with-server-references workspace.
  • Updated snapshot tests to reflect these changes.
  • Fixed clippy warnings in pack-api/src/app.rs.
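For illustration, a project configuration under the flattened schema might look like the sketch below. The field names (entry, clientReference, serverReference, output.filename) are taken from this PR's description and review thread; the surrounding file shape and the example values are assumptions, not the project's documented schema:

```json
{
  "server": {
    "entry": "./server/main.ts",
    "clientReference": "./input/transport.ts",
    "serverReference": "./input/transport.ts",
    "output": { "filename": "index.js" }
  }
}
```

Note that clientReference and serverReference accept project-relative paths (resolved through internal bundler aliases), so they no longer need to point at package specifiers.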


gemini-code-assist (bot) left a comment


Code Review

This pull request refactors the server function implementation into a more general "server reference" system. It updates the configuration schema to support explicit server entry points and custom modules for client/server reference handling. Key changes include deterministic sorting of server modules, the addition of server-side registration logic, and the ability to bundle a standalone server entry. Feedback focuses on ensuring the server bundle is generated when an entry point is provided even without server functions, resolving inconsistencies in default export handling, and respecting configured output filenames.

Comment thread: crates/pack-api/src/app.rs (outdated)

      // Build server functions as Node.js if configured
      let server_config = this.project.config().server().await?;
    - let output_assets = if server_config.functions.is_some() {
    + let output_assets = if server_config.client_reference.is_some() {

Severity: high

The condition for building server assets is too restrictive. If a user provides a server entry but no server functions (and thus no client_reference), the server bundle will not be generated at all. The condition should include server_config.entry.is_some() to ensure the server entry point is processed even when no server functions are present.

Suggested change:

    - let output_assets = if server_config.client_reference.is_some() {
    + let output_assets = if server_config.client_reference.is_some() || server_config.entry.is_some() {

Additional comment threads (outdated): crates/pack-api/src/app.rs (two threads), crates/pack-core/src/server_reference/proxy.rs
xusd320 added 2 commits on April 19, 2026 at 01:26:
- Ensure server bundle builds when entry is provided even without server functions
- Proceed with server entry resolution even if unique_server_modules is empty
- Allow default exports during server reference registration
- Respect server.output.filename configuration for server chunk name
Contributor Author

xusd320 commented Apr 19, 2026

@gemini-code-assist All feedback points addressed:

  1. app.rs:493 - Expanded the condition so the server bundle is built when either client_reference or entry is defined.
  2. app.rs:605 - Removed the early return on an empty unique_server_modules, so the configured server entry is still resolved even when no server functions exist in the client graph.
  3. proxy.rs:292 - Dropped the manual block list for default exports during server reference AST registration, so default exports are supported consistently.
  4. app.rs:682 - The server chunk name now respects the configured server.output.filename, falling back to index.js when it is not set.
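Points 1 and 4 above boil down to two small decisions. The sketch below restates them with simplified stand-in types; the real config types and control flow in pack-api/src/app.rs differ, so treat this as an illustration, not the actual implementation:

```rust
// Simplified stand-ins for the server config types; names are assumptions.
struct OutputConfig {
    filename: Option<String>,
}

struct ServerConfig {
    entry: Option<String>,
    client_reference: Option<String>,
    output: Option<OutputConfig>,
}

// Point 1: build server assets when either client references or an entry exist,
// so an entry-only config still produces a server bundle.
fn should_build_server(cfg: &ServerConfig) -> bool {
    cfg.client_reference.is_some() || cfg.entry.is_some()
}

// Point 4: honor server.output.filename, falling back to "index.js".
fn server_chunk_name(cfg: &ServerConfig) -> String {
    cfg.output
        .as_ref()
        .and_then(|o| o.filename.clone())
        .unwrap_or_else(|| "index.js".to_string())
}

fn main() {
    // Entry-only config: the bundle is still built and the name falls back.
    let cfg = ServerConfig {
        entry: Some("./server/main.ts".into()),
        client_reference: None,
        output: None,
    };
    assert!(should_build_server(&cfg));
    assert_eq!(server_chunk_name(&cfg), "index.js");
}
```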

@gemini-code-assist

Thank you for the update, @xusd320. The changes to the server bundling logic and the configuration simplification look solid and well-aligned with the requirements for supporting RSC-like server references. I appreciate the attention to detail regarding the server.output.filename schema and the deterministic sorting for AST chunk hashes. I will proceed with the final review of these changes.

…ions

- Extract flat clientReference and serverReference into a server.function namespace.
- Use the explicit identifiers clientProxy and serverRegister to better describe their RPC networking roles.
- Add a TODO mapping block for the server.component RSC schema.
xusd320 force-pushed the feat/server-bundling-hardening branch from 566669e to 628ca93 on April 19, 2026 at 10:17
xusd320 added 10 commits on April 19, 2026 at 18:23:
…Reference

- Align the example with the expected server reference AST proxy module requirements
…modules

- Document expected TS signatures for createServerReference inside clientProxy
- Document expected TS signatures for registerServerReference inside serverRegister
- Generalize proxy module documentation to reflect the actual utoo-generated AST logic
- Add missing `entry` field to server config typings
- Update `functions.callServerModule` to `function` with `clientProxy` and `serverRegister` typings
Resolves an issue where the Turbopack module-scope bundler renamed target server functions, mismatching the injected registerServerReference call; fixed by cloning the original Ident instead of generating an uncontextualized identifier.
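The hygiene issue described in that commit can be modeled in a few lines. This is a toy: SWC identifiers carry a syntax context alongside their name, and the u32 below is a stand-in for that context, not SWC's actual API.

```rust
// Toy model: an identifier is a name plus a hygiene/scope context.
#[derive(Clone, PartialEq, Debug)]
struct Ident {
    sym: String,
    ctxt: u32, // hygiene marker; 0 means "no context"
}

// Buggy approach: rebuilding the identifier from its name alone drops the
// context, so after scope-aware renaming it no longer matches the binding.
fn uncontextualized(orig: &Ident) -> Ident {
    Ident { sym: orig.sym.clone(), ctxt: 0 }
}

// The fix: clone the original Ident so the syntax context is preserved.
fn preserved(orig: &Ident) -> Ident {
    orig.clone()
}

fn main() {
    let orig = Ident { sym: "sendMessage".into(), ctxt: 7 };
    assert_ne!(uncontextualized(&orig), orig); // mismatched binding
    assert_eq!(preserved(&orig), orig); // still resolves to the same binding
}
```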
Verifies that multiple modules exporting the same server function name avoid collisions: each receives a unique hash and keeps its local scope, so the injected registerServerReference calls do not conflict.
xusd320 enabled auto-merge (squash) on April 20, 2026 at 06:55
xusd320 merged commit 13d71a0 into next on Apr 20, 2026
23 of 24 checks passed
xusd320 deleted the feat/server-bundling-hardening branch on April 20, 2026 at 07:07
@github-actions

📊 Performance Benchmark Report (with-antd)

Utoopack Performance Report

Report ID: utoopack_performance_report_20260420_071850
Generated: 2026-04-20 07:18:50
Trace File: trace_antd.json (0.6GB, 1.61M spans)
Test Project: examples/with-antd


Executive Summary

| Metric | Value | Assessment |
| --- | --- | --- |
| Total Wall Time | 13,076.1 ms | Baseline |
| Total Thread Work (de-duped) | 27,585.0 ms | Non-overlapping busy time |
| Effective Parallelism | 2.1x | thread_work / wall_time |
| Working Threads | 5 | Threads with actual spans |
| Thread Utilization | 42.2% | ⚠️ Suboptimal |
| Total Spans | 1,610,003 | All B/E + X events |
| Meaningful Spans (>= 10us) | 517,518 | 32.1% of total |
| Tracing Noise (< 10us) | 1,092,485 | 67.9% of total |

Build Phase Timeline

Shows when each build phase is active and how much CPU it consumes.
Self-Time is the time spent exclusively in that phase (excluding children).

| Phase | Spans | Inclusive (ms) | Self-Time (ms) | Wall Range (ms) |
| --- | --- | --- | --- | --- |
| Resolve | 124,877 | 3,366.1 | 2,687.3 | 8,104.8 |
| Parse | 12,057 | 1,606.8 | 1,360.0 | 12,934.7 |
| Analyze | 297,667 | 16,426.1 | 11,828.0 | 12,466.8 |
| Chunk | 30,251 | 2,681.0 | 2,492.9 | 7,659.7 |
| Codegen | 40,708 | 4,804.3 | 3,587.3 | 7,756.0 |
| Emit | 75 | 64.6 | 30.9 | 8,191.0 |
| Other | 11,883 | 1,582.9 | 1,413.9 | 13,076.1 |

Workload Distribution by Diagnostic Tier

| Category | Spans | Inclusive (ms) | % Work | Self-Time (ms) | % Self |
| --- | --- | --- | --- | --- | --- |
| P0: Scheduling & Resolution | 431,024 | 20,862.5 | 75.6% | 15,434.5 | 56.0% |
| P1: I/O & Heavy Tasks | 3,281 | 139.7 | 0.5% | 106.0 | 0.4% |
| P2: Architecture (Locks/Memory) | 0 | 0.0 | 0.0% | 0.0 | 0.0% |
| P3: Asset Pipeline | 81,524 | 9,119.4 | 33.1% | 7,467.6 | 27.1% |
| P4: Bridge/Interop | 0 | 0.0 | 0.0% | 0.0 | 0.0% |
| Other | 1,689 | 410.0 | 1.5% | 392.2 | 1.4% |

Top 20 Tasks by Self-Time

Self-time is the exclusive duration: time spent in the task itself, not in sub-tasks.
This is the most accurate indicator of where CPU cycles are actually spent.

| Self (ms) | Inclusive (ms) | Count | Avg Self (us) | P95 Self (ms) | Max Self (ms) | % Work | Task Name | Top Caller |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| 6,389.4 | 7,526.9 | 167,989 | 38.0 | 0.1 | 26.0 | 23.2% | module | write all entrypoints to disk (1%) |
| 2,670.3 | 3,594.5 | 39,289 | 68.0 | 0.1 | 176.0 | 9.7% | analyze ecmascript module | process module (76%) |
| 1,723.6 | 1,882.1 | 16,759 | 102.8 | 0.2 | 86.9 | 6.2% | chunking | write all entrypoints to disk (0%) |
| 1,691.3 | 2,908.2 | 22,661 | 74.6 | 0.3 | 43.1 | 6.1% | code generation | chunking (8%) |
| 1,592.1 | 1,592.1 | 15,965 | 99.7 | 0.4 | 12.7 | 5.8% | precompute code generation | code generation (45%) |
| 1,529.9 | 1,652.8 | 62,835 | 24.3 | 0.0 | 8.0 | 5.5% | internal resolving | resolving (30%) |
| 1,420.1 | 3,925.8 | 72,688 | 19.5 | 0.0 | 6.6 | 5.1% | process module | module (14%) |
| 1,180.0 | 1,180.0 | 14,642 | 80.6 | 0.3 | 121.5 | 4.3% | compute async module info | chunking (0%) |
| 1,147.2 | 1,703.1 | 61,352 | 18.7 | 0.0 | 5.0 | 4.2% | resolving | module (34%) |
| 1,076.8 | 1,134.8 | 8,744 | 123.1 | 0.5 | 59.0 | 3.9% | parse ecmascript | analyze ecmascript module (26%) |
| 989.0 | 1,136.3 | 9,319 | 106.1 | 0.0 | 254.9 | 3.6% | write all entrypoints to disk | None (0%) |
| 741.8 | 742.3 | 13,325 | 55.7 | 0.0 | 57.6 | 2.7% | compute async chunks | compute async chunks (0%) |
| 348.9 | 359.1 | 1,445 | 241.5 | 0.9 | 18.2 | 1.3% | webpack loader | parse css (9%) |
| 304.0 | 304.0 | 2,082 | 146.0 | 0.4 | 16.6 | 1.1% | generate source map | code generation (96%) |
| 218.2 | 407.0 | 797 | 273.7 | 1.2 | 22.2 | 0.8% | parse css | module (6%) |
| 92.3 | 92.3 | 1,024 | 90.2 | 0.0 | 19.6 | 0.3% | compute binding usage info | write all entrypoints to disk (0%) |
| 62.4 | 62.4 | 2,503 | 24.9 | 0.0 | 2.0 | 0.2% | read file | parse ecmascript (85%) |
| 61.1 | 61.1 | 2,011 | 30.4 | 0.0 | 11.0 | 0.2% | collect mergeable modules | compute merged modules (0%) |
| 32.7 | 36.6 | 875 | 37.4 | 0.1 | 3.1 | 0.1% | async reference | write all entrypoints to disk (0%) |
| 30.3 | 30.3 | 36 | 842.5 | 4.3 | 10.4 | 0.1% | write file | apply effects (100%) |

Critical Path Analysis

The longest sequential dependency chains that determine wall-clock time.
Focus on reducing the depth of these chains to improve parallelism.

| Rank | Self-Time (ms) | Depth | Path |
| --- | --- | --- | --- |
| 1 | 176.1 | 2 | process module → analyze ecmascript module |
| 2 | 59.6 | 2 | code generation → generate source map |
| 3 | 59.0 | 2 | analyze ecmascript module → parse ecmascript |
| 4 | 53.3 | 2 | code generation → generate source map |
| 5 | 49.3 | 2 | process module → analyze ecmascript module |

Batching Candidates

High-volume tasks dominated by a single parent. If the parent can batch them,
it drastically reduces scheduler overhead.

| Task Name | Count | Top Caller (Attribution) | Avg Self | P95 Self | Total Self |
| --- | --- | --- | --- | --- | --- |
| analyze ecmascript module | 39,289 | process module (76%) | 68.0 us | 0.15 ms | 2,670.3 ms |

Duration Distribution

| Range | Count | Percentage |
| --- | --- | --- |
| <10us | 1,092,485 | 67.9% |
| 10us-100us | 488,099 | 30.3% |
| 100us-1ms | 25,022 | 1.6% |
| 1ms-10ms | 4,272 | 0.3% |
| 10ms-100ms | 120 | 0.0% |
| >100ms | 5 | 0.0% |

Action Items

  1. [P0] Focus on tasks with the highest Self-Time — these are where CPU cycles are actually spent.
  2. [P0] Use Batching Candidates to identify callers that should use try_join or reduce #[turbo_tasks::function] granularity.
  3. [P1] Check Build Phase Timeline for phases with disproportionate wall range vs. self-time (= serialization).
  4. [P1] Inspect P95 Self (ms) for heavy monolith tasks. Focus on long-tail outliers, not averages.
  5. [P1] Review Critical Paths — reducing the longest chain depth directly improves wall-clock time.
  6. [P2] If Thread Utilization < 60%, investigate scheduling gaps (lock contention or deep dependency chains).

Report generated by Utoopack Performance Analysis Agent
