Conversation


@pull pull bot commented Jan 30, 2026

See Commits and Changes for more details.


Created by pull[bot] (v2.0.0-alpha.4)

sokra and others added 13 commits January 30, 2026 15:32
### What?

* Stop applying rules once the module type is set.
* Apply ECMAScript transforms before the module type is set (no more errors when trying to apply a transform to a non-ECMAScript module type).
* Fix the order in which transforms are applied.
* Reorder module rules to match the previous order; they are now applied in order.
During prerendering, track which route params each segment accesses.
Params that are NOT accessed can be omitted from the client router's
cache key for that segment, allowing it to be shared across multiple
pages and reducing the number of prefetch requests.

For example, if a page at /shop/[category]/[itemId] only accesses
`category` but not `itemId`, navigating between /shop/electronics/phone
and /shop/electronics/tablet can reuse the same cached page segment,
since only `category` affects its output.

Most of the client-side changes were already implemented in previous
PRs; the core mechanism was already being used for omitting search
params from the cache keys of static segments. This generalizes the
mechanism for all params. So this PR is primarily about adding per-param
tracking on the server and wiring up the logic to send the params to the
client.

Each segment's CacheNodeSeedData contains its own VaryParamsThenable
that resolves to a Set<string> of accessed param names. The thenable is
a mutable tracker during render that accumulates param accesses, then
resolves when rendering completes. The response also includes an `h`
field for head vary params, which tracks params accessed by
generateMetadata/generateViewport separately from segment body access.
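As a rough sketch (the names and shape here are illustrative, not the actual implementation), the tracker behaves like this:

```js
// Illustrative sketch of a vary-params tracker thenable: a mutable
// accumulator during render that later resolves to the Set of accessed params.
function createVaryParamsTracker() {
  const accessed = new Set()
  let resolveSet
  const promise = new Promise((resolve) => {
    resolveSet = resolve
  })
  return {
    // Called whenever the segment reads a route param during render.
    track(paramName) {
      accessed.add(paramName)
    },
    // Called once rendering completes (or right before the stream aborts).
    complete() {
      resolveSet(accessed)
    },
    // Thenable interface so it can be streamed and awaited like a promise.
    then(onFulfilled, onRejected) {
      return promise.then(onFulfilled, onRejected)
    },
  }
}
```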

The trickiest part of this implementation is the timing. The vary params
thenables represent metadata about the response, but they're also part
of the response itself - they're streamed alongside the segment data. We
can't know which params were accessed until rendering completes, but we
need to resolve all thenables before aborting the stream, or else the
client would block waiting for data that will never arrive.

We address this on both sides. On the server, we resolve all thenables
immediately before aborting. On the client, we read vary params
synchronously using a React Flight optimization: calling
thenable.then(noop) forces Flight to transition from 'resolved_model' to
'fulfilled' without scheduling a microtask. If the thenable still isn't
fulfilled after this, we fall back to null (unknown) rather than
blocking. This ensures the client never suspends on these thenables,
providing a safe fallback if something goes wrong with server timing.
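A simplified sketch of that client-side read, assuming the Flight thenable exposes a `status`/`value` pair like React's thenable states (not the exact code):

```js
function noop() {}

// Read the vary params without ever suspending: nudge Flight to fulfill the
// thenable synchronously, and fall back to null (unknown) if it doesn't.
function readVaryParamsSync(thenable) {
  thenable.then(noop) // 'resolved_model' -> 'fulfilled' without a microtask
  if (thenable.status === 'fulfilled') {
    return thenable.value // Set<string> of accessed param names
  }
  return null
}
```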

A key distinction is null vs empty Set. An empty Set means the segment
accesses no params and can be shared across all param values - this is
the case for client components (when Cache Components is enabled),
segments without user code, etc. A null value means tracking failed -
either because it's not wired up yet (like runtime prefetches), or due
to some edge case where the thenables weren't resolved in time.
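To make the distinction concrete, here is a hypothetical cache-key helper (names are illustrative, not Next.js internals):

```js
function segmentCacheKey(segmentPath, params, varyParams) {
  if (varyParams === null) {
    // Tracking failed or isn't wired up: conservatively vary on every param.
    return `${segmentPath}?${JSON.stringify(params)}`
  }
  // Only params the segment actually accessed affect the key; an empty Set
  // yields a key that is shared across all param values.
  const relevant = {}
  for (const name of varyParams) {
    relevant[name] = params[name]
  }
  return `${segmentPath}?${JSON.stringify(relevant)}`
}
```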

For segments where we know upfront that no params will be accessed, we
use a singleton emptyVaryParamsTracker that's already resolved. This
ensures these segments resolve correctly even if other tracking fails.
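One way to picture that singleton (again a sketch, not the real module):

```js
// A single, already-resolved tracker for segments known to access no params;
// its Set never gains entries, so the segment is shared across all param values.
const emptyVaryParamsTracker = Promise.resolve(new Set())
```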

In the future, we'll likely optimize the response format by sending a
bitmask instead of a Set of param names. But this is mostly just an
optimization — sending the Set doesn't expose anything new since the
param names are already embedded in the route tree (although that's also
something we could obfuscate in the future).

This does not yet implement vary param tracking during runtime
prefetches - the abort timing needs additional coordination. Deferred to
a separate PR.
Closes PACK-6537

When a deployment id is available, don't put the data routes at `_next/data/<BUILDID>/page.json`, but at `_next/data/page.json`.
Deployment skew (forcing an MPA nav instead of a SPA nav if a deployment happened in between) is handled by comparing a header on the client side.


If you use `output: export` and set `config.deploymentId` (i.e. enable skew protection), you are still expected to set these headers.
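For context, skew protection is enabled by setting the deployment id in the config, e.g. (sketch; the env var name is just an example):

```js
// next.config.js
module.exports = {
  deploymentId: process.env.MY_DEPLOYMENT_ID, // any stable id per deployment
}
```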

---

- [x] Find the right place where the header should be added for prerendered static RSC responses (route manifest)
This removes the extra tracing that was being done in
`build-complete` for deployment adapters to allow importing internal
`node-environment` modules. Instead, it exposes `next/setup-node-env`
in adapter outputs, which can be used in its place and is traced through our
normal bundling/tracing flow.
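A hypothetical adapter output entrypoint using it might look like this (sketch):

```js
// Pull in the exposed setup module before loading anything that relies on
// the Next.js node environment.
import 'next/setup-node-env'
// ...then load the server handlers the adapter needs.
```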
### What?

* Convert it to a markdown table
* Show counts for all meta items
* Show size of output
…module (#88788)

### What?

This PR adds support for the `type` option in Turbopack rules, enabling users to set the module type directly without requiring a custom loader. This mirrors webpack's [`type`](https://webpack.js.org/configuration/module/#ruletype) option (e.g., `type: 'asset/resource'`).

Users can now configure how files are processed by setting the module type directly:

```js
// next.config.js
module.exports = {
  turbopack: {
    rules: {
      '*.svg': {
        type: 'asset',
      },
    },
  },
}
```

When using `type: 'asset'`, importing the file returns its URL:

```jsx
import svgUrl from './icon.svg'

export default function Page() {
  return <img src={svgUrl} alt="Icon" />
}
```

More types are available; see the docs.
## Summary

- Adds `experimental.varyParams` feature flag (defaults to `false`)
- When disabled, the client skips vary params rekeying and treats params
as unknown
- Allows internal rollout with easy revert if regressions occur during
dogfooding

The flag only gates the client-side rekeying logic in `cache.ts`.
Server-side param tracking continues regardless.
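Opting in looks like this (sketch):

```js
// next.config.js
module.exports = {
  experimental: {
    varyParams: true, // defaults to false
  },
}
```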

## Test plan

- Enabled the flag in the vary-params and optimistic-routing test
fixtures
- What?
Fixes a zlib memory leak in Node.js 24 when requests are aborted.

- Why?
Compression streams weren't cleaned up on abort, causing memory to
accumulate.

- How?
Destroy the response on request abort/close to clean up compression
streams.
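The gist of the fix, as a standalone sketch rather than the exact Next.js code:

```js
const http = require('http')
const zlib = require('zlib')

http
  .createServer((req, res) => {
    const gzip = zlib.createGzip()
    // If the client aborts, destroy the response and the piped zlib stream
    // so their native resources are released instead of accumulating.
    res.on('close', () => {
      if (!res.writableEnded) {
        gzip.destroy()
        res.destroy()
      }
    })
    res.writeHead(200, { 'Content-Encoding': 'gzip' })
    gzip.pipe(res)
    gzip.end('hello')
  })
  .listen(3000)
```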

Closes NEXT-
fixes: #89091
@pull pull bot locked and limited conversation to collaborators Jan 30, 2026
@pull pull bot added the ⤵️ pull label Jan 30, 2026
@pull pull bot merged commit abfd994 into code:canary Jan 30, 2026
