
[Segment Cache] Always upsert on prefetch completion #91488

Merged
acdlite merged 1 commit into vercel:canary from acdlite:simplify-segment-rekeying on Mar 17, 2026

Conversation

@acdlite
Contributor

@acdlite acdlite commented Mar 17, 2026

Previous:

  1. Buffer prefetch response before passing to Flight client #91487

Current:

  1. [Segment Cache] Always upsert on prefetch completion #91488

Up next:

  1. Track vary params during runtime prefetches #89297

When a prefetch response includes vary params, the segment cache rekeys entries to a more generic path based on which params the segment actually depends on. Previously, the rekeying only happened when vary params were provided. Now that vary params are tracked for more response types (and eventually will always be tracked), entries are rekeyed in more cases than before.

This exposed a potential race condition: the scheduler would capture a vary path at scheduling time and upsert the entry at that path when the fetch completed. But the fetch functions themselves rekey entries to a different (more generic) path upon fulfillment. The deferred upsert could then move the entry back to the less generic path, undoing the rekeying.

To fix this, move the upsert logic inline into the fetch functions that fulfill entries, rather than deferring it to an external callback. This removes the race condition, simplifies the model, and reduces implementation complexity. The previous structure existed to avoid the rekeying cost when vary params weren't available, but rekeying is inexpensive and not worth the added indirection.

The upsert function itself already handles concurrent writes by comparing fetch strategies and checking whether the new entry provides more complete data than any existing entry. So it's safe to always call it inline — whichever entry wins will be the most complete one.
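The inline-upsert flow described above can be sketched as follows. This is a hypothetical illustration, assuming a simple completeness score; `upsertSegmentEntry`, `isMoreComplete`, and the cache shape are invented for the example and are not the actual Next.js internals.

```typescript
// Hypothetical sketch of an "always upsert" segment cache write.
type SegmentEntry = {
  path: string
  // Higher number = more complete data (e.g. full tree vs partial).
  completeness: number
}

const cache = new Map<string, SegmentEntry>()

function isMoreComplete(a: SegmentEntry, b: SegmentEntry): boolean {
  return a.completeness > b.completeness
}

// Called inline by the fetch function at fulfillment time, after it
// has computed the final (possibly rekeyed, more generic) path — not
// deferred to an external scheduler callback.
function upsertSegmentEntry(entry: SegmentEntry): SegmentEntry {
  const existing = cache.get(entry.path)
  if (existing !== undefined && !isMoreComplete(entry, existing)) {
    // A concurrent write already stored equal-or-better data; keep it.
    return existing
  }
  cache.set(entry.path, entry)
  return entry
}
```

Because the comparison happens inside the upsert itself, concurrent writers can all call it unconditionally and the most complete entry wins.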

@acdlite acdlite changed the base branch from canary to buffer-prefetch-stream March 17, 2026 03:03
@acdlite acdlite changed the title from "[segment cache] Always upsert on prefetch completion" to "[SC] Always upsert on prefetch completion" Mar 17, 2026
@acdlite acdlite changed the title from "[SC] Always upsert on prefetch completion" to "[Segment Cache] Always upsert on prefetch completion" Mar 17, 2026
@acdlite acdlite marked this pull request as ready for review March 17, 2026 03:06
@acdlite acdlite force-pushed the simplify-segment-rekeying branch from 2dc79b1 to 21ad401 Compare March 17, 2026 03:06
@acdlite acdlite force-pushed the simplify-segment-rekeying branch from 21ad401 to 010c419 Compare March 17, 2026 03:27
@nextjs-bot
Collaborator

nextjs-bot commented Mar 17, 2026

Failing test suites

Commit: 0de21b5 | About building and testing Next.js

pnpm test-dev test/development/app-dir/instant-navs-devtools/instant-navs-devtools.test.ts (job)

  • instant-nav-panel > should show client nav state after clicking Start and navigating (DD)
  • instant-nav-panel > should show loading skeleton during SPA navigation after clicking Start (DD)

● instant-nav-panel › should show client nav state after clicking Start and navigating

expect(received).toContain(expected) // indexOf

Expected substring: "Continue rendering"
Received string:    "Client navigation·
Click any link in your app to view the prefetched UI for that page."

  127 |       expect(text).toContain('Client navigation')
  128 |       expect(text).toContain('prefetched UI')
> 129 |       expect(text).toContain('Continue rendering')
      |                    ^
  130 |     })
  131 |
  132 |     // Clean up

  at toContain (development/app-dir/instant-navs-devtools/instant-navs-devtools.test.ts:129:20)
  at retry (lib/next-test-utils.ts:861:14)
  at Object.<anonymous> (development/app-dir/instant-navs-devtools/instant-navs-devtools.test.ts:125:5)

● instant-nav-panel › should show loading skeleton during SPA navigation after clicking Start

expect(received).toBe(expected) // Object.is equality

Expected: true
Received: false

  161 |           '[data-testid="dynamic-skeleton"]'
  162 |         )
> 163 |         expect(skeleton).toBe(true)
      |                          ^
  164 |       },
  165 |       30000,
  166 |       500

  at toBe (development/app-dir/instant-navs-devtools/instant-navs-devtools.test.ts:163:26)
  at retry (lib/next-test-utils.ts:861:14)
  at Object.<anonymous> (development/app-dir/instant-navs-devtools/instant-navs-devtools.test.ts:158:5)

pnpm test-dev test/development/pages-dir/client-navigation/url-hash.test.ts (job)

  • Client navigation with URL hash > when hash changes with state > when passing state via hash change > should increment the shallow history state counter (DD)

● Client navigation with URL hash › when hash changes with state › when passing state via hash change › should increment the shallow history state counter

expect(received).toBe(expected) // Object.is equality

Expected: "SHALLOW HISTORY COUNT: 2"
Received: "SHALLOW HISTORY COUNT: 1"

  253 |           .text()
  254 |
> 255 |         expect(historyCount).toBe('SHALLOW HISTORY COUNT: 2')
      |                              ^
  256 |
  257 |         const counter = await browser.elementByCss('p').text()
  258 |

  at Object.toBe (development/pages-dir/client-navigation/url-hash.test.ts:255:30)

@acdlite acdlite requested a review from unstubbable March 17, 2026 04:53
// is set when the entry is fulfilled with data from the server response.
const staleAt = now + 30 * 1000
const emptyEntry: EmptySegmentCacheEntry = {
// @ts-ignore debug
Contributor
leftover?

Contributor Author

yes good catch

@acdlite acdlite force-pushed the buffer-prefetch-stream branch from 90ab2a8 to 2458d4c Compare March 17, 2026 13:29
@acdlite acdlite force-pushed the simplify-segment-rekeying branch from 010c419 to a1c7c7f Compare March 17, 2026 13:30
acdlite added a commit that referenced this pull request Mar 17, 2026
**Current:**

1. #91487

**Up next:**

2. #91488
3. #89297

---

Prefetch responses include metadata (in the Flight stream sense, not
HTML document metadata) that describes properties of the overall
response — things like the stale time and the set of params that were
accessed during rendering. Conceptually these are like late HTTP
headers: information that's only known once the response is complete.
Since we can't rely on actual HTTP late headers being supported
everywhere, we encode this metadata in the body of the Flight response.

The mechanism works by including an unresolved thenable in the Flight
payload, then resolving it just before closing the stream. On the
client, after the stream is fully received, we unwrap the thenable
synchronously. This synchronous unwrap relies on the assumption that the
server resolved the thenable before closing the stream.
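The resolve-before-close pattern can be sketched in miniature. Everything here (the `Deferred` shape, `serverRender`, `unwrapSync`) is a hypothetical stand-in for the real Flight machinery, showing only the invariant: the server resolves the late metadata before the stream closes, so a client holding the complete stream can unwrap it without awaiting.

```typescript
// Hypothetical sketch of "late metadata" encoded in the response body.
type Deferred<T> = { value: T | undefined; resolve: (v: T) => void }

function createDeferred<T>(): Deferred<T> {
  const d: Deferred<T> = {
    value: undefined,
    resolve: (v) => {
      d.value = v
    },
  }
  return d
}

function serverRender(send: (chunk: string) => void): Deferred<{ varyParams: string[] }> {
  const metadata = createDeferred<{ varyParams: string[] }>()
  send('payload referencing pending metadata')
  // Resolve just before closing the stream, like a late HTTP header.
  metadata.resolve({ varyParams: ['slug'] })
  send('metadata row: ' + JSON.stringify(metadata.value))
  return metadata
}

// Client side: once the full stream has been received, the metadata
// row is guaranteed present, so the unwrap is synchronous (no await).
function unwrapSync<T>(d: Deferred<T>): T {
  if (d.value === undefined) {
    throw new Error('stream closed before metadata was resolved')
  }
  return d.value
}
```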

The server already buffers prefetch responses before sending them, so
the resolved thenable data is always present in the response. However,
HTTP chunking in the browser layer can introduce taskiness when
processing the response, which could prevent Flight from decoding the
full payload synchronously. The existing code includes fallback behavior
for this case (e.g. treating the vary params as unknown), so this
doesn't fix a semantic issue — it strengthens the guarantee so that the
fallback path is never reached.

To do this, we buffer the full response on the client and concatenate it
into a single chunk before passing it to Flight. A single chunk is
necessary because Flight's `processBinaryChunk` processes all rows
synchronously within one call. Multiple chunks would not be sufficient
even if pre-enqueued: the `await` continuation from
`createFromReadableStream` can interleave between chunks, causing
promise value rows to be processed after the root model initializes,
which leaves thenables in a pending state. Since the server already
buffers these responses and they complete during a prefetch (not during
a navigation), this is not a performance consideration.
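The client-side buffering step might look roughly like this; the function name and wiring are illustrative, and only the concatenate-into-one-chunk idea comes from the description above.

```typescript
// Sketch: drain a binary stream and concatenate it into one chunk
// before handing it to a decoder that processes rows synchronously.
async function bufferIntoSingleChunk(
  stream: ReadableStream<Uint8Array>
): Promise<Uint8Array> {
  const reader = stream.getReader()
  const chunks: Uint8Array[] = []
  let totalLength = 0
  while (true) {
    const { done, value } = await reader.read()
    if (done) break
    chunks.push(value)
    totalLength += value.length
  }
  // One contiguous buffer, so the consumer sees a single chunk and
  // can decode every row in one synchronous pass.
  const result = new Uint8Array(totalLength)
  let offset = 0
  for (const chunk of chunks) {
    result.set(chunk, offset)
    offset += chunk.length
  }
  return result
}
```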

Full (dynamic) prefetches are not affected by this change. These are
streaming responses — even though they are cached, they are a special
case where dynamic data is treated as if it were cached. They don't need
to be buffered on either the server or the client the way normal cached
responses are.
@acdlite acdlite force-pushed the simplify-segment-rekeying branch from a1c7c7f to 0de21b5 Compare March 17, 2026 17:56
@nextjs-bot nextjs-bot added the create-next-app, Documentation, tests, and Turbopack labels Mar 17, 2026
@acdlite acdlite changed the base branch from buffer-prefetch-stream to canary March 17, 2026 17:57
@acdlite acdlite merged commit e16f6fb into vercel:canary Mar 17, 2026
27 of 29 checks passed
acdlite added a commit that referenced this pull request Mar 17, 2026
**Previous:**

1. #91487
2. #91488

**Current:**

3. #89297

---

Most of the vary params infrastructure was already implemented in
previous PRs for static prerenders. This wires up the remaining pieces
for runtime prefetches — creating the accumulator, setting it on the
prerender store, and resolving it before abort — and adds additional
test cases covering empty/full vary sets, searchParams, metadata, and
per-segment layout/page splits with runtime prefetching.
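A vary-params accumulator of the kind described could be sketched like this. The names and shape are hypothetical, not the actual prerender-store API: segments report each param they read during rendering, and the accumulated set is resolved just before the prerender is aborted.

```typescript
// Hypothetical vary-params accumulator for a runtime prefetch render.
function createVaryParamsAccumulator() {
  const accessed = new Set<string>()
  let resolveResult!: (params: string[]) => void
  const result = new Promise<string[]>((resolve) => {
    resolveResult = resolve
  })
  return {
    // Called whenever a segment reads a param during rendering.
    track(param: string) {
      accessed.add(param)
    },
    // Called once, just before the prerender is aborted/closed.
    finalize() {
      resolveResult(Array.from(accessed))
    },
    // Consumers (e.g. the response metadata writer) await this.
    result,
  }
}
```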
@codspeed-hq

codspeed-hq bot commented Mar 17, 2026

Merging this PR will not alter performance

✅ 17 untouched benchmarks
⏩ 3 skipped benchmarks [1]


Comparing acdlite:simplify-segment-rekeying (0de21b5) with canary (a46ff7c) [2]


Footnotes

  1. 3 benchmarks were skipped, so the baseline results were used instead. Benchmarks that were deleted from the codebase can be archived to remove them from the performance reports.

  2. No successful run was found on canary (d50bef0) during the generation of this report, so a46ff7c was used instead as the comparison base. There might be some changes unrelated to this pull request in this report.
