Improve deduping of concurrent 'use cache' invocations #91830

Merged
unstubbable merged 2 commits into canary from hl/use-cache-deduping-take-two
Apr 17, 2026

Conversation

@unstubbable
Contributor

@unstubbable unstubbable commented Mar 23, 2026

With #71286, we implemented deduping of cache entries under certain circumstances. For example, the following constructed example was fixed with the PR:

```js
async function getCachedRandom() {
  'use cache'
  return Math.random()
}

const rand1 = await getCachedRandom()
const rand2 = await getCachedRandom()

assert(rand1 === rand2)
```

However, this implementation relied on the two calls being awaited sequentially. When rendering components, that usually cannot be guaranteed. For example, the following was not properly deduped:

```jsx
async function Cached() {
  'use cache'
  return <p>{Math.random()}</p>
}

export default function Page() {
  return (
    <>
      <Cached />
      <Cached />
    </>
  )
}
```

This did render the same value, but only because we triggered two render passes, and the last cached value was used for both elements in the final render pass. During the first render pass, however, the `Cached` function was called twice, and the cache entry was also set twice.

With #75786, we also fixed the render scenario by wrapping the cached function in `React.cache`. However, this did not work for route handlers. For example, the first example, rewritten as follows and used in a route handler, still wouldn't be deduped:

```js
const [rand1, rand2] = await Promise.all([
  getCachedRandom(),
  getCachedRandom(),
])
```

Furthermore, with this solution, nested cached functions could not be deduped across different outer cache scopes, because each cache scope creates its own `React.cache` scope. Example:

```jsx
async function Inner() {
  'use cache'
  return <p>{Math.random()}</p>
}

async function Outer1() {
  'use cache'
  return <Inner />
}

async function Outer2() {
  'use cache'
  return <Inner />
}

export default function Page() {
  return (
    <>
      <Outer1 />
      <Outer2 />
    </>
  )
}
```

This PR introduces two-layer invocation tracking that deduplicates these cases without changing how cache handlers are implemented. Only the first invocation (the "leader") performs the cache handler lookup and generation. Subsequent invocations ("joiners") tee the leader's result stream instead.
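
The leader/joiner pattern can be sketched roughly as follows. This is a simplified model, not the actual Next.js code; `inFlight` and `dedupedLookup` are invented names:

```javascript
// Hypothetical sketch of leader/joiner deduplication. The first caller
// per key becomes the leader and runs `generate`; callers that arrive
// while the work is in flight join the leader's promise instead.
const inFlight = new Map()

async function dedupedLookup(key, generate) {
  const existing = inFlight.get(key)
  if (existing !== undefined) {
    // Joiner: reuse the leader's in-flight result.
    return existing
  }
  // Leader: perform the lookup/generation exactly once per key.
  const promise = generate(key).finally(() => {
    // Remove the entry once settled so later requests regenerate.
    inFlight.delete(key)
  })
  inFlight.set(key, promise)
  return promise
}
```

In the actual PR, the shared value is a result stream that joiners tee rather than a plain promise, and cleanup is tied to collection completing, but the map-of-in-flight-work shape is the same idea.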

The deduplication scope starts after the Resume Data Cache (RDC) lookup and before the cache handler `get`. The RDC phase is excluded because it throws synchronous errors (dynamic usage errors) that need individual stack traces per call site, and RDC lookups are local, with no network savings from deduplication.

Intra-request deduplication is stored on the `WorkStore` and keyed by `serializedCacheKey` (the coarse key, which is safe because root params are identical within a single request). Cross-request deduplication is stored in a module-scope map keyed by `cacheHandlerKey`, which may include root params on the warm path. Cross-request joiners must await metadata for root param verification before forking the stream. If the key mismatches (different root params), the joiner retries with a recomputed key. Because metadata is checked before `fork()` is called, a mismatched joiner never consumes a stream from the wrong entry.
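
The verify-then-fork ordering for cross-request joiners can be sketched like this. All names are invented for illustration, and the map entries hold strings where the real implementation holds RSC streams:

```javascript
// Hypothetical sketch: a cross-request joiner awaits the leader's
// metadata and verifies the key before forking, retrying once with a
// recomputed (root-params-specific) key on mismatch.
async function joinCrossRequest(dedupMap, key, recomputeKey) {
  let shared = dedupMap.get(key)
  if (shared === undefined) return null // no leader to join
  // Metadata resolves before fork() is ever called, so a mismatched
  // joiner never consumes a stream from the wrong entry.
  const metadata = await shared.metadata
  if (metadata.key !== key) {
    // Different root params: retry with the recomputed key.
    key = recomputeKey()
    shared = dedupMap.get(key)
    if (shared === undefined) return null
  }
  return shared.fork()
}
```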

Stream tee-ing is lazy via the `SharedCacheEntry` class: the leader calls `fork()` to get its copy, and each joiner calls `fork()` on demand. If no joiners exist, only one tee occurs. The `SharedCacheResult` discriminated union wraps either a `SharedCacheEntry` (for the cached case) or a hanging promise (for `prerender-dynamic`).
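
A minimal model of lazy teeing, assuming the web `ReadableStream.tee()` API; `SharedStream` is an invented stand-in for `SharedCacheEntry`:

```javascript
// Hypothetical sketch of lazy stream teeing. Each fork() tees the
// retained stream, keeps one half for potential later joiners, and
// hands out the other half. With a single consumer, only one tee runs.
class SharedStream {
  #stream
  constructor(stream) {
    this.#stream = stream
  }
  fork() {
    const [retained, copy] = this.#stream.tee()
    this.#stream = retained
    return copy
  }
}
```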

Each invocation still decodes the stream independently via `createFromReadableStream` with its own `temporaryReferences`, because sharing the decoded result would cause cache poisoning when components receive different non-serializable props (e.g. `children`).

Admittedly, this adds significant complexity to the `cache()` function. Two classes help keep it manageable:

  • `SharedCacheEntry` encapsulates stream ownership and lazy tee-ing so that call sites don't need to reason about which streams have been consumed or need cloning.
  • `ResolvableSharedCacheResult` manages the deferred promise, map registration, and lazy cleanup. Entries stay in the dedup maps until collection completes (so late-arriving invocations can join while the leader streams), then clean up automatically on resolve or reject.
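
The lifecycle described above can be modeled as a deferred that registers itself in a map and deregisters once settled. This is a sketch with invented names that ignores the lazy-cleanup subtleties around collection:

```javascript
// Hypothetical sketch: a deferred result that stays in the dedup map
// while in flight (so late invocations can join) and removes itself
// once it resolves or rejects.
class ResolvableResult {
  constructor(map, key) {
    this.promise = new Promise((resolve, reject) => {
      this.resolve = resolve
      this.reject = reject
    })
    // Register so concurrent invocations can find and join this result.
    map.set(key, this)
    // Deregister on settle, whether resolved or rejected.
    const cleanup = () => map.delete(key)
    this.promise.then(cleanup, cleanup)
  }
}
```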

A follow-up refactoring to extract the cache handler lookup and generation into a separate function would help break up the function's size.

closes #78703

@nextjs-bot
Collaborator

nextjs-bot commented Mar 23, 2026

Tests Passed

@unstubbable unstubbable force-pushed the hl/use-cache-deduping-take-two branch from 48c94c3 to ab1f2a2 Compare March 23, 2026 15:59
@nextjs-bot
Collaborator

nextjs-bot commented Mar 23, 2026

Stats from current PR

🔴 1 regression

Metric Canary PR Change Trend
node_modules Size 494 MB 494 MB 🔴 +176 kB (+0%) ████▅

Dev Server Metrics:

  • Listen = TCP port starts accepting connections
  • First Request = HTTP server returns successful response
  • Cold = Fresh build (no cache)
  • Warm = With cached build artifacts

Build Metrics:

  • Fresh = Clean build (no .next directory)
  • Cached = With existing .next directory

Change Thresholds:

  • Time: Changes < 50ms AND < 10%, OR < 2% are insignificant
  • Size: Changes < 1KB AND < 1% are insignificant
  • All other changes are flagged to catch regressions

⚡ Dev Server

Metric Canary PR Change Trend
Cold (Listen) 457ms 456ms ▁▁█▁█
Cold (Ready in log) 443ms 443ms ▆▅▁▆▅
Cold (First Request) 831ms 835ms ▁█▇▁▁
Warm (Listen) 456ms 456ms ▅▅█▁▅
Warm (Ready in log) 442ms 442ms ▂█▅▄▁
Warm (First Request) 345ms 341ms ▅█▄▆▃
📦 Dev Server (Webpack) (Legacy)

📦 Dev Server (Webpack)

Metric Canary PR Change Trend
Cold (Listen) 455ms 456ms █▁▁▁█
Cold (Ready in log) 442ms 441ms █▇▃▅▆
Cold (First Request) 1.969s 1.967s ▆▇▅▁▁
Warm (Listen) 457ms 456ms ██▅▅▅
Warm (Ready in log) 442ms 442ms █▇▄▇▄
Warm (First Request) 1.959s 1.976s ██▄▁▁

⚡ Production Builds

Metric Canary PR Change Trend
Fresh Build 4.028s 4.081s █▃▁▇▆
Cached Build 4.111s 4.089s ▇▃▁▇▅
📦 Production Builds (Webpack) (Legacy)

📦 Production Builds (Webpack)

Metric Canary PR Change Trend
Fresh Build 14.785s 14.755s ▇█▅▇▅
Cached Build 14.768s 14.816s ▆█▄█▇
node_modules Size 494 MB 494 MB 🔴 +176 kB (+0%) ████▅
📦 Bundle Sizes


⚡ Turbopack

Client

Main Bundles
Canary PR Change
01lykb9j9u19z.js gzip 155 B N/A -
037b-ksaecgir.js gzip 158 B N/A -
051awit1pzxy9.js gzip 156 B N/A -
07rxhp_1_g4mu.js gzip 13.1 kB N/A -
08avva-dy02e7.js gzip 10.4 kB N/A -
0b9mvru8gaawc.js gzip 156 B N/A -
0cz1d0mv5g_q7.js gzip 39.4 kB 39.4 kB
0dfverolgqlu_.js gzip 155 B N/A -
0fli3_wppnim5.js gzip 12.9 kB N/A -
0guupo6x26xoo.js gzip 70.8 kB N/A -
0k09jwjeb-tki.js gzip 13.8 kB N/A -
0kb7_ep3r1z0_.js gzip 10.1 kB N/A -
0kpuma6a8t2qh.js gzip 168 B N/A -
0kw8xgqdrilf6.js gzip 8.56 kB N/A -
0ojkk2e654xsc.js gzip 8.59 kB N/A -
0sbq9bkqvh45e.js gzip 152 B N/A -
0wxpyd8r-vipl.js gzip 1.47 kB N/A -
0xnfs20vs3ysc.js gzip 153 B N/A -
0xy2fhla48_rd.js gzip 9.24 kB N/A -
10wqsvi2mgfmi.js gzip 9.82 kB N/A -
16lhqjoqbznyg.js gzip 220 B 220 B
16vepdkipri3r.js gzip 8.51 kB N/A -
17n96uu6y1pxq.js gzip 8.6 kB N/A -
18y4_8-9or0mn.js gzip 8.51 kB N/A -
1elt1qium-r2m.css gzip 115 B 115 B
1gq145j3kps-h.js gzip 8.62 kB N/A -
1l5or8vq6a69s.js gzip 154 B N/A -
1nsh-mbn0e-se.js gzip 8.56 kB N/A -
1tsrrp1tdngti.js gzip 13.3 kB N/A -
1zf460s-ga2zh.js gzip 154 B N/A -
2__-e_ym8n788.js gzip 450 B N/A -
2-ivsrs9yb0b0.js gzip 156 B N/A -
22o6xd9_ywdu6.js gzip 233 B N/A -
26ui6d5bv607a.js gzip 49.3 kB N/A -
2kvj8yrfznmwx.js gzip 5.69 kB N/A -
2mlsou7_9la1i.js gzip 154 B N/A -
2p854ctj-qiki.js gzip 65.5 kB N/A -
2qv7m7xjnokgr.js gzip 8.58 kB N/A -
341itofhl0awt.js gzip 160 B N/A -
342ijzvrpe53h.js gzip 2.29 kB N/A -
44un3--wmqiyh.js gzip 7.61 kB N/A -
turbopack-04..qhq7.js gzip 4.17 kB N/A -
turbopack-0a..u7nt.js gzip 4.19 kB N/A -
turbopack-0b..u348.js gzip 4.19 kB N/A -
turbopack-0e..wy2i.js gzip 4.19 kB N/A -
turbopack-0l..dasq.js gzip 4.19 kB N/A -
turbopack-1-..xex1.js gzip 4.2 kB N/A -
turbopack-1d..whps.js gzip 4.19 kB N/A -
turbopack-1e..pdkr.js gzip 4.18 kB N/A -
turbopack-3_..jxf5.js gzip 4.19 kB N/A -
turbopack-33..vmhx.js gzip 4.19 kB N/A -
turbopack-36..ne43.js gzip 4.19 kB N/A -
turbopack-3s..yafr.js gzip 4.19 kB N/A -
turbopack-3y..cu3g.js gzip 4.19 kB N/A -
turbopack-41..gi52.js gzip 4.19 kB N/A -
06eqw0ze8c7k4.js gzip N/A 65.5 kB -
0arkbdqpxc37i.js gzip N/A 8.6 kB -
0bz-xifewa17d.js gzip N/A 8.63 kB -
0fbm505yboynb.js gzip N/A 49.3 kB -
0tvekitj587fh.js gzip N/A 8.51 kB -
0xz7kqe1wjdqh.js gzip N/A 169 B -
0yvk6-wi8e9wh.js gzip N/A 13.3 kB -
0z83a1om5rvtt.js gzip N/A 7.61 kB -
1-jqyfc89tixo.js gzip N/A 1.46 kB -
14t1kneseb8th.js gzip N/A 2.3 kB -
15sb1-dsqfk_j.js gzip N/A 8.59 kB -
1ab2xruymo-oj.js gzip N/A 449 B -
1hxb-q1ungqh_.js gzip N/A 70.8 kB -
1tu25qtsmfhar.js gzip N/A 9.82 kB -
1vein_gnv3mwr.js gzip N/A 8.56 kB -
1wzrm0xjjbzn5.js gzip N/A 10.1 kB -
1z3g0uaqtv9_3.js gzip N/A 8.56 kB -
2-e64t22r1kgw.js gzip N/A 153 B -
21uyslsd4odmk.js gzip N/A 158 B -
25a1yz7zua29z.js gzip N/A 13.8 kB -
27o93knux3hfn.js gzip N/A 156 B -
2bi5hx402juv-.js gzip N/A 8.58 kB -
2hy56297fog9u.js gzip N/A 8.52 kB -
2r7z7i6hgc457.js gzip N/A 155 B -
2u_rpxq3tzytl.js gzip N/A 233 B -
2upap--8h9cvf.js gzip N/A 157 B -
323ki47w-n3e9.js gzip N/A 157 B -
35a8pvb74ba9h.js gzip N/A 155 B -
368lim5wq0o0r.js gzip N/A 12.9 kB -
3asf9b6dh7q99.js gzip N/A 155 B -
3d2-cjrz3nqd1.js gzip N/A 161 B -
3drqjohogojbw.js gzip N/A 5.69 kB -
3g8l1m2-o-ewi.js gzip N/A 13.1 kB -
3hhvtftvowwye.js gzip N/A 156 B -
3jmkxsnxg0nrh.js gzip N/A 10.4 kB -
3q11pdkvyja5e.js gzip N/A 161 B -
3r03tqt-li-wg.js gzip N/A 158 B -
3wpp8nvyoj121.js gzip N/A 9.24 kB -
turbopack-03..7g4c.js gzip N/A 4.19 kB -
turbopack-0f..tm9h.js gzip N/A 4.19 kB -
turbopack-0i..hzz_.js gzip N/A 4.19 kB -
turbopack-0m..zm5y.js gzip N/A 4.2 kB -
turbopack-0x..34yg.js gzip N/A 4.19 kB -
turbopack-16..zbi5.js gzip N/A 4.19 kB -
turbopack-1h..u9e1.js gzip N/A 4.19 kB -
turbopack-28..n7f5.js gzip N/A 4.19 kB -
turbopack-2h..wq2d.js gzip N/A 4.19 kB -
turbopack-2n..pnni.js gzip N/A 4.19 kB -
turbopack-2u..gsgj.js gzip N/A 4.19 kB -
turbopack-2x..crbl.js gzip N/A 4.19 kB -
turbopack-36..2jpx.js gzip N/A 4.19 kB -
turbopack-3d..8xnw.js gzip N/A 4.17 kB -
Total 465 kB 465 kB ⚠️ +62 B

Server

Middleware
Canary PR Change
middleware-b..fest.js gzip 717 B 721 B
Total 717 B 721 B ⚠️ +4 B
Build Details
Build Manifests
Canary PR Change
_buildManifest.js gzip 432 B 434 B
Total 432 B 434 B ⚠️ +2 B

📦 Webpack

Client

Main Bundles
Canary PR Change
2637-HASH.js gzip 4.63 kB N/A -
7724.HASH.js gzip 169 B N/A -
8274-HASH.js gzip 61.4 kB N/A -
8817-HASH.js gzip 5.59 kB N/A -
c3500254-HASH.js gzip 62.8 kB N/A -
framework-HASH.js gzip 59.7 kB 59.7 kB
main-app-HASH.js gzip 255 B 255 B
main-HASH.js gzip 39.4 kB 39.4 kB
webpack-HASH.js gzip 1.68 kB 1.68 kB
5887-HASH.js gzip N/A 5.61 kB -
6522-HASH.js gzip N/A 60.8 kB -
6779-HASH.js gzip N/A 4.63 kB -
8854.HASH.js gzip N/A 169 B -
eab920f9-HASH.js gzip N/A 62.8 kB -
Total 236 kB 235 kB ✅ -644 B
Polyfills
Canary PR Change
polyfills-HASH.js gzip 39.4 kB 39.4 kB
Total 39.4 kB 39.4 kB
Pages
Canary PR Change
_app-HASH.js gzip 193 B 193 B
_error-HASH.js gzip 182 B 182 B
css-HASH.js gzip 333 B 334 B
dynamic-HASH.js gzip 1.81 kB 1.8 kB
edge-ssr-HASH.js gzip 255 B 255 B
head-HASH.js gzip 353 B 349 B 🟢 4 B (-1%)
hooks-HASH.js gzip 384 B 382 B
image-HASH.js gzip 581 B 581 B
index-HASH.js gzip 260 B 259 B
link-HASH.js gzip 2.51 kB 2.51 kB
routerDirect..HASH.js gzip 316 B 318 B
script-HASH.js gzip 386 B 386 B
withRouter-HASH.js gzip 313 B 314 B
1afbb74e6ecf..834.css gzip 106 B 106 B
Total 7.98 kB 7.97 kB ✅ -10 B

Server

Edge SSR
Canary PR Change
edge-ssr.js gzip 126 kB 126 kB
page.js gzip 273 kB 273 kB
Total 399 kB 399 kB ✅ -366 B
Middleware
Canary PR Change
middleware-b..fest.js gzip 618 B 617 B
middleware-r..fest.js gzip 156 B 156 B
middleware.js gzip 44.2 kB 44.5 kB
edge-runtime..pack.js gzip 842 B 842 B
Total 45.8 kB 46.1 kB ⚠️ +249 B
Build Details
Build Manifests
Canary PR Change
_buildManifest.js gzip 721 B 720 B
Total 721 B 720 B ✅ -1 B
Build Cache
Canary PR Change
0.pack gzip 4.38 MB 4.38 MB 🔴 +7 kB (+0%)
index.pack gzip 113 kB 112 kB
index.pack.old gzip 112 kB 115 kB 🔴 +2.93 kB (+3%)
Total 4.6 MB 4.61 MB ⚠️ +8.99 kB

🔄 Shared (bundler-independent)

Runtimes
Canary PR Change
app-page-exp...dev.js gzip 347 kB 347 kB
app-page-exp..prod.js gzip 192 kB 192 kB
app-page-tur...dev.js gzip 346 kB 346 kB
app-page-tur..prod.js gzip 192 kB 192 kB
app-page-tur...dev.js gzip 343 kB 343 kB
app-page-tur..prod.js gzip 190 kB 190 kB
app-page.run...dev.js gzip 343 kB 343 kB
app-page.run..prod.js gzip 190 kB 190 kB
app-route-ex...dev.js gzip 77 kB 77 kB
app-route-ex..prod.js gzip 52.5 kB 52.5 kB
app-route-tu...dev.js gzip 77.1 kB 77.1 kB
app-route-tu..prod.js gzip 52.6 kB 52.6 kB
app-route-tu...dev.js gzip 76.7 kB 76.7 kB
app-route-tu..prod.js gzip 52.3 kB 52.3 kB
app-route.ru...dev.js gzip 76.6 kB 76.6 kB
app-route.ru..prod.js gzip 52.3 kB 52.3 kB
dist_client_...dev.js gzip 324 B 324 B
dist_client_...dev.js gzip 326 B 326 B
dist_client_...dev.js gzip 318 B 318 B
dist_client_...dev.js gzip 317 B 317 B
pages-api-tu...dev.js gzip 43.9 kB 43.9 kB
pages-api-tu..prod.js gzip 33.5 kB 33.5 kB
pages-api.ru...dev.js gzip 43.9 kB 43.9 kB
pages-api.ru..prod.js gzip 33.5 kB 33.5 kB
pages-turbo....dev.js gzip 53.3 kB 53.3 kB
pages-turbo...prod.js gzip 39.1 kB 39.1 kB
pages.runtim...dev.js gzip 53.3 kB 53.3 kB
pages.runtim..prod.js gzip 39.1 kB 39.1 kB
server.runti..prod.js gzip 62.9 kB 62.9 kB
Total 3.06 MB 3.06 MB ✅ -3 B
📎 Tarball URL
https://vercel-packages.vercel.app/next/commits/631d82540d100510e00d4e596e7c1ea9f604c58b/next

@unstubbable unstubbable force-pushed the hl/use-cache-deduping-take-two branch 8 times, most recently from 577e315 to b50b94b Compare March 26, 2026 22:02
@unstubbable unstubbable changed the title from "Dedupe concurrent 'use cache' invocations" to "Improve deduping of concurrent 'use cache' invocations" Mar 28, 2026
@unstubbable unstubbable force-pushed the hl/use-cache-deduping-take-two branch 3 times, most recently from 80691d4 to cf63000 Compare March 30, 2026 14:36
@unstubbable unstubbable marked this pull request as ready for review March 31, 2026 17:16
@unstubbable unstubbable requested review from gnoff and lubieowoce March 31, 2026 17:16
@unstubbable unstubbable force-pushed the hl/use-cache-deduping-take-two branch from cf63000 to 5ddc140 Compare April 16, 2026 18:41
@unstubbable unstubbable force-pushed the hl/use-cache-deduping-take-two branch from 5ddc140 to 631d825 Compare April 16, 2026 20:30
Contributor

@gnoff gnoff left a comment


Leaving some general comments first, mostly for posterity.

  1. I think we want to move to a world where cache entries can be streamed not just to early joiners but to late ones as well, and not just in process but from the cache handler too. The most extreme version would be being able to write to the cache handler and read from it simultaneously. This PR demonstrates one of the biggest challenges with this: what do you do if the key changes while you are generating the entry? You'd have to error, or you'd have to never assume a key more general than the most specific one possible, but that defeats the purpose of caching, since taken to its limit (supporting arbitrary cookies) you would never match any entries but your own.

  2. More actionable than 1: we should consider reimplementing the internals using Node streams. We already don't support Cache Components in the edge runtime, so we can likely make this layering more efficient by converting to web streams only at the interface with the cache handler.

Comment on lines +2570 to +2575

```ts
if (currentTime > entry.timestamp + entry.revalidate * 1000) {
  // If this is stale, and we're not in a prerender (i.e. this is
  // dynamic render), then we should warm up the cache with a fresh
  // revalidated entry.
  const result = await generateCacheEntry(
    workStore,
```
Contributor


I know this was already in the codebase, but isn't this wrong? When we are doing a dynamic render, we should be tolerant of stale results and kick off a background revalidation. But are we blocking here instead?

In a prerender path, we should consider the stale entry a miss and force an eager revalidation, because we don't want to prerender something stale and then lock it into the outer cache (ISR or a nested "use cache") for much longer than we might otherwise have expected.

Contributor Author

@unstubbable unstubbable Apr 17, 2026


I know this was already in the codebase, but isn't this wrong? When we are doing a dynamic render, we should be tolerant of stale results and kick off a background revalidation. But are we blocking here instead?

Yes, that is wrong. There was a community fix submitted recently: #92636

In a prerender path, we should consider the stale entry a miss and force an eager revalidation, because we don't want to prerender something stale and then lock it into the outer cache (ISR or a nested "use cache") for much longer than we might otherwise have expected.

That's how it already works.

@unstubbable unstubbable merged commit 1066fbf into canary Apr 17, 2026
183 checks passed
@unstubbable unstubbable deleted the hl/use-cache-deduping-take-two branch April 17, 2026 16:31