Durable, resumable fetch() for browsers and Node.js – powered by Cloudflare Durable Objects.
Send a long-running request (e.g. OpenAI streaming), close the tab, come back later, and pick up the stream exactly where you left off.
- npm i durablefetch
- Zero-config CDN endpoint: https://durablefetch.com
- Self-host in minutes (Cloudflare Workers)
- Add resumability to a ChatGPT-like interface: let the user send a message and close the browser.
- Run long-running jobs in the background: let the user submit a form and close the browser while the server finishes the task.
- Kick off long-running cron jobs from GitHub Actions without burning Actions minutes: start the request and abort it right away; durablefetch keeps the request running.
- Run image generation in the background.
To see how durablefetch works, try visiting this URL in several browser tabs: https://durablefetch.com/postman-echo.com/server-events/20?randomId=xxxx
Important
durablefetch identifies requests by their URL, so each distinct request must have a unique URL. For a ChatGPT-like interface, for example, include the chat id or message id in the URL.
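For instance, a chat UI might key each request by its chat and message ids. A minimal sketch (`/api/chat`, `chatId`, and `messageId` are placeholder names for your own API, not part of the library):

```typescript
// Build a per-message URL so durablefetch identifies each request uniquely.
// Repeating the same ids yields the same URL, so the same stream is resumed.
function chatStreamUrl(chatId: string, messageId: string): string {
  const params = new URLSearchParams({ chatId, messageId })
  return `/api/chat?${params}`
}
```

You would then pass this URL to df.fetch(); sending a new message means generating a new messageId, which produces a new URL and therefore a fresh upstream request.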
Typical HTTP streams die when the client disconnects. durablefetch puts a Cloudflare Durable Object (DO) between you and the origin:
- First request → the DO starts the upstream fetch in waitUntil().
- Every chunk is persisted (state.storage) and fanned out to all live clients.
- If the browser drops, the DO keeps going.
- Later requests with the same URL → stored chunks are replayed, then the live stream continues.
- Once the origin finishes, the DO marks the conversation complete; subsequent callers just get the full buffered response.
Persistence lasts for a few hours (6 hours by default).
durablefetch does NOT automatically retry failed requests. If you need retry logic with exponential backoff, you should implement it client-side.
- Successful responses (2xx status codes): the response body is cached and replayed on subsequent fetches to the same URL. The cache persists for up to 6 hours (configurable) or until you call df.delete(url).
- Failed responses (non-2xx status codes): error responses are NOT cached. Each new request with the same URL triggers a fresh upstream fetch, so transient errors can be retried simply by making a new request.
Example:
const df = new DurableFetchClient()
// First request returns 500 error
const res1 = await df.fetch('https://api.example.com/endpoint?id=123')
console.log(res1.status) // 500
// Second request to same URL will retry the upstream server (not replay cached error)
const res2 = await df.fetch('https://api.example.com/endpoint?id=123')
console.log(res2.status) // May succeed with 200 if the error was transient

Call df.delete(url) to clear the cached response when you want to make a completely fresh request:
await df.delete('https://api.example.com/endpoint?id=123')
// Next fetch will start a new upstream request

Here's a complete example showing how to implement retries with durablefetch, including SSE error handling and state persistence:
import { DurableFetchClient } from 'durablefetch'
const df = new DurableFetchClient()
async function processStreamWithRetries(taskId: string) {
const url = `https://api.example.com/stream?taskId=${taskId}`
const stateKey = `stream-state-${taskId}`
let retryCount = 0
const maxRetries = 3
// Load previous state from localStorage (for resume after errors)
let state = localStorage.getItem(stateKey)
? JSON.parse(localStorage.getItem(stateKey)!)
: { itemsProcessed: 0, lastItemId: null }
while (retryCount < maxRetries) {
try {
// durablefetch will replay any previously cached response from this URL:
// - If user left the page and came back: replays all previous chunks then continues
// the live stream, making it seamless as if they never left
// - If previous response had an SSE error: replays the previous events, we update state, we catch the error,
// delete the cache, and retry with a fresh request
const response = await df.fetch(url, {
method: 'POST',
body: JSON.stringify({
resumeFrom: state.lastItemId // Pass state to resume from where we left off
}),
})
// Parse SSE stream and handle errors
for await (const event of parseSSE(response.body!)) {
// SSE error events indicate server-side errors
// These errors should trigger a retry
if (event.event === 'error') {
throw new Error(`SSE error: ${event.data}`)
}
// Process successful events
const data = JSON.parse(event.data)
console.log('Processing item:', data.itemId)
// Update state as we process items
state = {
itemsProcessed: state.itemsProcessed + 1,
lastItemId: data.itemId
}
// Persist state for resume capability
localStorage.setItem(stateKey, JSON.stringify(state))
}
// Success! Clear state and cached response
localStorage.removeItem(stateKey)
await df.delete(url)
return state
} catch (error) {
// Always delete cached response on error so retry gets fresh data
await df.delete(url)
retryCount++
console.error(`Stream error (attempt ${retryCount}/${maxRetries}):`, error)
if (retryCount >= maxRetries) {
console.error('Max retries reached')
throw error
}
// Wait before retry with exponential backoff
const delay = Math.min(1000 * Math.pow(2, retryCount - 1), 10000)
console.log(`Waiting ${delay}ms before retry...`)
await sleep(delay)
console.log(`Retrying from state:`, state)
}
}
}

Simply replace the default fetch with durablefetch to make sure the AI response is resumed if one is already in progress.
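The retry example above relies on two helpers, parseSSE and sleep, that are not provided by durablefetch. Here is one minimal sketch, assuming the standard text/event-stream wire format (it ignores id:, retry:, and comment lines):

```typescript
// Minimal SSE parser: yields { event, data } objects from a ReadableStream.
async function* parseSSE(
  body: ReadableStream<Uint8Array>,
): AsyncGenerator<{ event: string; data: string }> {
  const reader = body.getReader()
  const decoder = new TextDecoder()
  let buffer = ''
  while (true) {
    const { done, value } = await reader.read()
    if (done) break
    buffer += decoder.decode(value, { stream: true })
    // Events are separated by a blank line
    let sep: number
    while ((sep = buffer.indexOf('\n\n')) !== -1) {
      const raw = buffer.slice(0, sep)
      buffer = buffer.slice(sep + 2)
      let event = 'message' // default event type per the SSE spec
      const dataLines: string[] = []
      for (const line of raw.split('\n')) {
        if (line.startsWith('event:')) event = line.slice(6).trim()
        else if (line.startsWith('data:')) dataLines.push(line.slice(5).trim())
      }
      if (dataLines.length) yield { event, data: dataLines.join('\n') }
    }
  }
}

// Promise-based delay used for the backoff waits
const sleep = (ms: number) => new Promise<void>((resolve) => setTimeout(resolve, ms))
```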
Warning
When using a localhost URL, durablefetch is bypassed, because the durablefetch server cannot reach your localhost server.
import { useEffect } from 'react'
import { useChat } from '@ai-sdk/react'
import { DefaultChatTransport } from 'ai'
import { DurableFetchClient } from 'durablefetch'

const df = new DurableFetchClient()
export function Chat({ chatId }) {
const api = `/api/chat?chatId=${chatId}`
useEffect(() => {
// if in progress, send message and resume stream with durablefetch
df.isInProgress(api).then(({ inProgress }) => {
if (inProgress) {
const text = localStorage.getItem('lastMessage') || ''
return sendMessage({
text,
})
}
})
}, [])
const { messages, sendMessage, error } = useChat({
transport: new DefaultChatTransport({ api, fetch: df.fetch }),
id: chatId,
})
return (
<div className='flex flex-col w-full h-dvh py-8 stretch'>
<div className='space-y-4 flex-grow overflow-y-auto'>
{messages.map((m) => {
if (!m.parts?.length) return null
return (
<div key={m.id} className='max-w-xl mx-auto'>
<div className='font-bold'>{m.role}</div>
<div className='space-y-4'>
{m.parts.map((p, i) => {
switch (p.type) {
case 'text':
return (
<div key={i}>
<p>{p.text}</p>
</div>
)
}
})}
</div>
</div>
)
})}
</div>
<form
onSubmit={async (e) => {
e.preventDefault()
const form = e.currentTarget // save a reference; e.currentTarget is nulled after an await
const message = new FormData(form).get('message') as string
await df.delete(api) // clear durablefetch state so a new message makes a new request
sendMessage({ text: message })
localStorage.setItem('lastMessage', message)
form.reset()
}}
className='w-full max-w-xl mx-auto'
>
<input
autoFocus
className='w-full p-2 border border-gray-300 rounded shadow-xl'
name='message'
placeholder='Say something...'
/>
</form>
</div>
)
}

import { DurableFetchClient } from 'durablefetch'
const df = new DurableFetchClient() // defaults to durablefetch.com
// 1. Start a streaming request
const res = await df.fetch(
'https://api.openai.com/v1/chat/completions?chatId=xxx',
{
method: 'POST',
body: JSON.stringify({
/* … */
}),
},
)
// 2. Other fetch requests to the same URL resume the existing request or return the already completed response
// 3. Ask whether the stream is still in progress (optional)
const status = await df.isInProgress(
'https://api.openai.com/v1/chat/completions?chatId=xxx',
)
console.log(status) // { inProgress: true, activeConnections: 1, chunksStored: 42, completed: false }
// 4. Delete a stored response (optional)
const deleteResult = await df.delete(
'https://api.openai.com/v1/chat/completions?chatId=xxx',
)
console.log(deleteResult) // { success: true, message: "Response deleted successfully" }

durablefetch works by taking the origin host as the first path segment, followed by the rest of the origin URL's path:
https://durablefetch.com/:domain/*
For example, this request fetches https://postman-echo.com/server-events/20?randomId=xxxx:
https://durablefetch.com/postman-echo.com/server-events/20?randomId=xxxx
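The client performs this rewriting for you, but as a sketch it could be expressed like this (toDurableUrl is a hypothetical helper name, not part of the library):

```typescript
// Rewrite an origin URL into its hosted durablefetch form:
// https://durablefetch.com/<host><path><query>
function toDurableUrl(originUrl: string): string {
  const u = new URL(originUrl)
  return `https://durablefetch.com/${u.host}${u.pathname}${u.search}`
}
```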
Always make your URLs unique and non-guessable.
Add a non-guessable, unique search parameter to the URL so that unauthorized users cannot guess it and read the response.
Stored responses are deleted 6 hours after their last use.
If you attach secret data to headers, such as authorization tokens, you should self-host durablefetch in your own Cloudflare account.
Here is a demo AI app that uses durablefetch: https://github.com/remorses/durablefetch-ai-demo