This repo is a demo of Durable Streams as the transport and persistence layer for a collaborative AI writing app.
Live demo: collaborative-ai-editor.examples.electric-sql.com
The app combines two Durable Streams integrations:
- Durable Streams + Yjs for shared ProseMirror document collaboration over plain HTTP.
- Durable Streams + TanStack AI for resilient chat sessions, streamed model output, and tool-driven agent interaction.
On top of that, the demo shows an AI collaborator called Electra that can:
- join a collaborative document as a separate participant
- inspect and edit the shared document through tools
- stream generated content into the document
- stream its activity into chat
- survive refreshes and reconnects through Durable Streams-backed session state
At a high level, this app is demonstrating one core idea:
- Durable Streams can act as the shared, resumable HTTP data plane for both collaborative editing and collaborative agent/chat workflows in the same application.
In practice that means:
- the document is a shared Yjs/ProseMirror document
- presence and document updates are synchronized through the Durable Streams Yjs provider
- chat messages and model/tool stream events are synchronized through the Durable Streams TanStack AI transport
- multiple tabs/devices can reconnect and resume both document state and chat state
- TanStack Start powers the full-stack app shell, server routes, and development workflow.
- TanStack Router handles file-based routing for the homepage, document page, and API endpoints.
- React renders the editor UI, chat UI, and collaboration chrome.
- Vite provides the local dev server and build pipeline.
- TypeScript provides the application's static typing and editor tooling.
- ProseMirror is the structured rich-text editor model used for the shared document.
- `@handlewithcare/react-prosemirror` provides a React-friendly ProseMirror integration layer.
- Yjs is the CRDT used for collaborative document state.
- `y-prosemirror` binds the ProseMirror document to the shared Yjs state.
- `y-protocols` provides awareness/presence support for cursors and participant state.
- `@durable-streams/y-durable-streams` syncs the Yjs document over Durable Streams using plain HTTP.
- `@durable-streams/tanstack-ai-transport` provides durable chat session transport for TanStack AI.
- `@durable-streams/server` runs the local Durable Streams server used by the demo.
- `@tanstack/ai` runs the model/tool loop and stream processing.
- `@tanstack/ai-react` provides the `useChat` hook used by the sidebar chat UI.
- `@tanstack/ai-openai` connects the app's agent loop to OpenAI models.
- The OpenAI Responses API is the underlying model API used for generation.
- A tool-driven document editing runtime coordinates selection, insertion, deletion, rewrite, and formatting operations on the shared document.
- Streamed insertion and rewrite flows let the agent progressively update the document rather than applying one final patch at the end.
- `streaming-markdown` is used to interpret streamed markdown into structured editor content.
- `zod` validates tool inputs and keeps the tool contract typed and explicit.
- `@base-ui/react` provides unstyled, accessible primitives for the UI.
- `react-icons` supplies the toolbar and chrome icons.
- Plain CSS styles the editor, homepage, chat UI, tool disclosures, and modals without an extra styling framework.
- Vitest runs deterministic unit tests for the editor, tools, routing, and markdown behavior.
- Live model-backed evals run against OpenAI to verify actual tool usage and document-editing behavior end to end.
- `localhost:3000` - TanStack Start app (editor + chat UI)
- `127.0.0.1:4437` - Durable Streams server
- `127.0.0.1:4438` - Yjs server backed by Durable Streams (`@durable-streams/y-durable-streams/server`)
The app uses the Yjs base URL `http://127.0.0.1:4438/v1/yjs/y-llm-demo-v2`.
- Node.js 20+
- pnpm 10+
- Install dependencies:

  ```sh
  pnpm install
  ```

- Create `.env` in the repo root:

  ```sh
  OPENAI_API_KEY=your_openai_key_here
  OPENAI_MODEL=gpt-5.4
  APP_BASE_URL=http://localhost:3000
  PUBLIC_APP_BASE_URL=http://localhost:3000

  # Optional server-side upstream config for Durable Streams services
  # DURABLE_STREAMS_YJS_BASE_URL=http://127.0.0.1:4438
  # DURABLE_STREAMS_CHAT_BASE_URL=http://127.0.0.1:4437
  # DURABLE_STREAMS_YJS_SECRET=your-yjs-secret
  # DURABLE_STREAMS_CHAT_SECRET=your-chat-secret
  ```

- Start the app, Durable Streams server, and Yjs server together:

  ```sh
  pnpm dev
  ```

- Open http://localhost:3000
On first load, enter a document name to create/join a room.
- `pnpm dev` - run app + servers together
- `pnpm dev:app` - run app only
- `pnpm dev:ds` - run Durable Streams + Yjs servers only
- `pnpm test:unit` - deterministic unit tests
- `pnpm test:evals` - live model-backed evals
- `pnpm typecheck` - TypeScript checks
- `pnpm build` - production build
- `pnpm preview` - build and preview the Cloudflare Worker locally with Wrangler
- `pnpm preview:vite` - preview the raw Vite output
- `pnpm preview:cloudflare` - build and preview the Cloudflare Worker locally with Wrangler
- `pnpm deploy:cloudflare` - build and deploy the app to Cloudflare Workers
- `pnpm cf:typegen` - generate Wrangler types if you add Cloudflare bindings later
This repo is configured to deploy the TanStack Start app to Cloudflare Workers via Nitro's `cloudflare_module` preset.
What runs on Cloudflare:
- the app shell
- SSR/server routes
- the `/api/chat`, `/api/chat-stream`, and `/api/yjs/*` proxy routes
What does not run on Cloudflare in this repo:
- the local Durable Streams dev server in `src/dev/durableStreamsServer.ts`
Before deploying, make sure your production Durable Streams services are already hosted and
reachable from the public internet. The Worker cannot use 127.0.0.1.
- `nitro.config.mjs` targets Cloudflare Workers and emits `.output/server/wrangler.json`.
- `wrangler.jsonc` defines the Worker name plus non-secret defaults for the custom domain.
- `.dev.vars.example` shows the env vars needed for local Wrangler preview.
Set these in the Cloudflare dashboard or with Wrangler:
- `APP_BASE_URL=https://collaborative-ai-editor.examples.electric-sql.com`
- `PUBLIC_APP_BASE_URL=https://collaborative-ai-editor.examples.electric-sql.com`
- `OPENAI_MODEL=gpt-5.4`
- `DURABLE_STREAMS_YJS_BASE_URL=<hosted yjs upstream>`
- `DURABLE_STREAMS_CHAT_BASE_URL=<hosted chat upstream>`
For the Durable Streams values, this app supports either format:
- a plain origin, such as `https://api.electric-sql.cloud`
- a full service URL, such as `https://api.electric-sql.cloud/v1/yjs/<service-id>` or `https://api.electric-sql.cloud/v1/stream/<service-id>`
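Accepting either format amounts to a small normalization step. The helper below is an illustrative sketch, not the app's actual config code; the function name is hypothetical, and the only assumption it makes is the `/v1/yjs/<service-id>` path convention shown above:

```typescript
// Illustrative sketch: accept either a plain origin or a full service URL
// and return a full Yjs service URL. Not the app's actual config code.
function resolveYjsBaseUrl(value: string, serviceId: string): string {
  const url = new URL(value);
  // A full service URL already carries a /v1/... path; keep it as-is
  // (minus any trailing slash).
  if (url.pathname.startsWith("/v1/")) {
    return url.toString().replace(/\/$/, "");
  }
  // A plain origin gets the /v1/yjs/<service-id> path appended.
  return `${url.origin}/v1/yjs/${serviceId}`;
}

console.log(resolveYjsBaseUrl("https://api.electric-sql.cloud", "my-service"));
// → https://api.electric-sql.cloud/v1/yjs/my-service
```

The same shape would apply to the chat upstream with a `/v1/stream/<service-id>` path instead.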
Set these as secrets if you use them:
- `OPENAI_API_KEY`
- `DURABLE_STREAMS_YJS_SECRET`
- `DURABLE_STREAMS_CHAT_SECRET`
- Copy `.dev.vars.example` to `.dev.vars`.
- Replace the placeholder Durable Streams URLs with your real hosted upstreams.
- Set the real `OPENAI_API_KEY`.
- Run `pnpm preview:cloudflare`.
- Authenticate Wrangler: `pnpm exec wrangler login`
- Build and deploy: `pnpm deploy:cloudflare`
- In the Cloudflare dashboard, add the custom domain `collaborative-ai-editor.examples.electric-sql.com`.
Use a custom domain because the Worker is the application origin.
- Left pane: shared collaborative document
- Right pane: resilient chat session
- Chat and agent events are backed by Durable Streams
- Document collaboration is backed by Yjs over Durable Streams
- The agent can perform tool-driven document edits and stream insertions into the shared doc
- The document key is used for both Yjs room naming and chat session namespacing
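One way to picture that shared-key convention is a single derivation from the document key. This is purely an illustrative sketch; the function, slug rules, and `chat:` prefix are all hypothetical, and the repo's actual naming scheme may differ:

```typescript
// Hypothetical sketch of the shared-key idea: one document key drives both
// the Yjs room name and the chat session namespace, so a reconnecting tab
// can resume both sides from the same identifier.
function deriveNames(docKey: string) {
  const slug = docKey.trim().toLowerCase().replace(/[^a-z0-9-]+/g, "-");
  return {
    yjsRoom: slug,               // room under the Yjs service
    chatSession: `chat:${slug}`, // namespaced chat session stream
  };
}

console.log(deriveNames("My Design Doc"));
// → { yjsRoom: "my-design-doc", chatSession: "chat:my-design-doc" }
```

Deriving both names from one key is what lets refreshes and multiple tabs land back on the same document state and the same chat history.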
- The editor is a ProseMirror document synchronized through Yjs.
- The Yjs provider is `@durable-streams/y-durable-streams`, so collaboration works over HTTP rather than a dedicated WebSocket stack.
- The chat sidebar uses `@durable-streams/tanstack-ai-transport`, so model responses, tool calls, and resumable session history flow through Durable Streams.
- The server-side agent keeps its own editing/runtime state and writes back into the shared document.
- Make sure `pnpm dev` is running.
- Confirm the ports are listening: `127.0.0.1:4437` (Durable Streams) and `127.0.0.1:4438` (Yjs server).
- Hard refresh the browser after server restarts.
- Check the browser network tab for Yjs requests to `/v1/yjs/y-llm-demo-v2/docs/rooms/<doc>/v3/collaboration?...`.
- Verify `OPENAI_API_KEY` exists in `.env`.
- Restart `pnpm dev` after changing env vars.
- Check the `/api/chat` response body for server error text.
- Verify `OPENAI_API_KEY` and `OPENAI_MODEL` in `.env`.
- Start with `gpt-5.4`, the current baseline used in this repo.
- Re-run `pnpm test:evals`.
- The server route now auto-creates chat streams on read.
- If you still see a 404, restart the dev server and hard refresh.