diff --git a/site/public/llms-full.txt b/site/public/llms-full.txt index 7c61986b28..66899ea49f 100644 --- a/site/public/llms-full.txt +++ b/site/public/llms-full.txt @@ -6303,42 +6303,49 @@ Follow the [Railway Quick Start guide](https://docs.railway.com/quick-start) to ## Challenges of Building AI Agents -Common tools: LangChain, OpenAI SDK, custom state management systems, Redis for memory storage - Main pain points: -- Managing conversation history and agent memory across sessions requires external databases -- Keeping context and state synchronized while handling multiple concurrent conversations -- High latency from database round trips to fetch conversation history -- Complex infrastructure for realtime streaming responses to users -- Handling tool calls and maintaining agent state during long-running operations + +- **Long-running & complex workflows**: Orchestrating multi-step tool calls, handling timeouts, and maintaining state consistency across operations +- **Fault tolerance & recovery**: Gracefully handling agent failures, network issues, and LLM API errors without losing conversation state +- **Real-time streaming**: Building infrastructure for streaming responses while handling backpressure and connection management +- **State persistence & isolation**: Managing conversation history and agent memory per user without complex database setups +- **Observability & debugging**: Understanding agent decision-making, tracking tool usage, and debugging complex behaviors ## How Rivet Solves This -Rivet makes building stateful AI agents simple by providing durable actors that keep conversation history and agent state in memory without external databases. +Rivet provides a complete actor-based runtime designed specifically for stateful AI agents, addressing each challenge: -**Persistent Memory**: Each agent actor maintains its own state including conversation history, tool results, and context. State survives restarts and deployments automatically. 
+**Long-running & complex workflows**: Actors naturally handle multi-step operations with built-in state management. Tool calls execute within the actor context, maintaining consistency across all operations. Workflows can span hours or days without losing state.

```typescript
+const aiAgent = actor({
+  state: { messages: [] as Message[] },
+
+  actions: {
+    sendMessage: async (c, message: string) => {
+      c.state.messages.push({ role: "user", content: message });
+
+      const response = await callLLM(c.state.messages);
+      c.state.messages.push({ role: "assistant", content: response });
+
+      c.broadcast("message", { role: "assistant", content: response });
+      return response;
+    },
+  },
+});
```

-**Realtime Streaming**: Use WebSocket connections to stream LLM responses in realtime without additional infrastructure. Learn more about [events](/docs/actors/events).

+**Fault tolerance & recovery**: State automatically persists to durable storage. If an agent crashes, it resumes exactly where it left off with full conversation history and context intact. Network failures and LLM API errors don't lose progress.

-**Tool Integration**: Execute tool calls within the actor context while maintaining state consistency. See [actions](/docs/actors/actions) for more details.

+**Real-time streaming**: Built-in WebSocket support with automatic connection management. Stream LLM responses directly to clients without building custom infrastructure. Backpressure and reconnection are handled automatically. Learn more about [events](/docs/actors/events).
+
+```typescript
+// Stream responses directly to connected clients
+c.broadcast("stream", { chunk });
+```
+
+**State persistence & isolation**: Each agent actor is automatically isolated per user/conversation. State persists without external databases - conversation history, tool results, and context are maintained in-actor memory with automatic durability. Read about [actor lifecycle](/docs/actors/lifecycle).
+
+**Observability & debugging**: Full visibility into agent behavior through structured logging, metrics, and state inspection. Track every tool call, decision point, and state change. Debug production issues with complete audit trails.
+ +```typescript +// Automatic tracing of all actions and state changes +c.log("Tool executed", { tool, result }); +``` -**No Cold Starts**: Agents hibernate when idle and wake instantly when needed, keeping conversation context ready without paying for idle time. Read about [actor lifecycle](/docs/actors/lifecycle). +**Bonus - No cold starts**: Agents hibernate when idle and wake instantly when needed, keeping conversation context ready without paying for idle compute. See [actions](/docs/actors/actions) for more details. ## Full Example Projects ## Background Jobs diff --git a/site/public/llms.txt b/site/public/llms.txt index 0ce9a10368..0b1c9e9a3c 100644 --- a/site/public/llms.txt +++ b/site/public/llms.txt @@ -23,6 +23,7 @@ https://rivet.dev/blog/2025-1-12-rivet-inspector https://rivet.dev/blog/2025-10-01-railway-selfhost https://rivet.dev/blog/2025-10-05-weekly-updates https://rivet.dev/blog/2025-10-09-rivet-cloud-launch +https://rivet.dev/blog/2025-10-17-rivet-actors-vercel https://rivet.dev/blog/godot-multiplayer-compared-to-unity https://rivet.dev/changelog https://rivet.dev/changelog.json @@ -46,6 +47,7 @@ https://rivet.dev/changelog/2025-1-12-rivet-inspector https://rivet.dev/changelog/2025-10-01-railway-selfhost https://rivet.dev/changelog/2025-10-05-weekly-updates https://rivet.dev/changelog/2025-10-09-rivet-cloud-launch +https://rivet.dev/changelog/2025-10-17-rivet-actors-vercel https://rivet.dev/changelog/godot-multiplayer-compared-to-unity https://rivet.dev/cloud https://rivet.dev/docs/actors diff --git a/site/src/posts/2025-10-17-rivet-actors-vercel/image.png b/site/src/posts/2025-10-17-rivet-actors-vercel/image.png new file mode 100644 index 0000000000..c125eaaa72 Binary files a/site/src/posts/2025-10-17-rivet-actors-vercel/image.png and b/site/src/posts/2025-10-17-rivet-actors-vercel/image.png differ diff --git a/site/src/posts/2025-10-17-rivet-actors-vercel/page.mdx b/site/src/posts/2025-10-17-rivet-actors-vercel/page.mdx new file mode 100644 index 0000000000..fb1dcddde6 --- /dev/null +++ 
b/site/src/posts/2025-10-17-rivet-actors-vercel/page.mdx @@ -0,0 +1,220 @@ +export const author = "nicholas-kissel"
+export const published = "2025-10-17"
+export const category = "changelog"
+export const keywords = ["vercel", "rivet", "actors", "durable-objects", "open-source", "serverless", "launch"]
+
+# Rivet Actors on Vercel Functions: Open-Source Alternative to Durable Objects
+
+**Rivet Actors can now run on Vercel Functions, bringing stateful, realtime workloads to Vercel's serverless platform.**
+
+Cloudflare's Durable Objects introduced a long-lived, stateful primitive to serverless apps – but they come with platform lock-in and resource constraints. Rivet Actors offer an open-source alternative, and launching on Vercel unlocks greater flexibility, more control, and a better developer experience.
+
+## How Vercel + Rivet Compares To Durable Objects
+
+| Dimension | Rivet Actors on Vercel Functions | Cloudflare Durable Objects | Why it matters |
+|---|---|---|---|
+| **Runtime** | **Standard Node.js** (Vercel Functions), full npm package support | **Custom runtime** (workerd), subset of Node.js APIs, partial npm package support | Using standard Node.js makes packages and frameworks "just work" and reduces vendor lock-in. |
| **Memory / CPU per actor** | **Configurable** up to 4 GB / 2 vCPU | Per-isolate memory cap of **128 MB** | Memory-heavy workloads are more feasible on Vercel than on Cloudflare |
+| **Regional control** | **Configurable**: pin actors to specific regions | DOs can be restricted to broad jurisdictions and accept location hints, though placement control is **limited** | Explicit control helps reduce latency and meet compliance requirements |
+| **Lock-in / portability** | **Open-source actor library** designed to be portable across standard runtimes/clouds | DOs run on Cloudflare's runtime and APIs, **not open-source, not portable** | Open source + standard runtimes provide flexibility, enable migrations, and allow on-prem deployments |
+
+## What Are Rivet Actors?
+
+Similar to Durable Objects, Rivet Actors are long-lived, stateful actors with **storage, real-time (WebSockets/SSE), and hibernation** built in. Unlike Durable Objects, however, Rivet is open-source and portable — you can self-host or run on any platform.
+
+- **Long-Lived, Stateful Compute**: Each unit of compute is like a tiny server that remembers things between requests – no need to re-fetch data from a database or worry about timeouts. Think AWS Lambda, but with persistent memory between invocations.
+
+- **Realtime, Made Simple**: Update state and broadcast changes in realtime with WebSockets or SSE. No external pub/sub systems, no polling – just built-in low-latency events.
+
+- **No Database Round Trips**: State is stored on the same machine as your compute, so reads and writes are ultra-fast. No database round trips, no latency spikes.
+
+- **Sleep When Idle, No Cold Starts**: Actors automatically hibernate when idle and wake up instantly on demand with zero cold start delay. You only pay for active compute time.
+
+- **Architected For Insane Scale**: Automatically scale from zero to millions of concurrent actors, paying only for what you use.
+ +- **No Vendor Lock-In**: Open-source and fully self-hostable. + +## Quick Example: Building an AI Agent with Rivet Actors + +Here's how simple it is to build a stateful AI agent using Rivet Actors on Vercel. This example demonstrates an AI chatbot that maintains conversation history, processes messages with OpenAI, and broadcasts updates to all connected clients in real-time – all without managing databases or WebSocket infrastructure. + + + +```typescript {{"title":"src/app/rivet/registry.ts"}} +import { actor, setup } from "rivetkit"; +import { openai } from "@ai-sdk/openai"; +import { generateText, tool } from "ai"; +import { z } from "zod"; +import { type Message, getWeather } from "./utils"; + +// Create an actor for every agent +export const aiAgent = actor({ + // Persistent state that survives restarts + state: { + messages: [] as Message[], + }, + + // Actions are callable by your frontend or backend + actions: { + getMessages: (c) => c.state.messages, + + sendMessage: async (c, userMessage: string) => { + const userMsg: Message = { + role: "user", + content: userMessage, + timestamp: Date.now(), + }; + + // State changes are automatically persisted + c.state.messages.push(userMsg); + + const { text } = await generateText({ + model: openai("gpt-4o"), + prompt: userMessage, + messages: c.state.messages, + tools: { + weather: tool({ + description: "Get the weather in a location", + parameters: z.object({ + location: z + .string() + .describe("The location to get the weather for"), + }), + execute: async ({ location }) => { + return await getWeather(location); + }, + }), + }, + }); + + const assistantMsg: Message = { + role: "assistant", + content: text, + timestamp: Date.now(), + }; + c.state.messages.push(assistantMsg); + + // Send realtime events to all connected clients + c.broadcast("messageReceived", assistantMsg); + + return assistantMsg; + }, + }, +}); + +export const registry = setup({ + use: { aiAgent }, +}); +``` + +```typescript 
{{"title":"src/app/components/AgentChat.tsx"}} +import { createRivetKit } from "@rivetkit/next-js/client"; +import { useEffect, useState } from "react"; +import { registry } from "../rivet/registry"; +import type { Message } from "../backend/types"; + +const { useActor } = createRivetKit(); + +export function AgentChat() { + // Connect to the actor + const aiAgent = useActor({ + name: "aiAgent", + key: ["default"], + }); + + const [messages, setMessages] = useState([]); + const [input, setInput] = useState(""); + const [isLoading, setIsLoading] = useState(false); + + // Fetch initial messages on load + useEffect(() => { + if (aiAgent.connection) { + aiAgent.connection.getMessages().then(setMessages); + } + }, [aiAgent.connection]); + + // Subscribe to realtime events + aiAgent.useEvent("messageReceived", (message: Message) => { + setMessages((prev) => [...prev, message]); + setIsLoading(false); + }); + + const handleSendMessage = async () => { + if (aiAgent.connection && input.trim()) { + setIsLoading(true); + + const userMessage = { role: "user", content: input, timestamp: Date.now() } as Message; + setMessages((prev) => [...prev, userMessage]); + + await aiAgent.connection.sendMessage(input); + setInput(""); + } + }; + + return ( +
+
+ {messages.length === 0 ? ( +
+ Ask the AI assistant a question to get started +
+ ) : ( + messages.map((msg, i) => ( +
+
{msg.role === "user" ? "👤" : "🤖"}
+
{msg.content}
+
+ )) + )} + {isLoading && ( +
+
🤖
+
Thinking...
+
+ )} +
+ +
+ setInput(e.target.value)} + onKeyPress={(e) => e.key === "Enter" && handleSendMessage()} + placeholder="Ask the AI assistant..." + disabled={isLoading} + /> + +
+
+ ); +} + +``` + +
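A note for readers skimming the example: the agent's conversation history is plain in-memory data, not rows fetched from a database. The sketch below is illustrative only; `Message` mirrors the shape used above, and `appendTurn` is a hypothetical helper rather than a RivetKit API. It shows the state change that a `sendMessage` call effectively performs:

```typescript
// Message mirrors the shape used in the registry example above
type Message = {
  role: "user" | "assistant";
  content: string;
  timestamp: number;
};

// Hypothetical helper: the mutation sendMessage performs is just
// appending a user turn and an assistant turn to an in-memory array.
function appendTurn(
  messages: Message[],
  userContent: string,
  assistantContent: string,
): Message[] {
  const now = Date.now();
  return [
    ...messages,
    { role: "user", content: userContent, timestamp: now },
    { role: "assistant", content: assistantContent, timestamp: now },
  ];
}

const history = appendTurn([], "What's the weather in Tokyo?", "Sunny, 24°C.");
// history now holds the user turn followed by the assistant turn
```

Because the actor keeps this array in memory and persists it durably, a read like `getMessages` is a local array access rather than a database query.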
+ +## Why run Rivet on Vercel? + +Running Rivet Actors on Vercel sidesteps several constraints with Cloudflare Durable Objects: + +- **Industry standard runtime**: Runs on Node.js today (with Bun coming soon to Vercel Functions). No proprietary runtime, use the Node APIs and NPM packages you already know. +- **More memory & CPU**: Vercel Functions let you configure memory and vCPU per function (up to 4 GB RAM, 2 vCPU cores). This enables larger PDF processing, multiplayer games, agent context, and heavier compute that would exceed typical isolate caps. +- **Developer experience**: Deploy actors alongside your Next.js app in one place, leverage Vercel's best-in-class observability, and use Rivet's built-in inspector for deep visibility into your actors. +- **Regions you choose.** Set regions directly to keep actors close to your primary database or within compliance boundaries. +- **Portability by design.** Your actor logic is built on RivetKit — an open-source library — and runs on industry standard runtimes. If your needs change, you can move between clouds or self-host without rewriting your application. + +## Getting started on Vercel + +1. Sign in to [Rivet Cloud](https://dashboard.rivet.dev) or [self-host Rivet](/docs/self-hosting) +2. Select _Vercel_ +3. Follow instructions to deploy the [Vercel starter template](https://github.com/rivet-dev/template-vercel) or [integrate Rivet into your existing Next.js application](https://www.rivet.dev/docs/actors/quickstart/next-js/) + +## Support + +- [GitHub Issues](https://github.com/rivet-dev/rivet/issues) +- [Discord](https://rivet.dev/discord) +