A React app that runs an ElevenLabs Conversational AI agent over WebRTC, with a layout inspired by ElevenLabs UI:

- Center: main agent view with an animated orb that reacts to mic input and agent output
- Right side: live transcript, summary, and audio level bars (mic + agent)
## Install dependencies

Already done if you cloned the repo:

```bash
npm install
```
## Configure your agent

Copy `.env.example` to `.env` and set your agent ID from ElevenLabs Conversational AI:

```bash
cp .env.example .env
```

Edit `.env`:

```
VITE_ELEVENLABS_AGENT_ID=your_agent_id_here
```

Use a public agent, or implement a backend that returns a signed URL (WebSocket) or a conversation token (WebRTC) and call it from the app.
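For the private-agent route, the server-side token exchange can be sketched like this. The endpoint path, `xi-api-key` header, and response shape are assumptions based on the ElevenLabs API; verify them against the current API reference:

```typescript
// Server-side sketch: exchange your ElevenLabs API key (never shipped
// to the browser) for a short-lived WebRTC conversation token.
// Endpoint path and response shape are assumptions — check the docs.
const TOKEN_ENDPOINT = "https://api.elevenlabs.io/v1/convai/conversation/token";

export function tokenUrl(agentId: string): string {
  return `${TOKEN_ENDPOINT}?agent_id=${encodeURIComponent(agentId)}`;
}

export async function getConversationToken(
  agentId: string,
  apiKey: string,
): Promise<string> {
  const res = await fetch(tokenUrl(agentId), {
    headers: { "xi-api-key": apiKey },
  });
  if (!res.ok) throw new Error(`Token request failed: ${res.status}`);
  const body = (await res.json()) as { token: string };
  return body.token;
}
```

The client would then call your endpoint and pass the returned token to the session instead of a public agent ID.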
## Enable client events (for transcript)

In the ElevenLabs dashboard, open your agent → Advanced → enable Client events so `onMessage` receives transcriptions and LLM replies.
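With client events enabled, the transcript panel can be fed from `onMessage`. A minimal sketch, assuming messages carry `source` and `message` fields (check the SDK's types for the exact shape):

```typescript
// Pure helper behind the transcript panel: classify an incoming client
// event by its source and append it to the transcript state.
// The `source`/`message` field names are assumptions about the SDK's
// message shape, not confirmed signatures.
export type TranscriptEntry = { role: "user" | "agent"; text: string };

export function appendMessage(
  transcript: TranscriptEntry[],
  msg: { source: string; message: string },
): TranscriptEntry[] {
  const role = msg.source === "user" ? "user" : "agent";
  return [...transcript, { role, text: msg.message }];
}
```

Wired up, this would sit in something like `useConversation({ onMessage: (m) => setTranscript((t) => appendMessage(t, m)) })`.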
## Run the dev server

```bash
npm run dev
```

Then open the URL shown (e.g. http://localhost:5173). Allow microphone access when prompted and click Start conversation.
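Before starting a conversation, a small guard avoids a confusing silent failure when `.env` is missing. The helper name here is ours, not part of the SDK:

```typescript
// Fail fast with a clear message if the agent ID is unset or still the
// placeholder value from .env.example. (Helper name is ours.)
export function requireAgentId(env: Record<string, string | undefined>): string {
  const id = env.VITE_ELEVENLABS_AGENT_ID;
  if (!id || id === "your_agent_id_here") {
    throw new Error(
      "Set VITE_ELEVENLABS_AGENT_ID in .env before starting a conversation",
    );
  }
  return id;
}

// e.g. call requireAgentId(import.meta.env) before starting the session
```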
## Build for production

```bash
npm run build
npm run preview # optional: preview the production build
```

## Apple Vision Pro (visionOS)

This app is set up with WebSpatial so you can run it as a Packaged WebSpatial App on Apple Vision Pro (visionOS).
### Prerequisites

- Mac with Xcode and the visionOS Simulator (Xcode → Settings → Platforms → install visionOS + visionOS Simulator)
- Icons: add PWA icons so the visionOS app can be built. In `public/icons/` add:
  - `icon-512.png` (512×512, purpose: any)
  - `icon-1024-maskable.png` (1024×1024, maskable, no transparency)

  Sample icons: WebSpatial icon examples
### Run in the visionOS simulator

1. Start the WebSpatial dev server (serves the visionOS-specific build with hot reload):

   ```bash
   npm run dev:avp
   ```

   Note the URL (e.g. `http://localhost:5173/webspatial/avp/`, or another port).

2. In another terminal, run the WebSpatial Builder to package and launch the app in the visionOS simulator:

   ```bash
   XR_DEV_SERVER=http://localhost:5173/webspatial/avp/ npm run run:avp
   ```

   Use the same origin and path as the URL from step 1. The simulator will start, install the app, and load your site from the dev server.
### Signed .ipa for your device

Set `XR_PRE_SERVER` (optional: the base URL of your deployed site, or leave it unset to bundle assets), `XR_BUNDLE_ID`, and `XR_TEAM_ID` (Apple Developer), then:

```bash
npm run build:avp:ipa
```
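Put together, a build invocation might look like this. The bundle ID and team ID below are placeholders, not real values:

```bash
# Placeholder values — substitute your own bundle ID and Apple team ID.
XR_BUNDLE_ID=com.example.convai-agent \
XR_TEAM_ID=ABCDE12345 \
npm run build:avp:ipa
```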
### Production build (visionOS-specific assets only)

```bash
npm run build:avp
```

Output is under `dist/webspatial/avp/`. Deploy that path so the Packaged WebSpatial App can load it (e.g. with `--base` when building the .ipa).

For more options and publishing to App Store Connect, see the WebSpatial Builder docs.
## Tech stack

- Vite + React + TypeScript
- @elevenlabs/react — `useConversation`, WebRTC, `onMessage`, volume/frequency helpers
- Tailwind CSS — layout and styling
- lucide-react — icons
- WebSpatial (`@webspatial/react-sdk`, `@webspatial/vite-plugin`, `@webspatial/builder`) — run on visionOS as a Packaged WebSpatial App
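The volume helpers mentioned above drive the orb animation. Mapping a 0–1 volume reading to an orb scale factor can be as simple as the following; the easing curve and scale range are our own choices, not part of the SDK:

```typescript
// Map a 0–1 volume reading (e.g. from the SDK's volume helpers) to a
// CSS scale factor for the orb. sqrt easing lifts quiet signals so the
// orb visibly reacts to soft speech.
export function orbScale(volume: number, min = 1, max = 1.6): number {
  const v = Math.min(1, Math.max(0, volume)); // clamp to [0, 1]
  return min + (max - min) * Math.sqrt(v);
}
```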
## Layout

- Center: Agent orb (volume-reactive visualization) and Start/End call button
- Right column: Transcript (live), Summary (from the last assistant turns), Audio (mic + agent bar visualizer)

All of this uses the same patterns as the official ElevenLabs React/UI examples (orb, transcript, bar visualizer).
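The Summary panel's input can be sketched as a pure selection over the transcript; the turn limit is our choice:

```typescript
// Select the last few agent turns from the transcript to feed the
// Summary panel. The default of 3 turns is an arbitrary choice.
type Entry = { role: "user" | "agent"; text: string };

export function lastAgentTurns(transcript: Entry[], n = 3): string[] {
  return transcript
    .filter((e) => e.role === "agent")
    .slice(-n)
    .map((e) => e.text);
}
```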