Fork sidebars into separate chats without disrupting your main conversation flow.
It also uses json-render to let LLMs reliably call your custom React components like this AlgorithmVisualizer!
- apps/backend: FastAPI app managed with uv
- apps/frontend: Vite + React infinite-canvas UI
- apps/imageservice: Express service for SVG/PNG image rendering
Install dependencies:
```bash
cd apps/backend
uv sync
```

Set environment variables before running:
```bash
export OPENAI_API_KEY=your_key_here
export OPENAI_MODEL=your_model_here
export OPENAI_MODEL_OPTIONS=gpt-4.1-mini,gpt-4.1,gpt-4o-mini
```

Run the API:
```bash
cd apps/backend
uv run uvicorn app.main:app --reload
```

The backend listens on http://127.0.0.1:8000 and exposes:

- GET /api/health
- GET /api/chat/models
- POST /api/chat/stream
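As an illustration, a minimal Python client for the streaming endpoint might look like the sketch below. The request schema (a `model` string plus a `messages` list) is an assumption, not confirmed by this README; check the FastAPI request model in apps/backend for the real shape.

```python
import json
import urllib.request

BASE_URL = "http://127.0.0.1:8000"


def build_chat_request(messages, model):
    """Build the JSON payload for POST /api/chat/stream.

    The field names here (model, messages) are assumptions about the
    backend's schema, not taken from this README.
    """
    return {"model": model, "messages": messages}


def stream_chat(messages, model="gpt-4.1-mini"):
    """POST to /api/chat/stream and yield response chunks as they arrive."""
    payload = json.dumps(build_chat_request(messages, model)).encode()
    req = urllib.request.Request(
        f"{BASE_URL}/api/chat/stream",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        while chunk := resp.read(1024):
            yield chunk.decode("utf-8", errors="replace")


if __name__ == "__main__":
    # Requires the backend to be running locally.
    for chunk in stream_chat([{"role": "user", "content": "Hello"}]):
        print(chunk, end="", flush=True)
```

The model name passed here should be one of the values exposed by GET /api/chat/models.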
Install dependencies:
```bash
cd apps/frontend
npm install
```

Run the dev server:
```bash
cd apps/frontend
npm run dev
```

The frontend listens on http://127.0.0.1:5173 and proxies /api requests to the backend.
Install dependencies:
```bash
cd apps/imageservice
npm install
```

Run the service:
```bash
cd apps/imageservice
npm start
```

The image service listens on http://127.0.0.1:3001.
- Pannable and zoomable infinite canvas
- Multiple draggable chat windows
- Streaming model responses
- Per-window model picker with backend-driven model options
- Model badge on assistant responses
- Phrase-level branching from any completed user or assistant message
- Recursive child chats with inherited parent history snapshots
- Connector lines from the anchored phrase to the child chat window
- Cascade-close confirmation when a parent has descendants
- Starter questions for new chats
- Code syntax highlighting in messages
- LLM-driven interactive React components rendered inline in chat via json-render
- Cards, charts, tables, tabs, alerts, progress bars, and more
- AlgorithmStepper: step-through algorithm walkthroughs
- AlgorithmVisualizer: synchronized code + graph visualization with step highlighting
- Diagram component for trees, graphs, and state machines (nodes/edges)
- Image rendering service for static SVG/PNG generation
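To give a feel for the component-rendering flow, here is a hedged sketch of the kind of JSON tree an LLM might emit for inline rendering. The exact schema (the `component`/`props`/`children` keys and prop names) is an assumption for illustration, not taken from this README or the json-render docs:

```json
{
  "component": "Card",
  "props": { "title": "Dijkstra's algorithm" },
  "children": [
    {
      "component": "AlgorithmStepper",
      "props": {
        "steps": [
          "Initialize all distances to infinity",
          "Pop the closest unvisited node",
          "Relax the edges of its neighbors"
        ]
      }
    }
  ]
}
```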
- Start the backend.
- Start the frontend.
- Open http://127.0.0.1:5173.
- Send a message in Chat 1.
- Highlight a phrase in any completed message and click Branch.
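The quick-start steps above can be sanity-checked with a small script that probes the three services at the URLs documented in this README. Only /api/health is a documented route; treating the frontend and image-service roots as liveness probes is an assumption.

```python
import urllib.error
import urllib.request

# URLs taken from this README; only /api/health is a documented route,
# the other two are just the service roots.
SERVICES = {
    "backend": "http://127.0.0.1:8000/api/health",
    "frontend": "http://127.0.0.1:5173",
    "imageservice": "http://127.0.0.1:3001",
}


def check(url, timeout=2.0):
    """Return True if the URL answers with any HTTP status at all."""
    try:
        urllib.request.urlopen(url, timeout=timeout)
        return True
    except urllib.error.HTTPError:
        return True  # the server is up, even if this route 404s
    except (urllib.error.URLError, OSError):
        return False  # connection refused / timed out: service is down


if __name__ == "__main__":
    for name, url in SERVICES.items():
        status = "up" if check(url) else "down"
        print(f"{name:12s} {status}  ({url})")
```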