A Twitch-style streaming interface for AI agents powered by OpenClaw. Watch AI agents code live with animated Live2D avatars!
- Live Terminal: Real-time view of what the AI agent is doing
- Live2D Avatar: Animated character that reacts to agent state (thinking, coding, idle)
- Chat System: Interact with the agent and other viewers
- OpenClaw Integration: Connects to the OpenClaw Gateway for real AI responses
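The avatar's reaction to agent state can be sketched as a simple lookup. The state names come from the feature list above, but the expression/motion names are illustrative, not the actual `src/main.js` API:

```javascript
// Map agent activity to an avatar reaction (expression/motion names are illustrative).
const AVATAR_REACTIONS = {
  thinking: { expression: "ponder", motion: "head_tilt" },
  coding: { expression: "focused", motion: "typing" },
  idle: { expression: "neutral", motion: "sway" },
};

// Resolve the reaction for an incoming agent state, falling back to idle.
function reactionFor(state) {
  return AVATAR_REACTIONS[state] ?? AVATAR_REACTIONS.idle;
}

console.log(reactionFor("coding").motion); // typing
console.log(reactionFor("unknown").expression); // neutral
```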
```bash
# Requires Node.js 22+
node --version   # Check you have v22+

# Install OpenClaw globally
npm install -g openclaw@latest

# Run the setup wizard (will ask for Anthropic/OpenAI API keys)
openclaw onboard --install-daemon

# Start the gateway
openclaw gateway --port 18789 --verbose
```

Keep this terminal open!
```bash
git clone https://github.com/RickEth137/Lobster.git
cd Lobster
npm install
npm run dev
```

Open http://localhost:3000 in your browser.
For real OpenClaw integration, start the bridge as well:

```bash
# In a new terminal
npm run server
```

Project structure:

```
Lobster/
├── index.html       # Main HTML entry
├── standalone.html  # CDN-only version (no build needed)
├── src/
│   ├── main.js      # Terminal, Avatar, Chat logic
│   └── styles.css   # Twitch-inspired dark theme
├── server/
│   └── index.js     # OpenClaw WebSocket bridge
├── package.json
└── vite.config.js
```
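The bridge in `server/index.js` essentially relays gateway traffic to the right part of the UI. A minimal sketch of that routing, assuming a JSON message shape with a `type` field (the real OpenClaw protocol may differ):

```javascript
// Route a raw gateway message to a UI channel (message shape is an assumption).
function routeMessage(raw) {
  const msg = JSON.parse(raw);
  switch (msg.type) {
    case "output": // agent terminal output → the xterm.js pane
      return { channel: "terminal", payload: msg.text };
    case "status": // thinking/coding/idle → the avatar animation
      return { channel: "avatar", payload: msg.state };
    default: // anything else → the chat log
      return { channel: "chat", payload: msg };
  }
}

console.log(routeMessage('{"type":"status","state":"thinking"}').channel); // avatar
console.log(routeMessage('{"type":"output","text":"npm test"}').payload); // npm test
```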
- Frontend: Vite, xterm.js, pixi.js, pixi-live2d-display
- Backend: Express, Socket.IO, WebSocket
- AI: OpenClaw Gateway (`ws://127.0.0.1:18789`)
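The backend presumably resolves the gateway address from the environment, falling back to the default above. A hedged sketch, with variable names matching the `.env` keys from the configuration section:

```javascript
// Resolve gateway settings, defaulting to the local gateway (names assumed).
function gatewayConfig(env = process.env) {
  return {
    url: env.OPENCLAW_GATEWAY ?? "ws://127.0.0.1:18789",
    token: env.OPENCLAW_TOKEN ?? "", // empty when no auth is needed
  };
}

console.log(gatewayConfig({}).url); // ws://127.0.0.1:18789
console.log(gatewayConfig({ OPENCLAW_GATEWAY: "ws://example:9999" }).url); // ws://example:9999
```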
Create a `.env` file (optional):

```
OPENCLAW_GATEWAY=ws://127.0.0.1:18789
OPENCLAW_TOKEN=your_token_if_needed
```

Available scripts:

```bash
npm run dev     # Start Vite dev server (frontend)
npm run build   # Build for production
npm run server  # Start backend server (OpenClaw bridge)
```

To use a custom Live2D model:

- Download a model from the Live2D samples
- Create a `public/models/` folder
- Extract the model files there
- Update the model path in `src/main.js`
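Live2D Cubism models are described by a `*.model3.json` file, so the path update in `src/main.js` usually just points at that file under `public/models/`. A hypothetical helper (folder layout assumed):

```javascript
// Hypothetical: build the URL of a model placed under public/models/<name>/.
function modelUrl(name) {
  return `/models/${name}/${name}.model3.json`;
}

console.log(modelUrl("Haru")); // /models/Haru/Haru.model3.json
```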
- Basic UI with terminal, avatar, chat
- Real OpenClaw WebSocket streaming
- Live2D lip-sync with TTS (ElevenLabs)
- Voice input (talk to the agent)
- Multiple agent support
- Stream recording/replay
- Twitch/YouTube integration
Built for OpenClaw 🦞
Made with ❤️ by vibes