A comprehensive development ecosystem built around Claude Code with semantic memory, live streaming integration, and AI-powered tools.
- **Semantic Memory System** - Perfect recall for conversations and chat history using vector embeddings
- **Live Streaming Integration** - Twitch chat embedding, OBS control, avatar expressions
- **AI Audio Generation** - ElevenLabs streaming with real-time voice synthesis
- **Claude Code Enhancement** - Context preservation, automatic conversation embedding
- **MCP Server Ecosystem** - Multiple Model Context Protocol servers for various integrations
- `packages/semantic-memory` - Vector-based semantic memory with Mastra integration
  - `packages/client` - Semantic memory client library
  - `packages/mcp-server` - MCP server for semantic search and recall
- `packages/openai-complete-mcp` - OpenAI integration for Claude Code
- `packages/twitch-embedder` - Real-time Twitch chat embedding service
- `packages/elevenlabs-streaming` - ElevenLabs audio streaming
  - `packages/client` - WebSocket client with buffered playback
  - `packages/mcp-server` - MCP server with streaming support
- `packages/rustybutter-avatar` - Avatar expression controller for OBS
- `packages/pump-chat` - Pump.fun chat integration
  - `packages/client` - WebSocket client for chat rooms
  - `packages/mcp-server` - MCP server for pump.fun
- `packages/ABIDE` - Automated Browser IDE
  - `packages/app` - Frontend application
  - `packages/server` - WebSocket server
  - `packages/mcp-server` - MCP integration
- `packages/eslint-config` - Shared ESLint configuration
- `packages/prettier-config` - Shared Prettier configuration
- `packages/tsconfig` - Shared TypeScript configuration
- Node.js >= 16.0.0
- pnpm >= 8.0.0
- OpenAI API key
- ElevenLabs API key (optional)
```bash
# Install dependencies for all packages
pnpm install

# Build all packages
pnpm run build
```

Configure your API keys in the `.env` file:

```bash
# Copy the example file
cp .env.example .env

# Edit .env and add your API keys:
OPENAI_API_KEY=your-actual-openai-key
ELEVENLABS_API_KEY=your-actual-elevenlabs-key

# All other variables are pre-configured with sensible defaults
```

```bash
# Launch Claude Code with all environment variables loaded
pnpm claude

# OR manually source environment and run claude
source scripts/load-env.sh
claude
```

This automatically:

- ✅ Loads all environment variables from `.env`
- ✅ Starts Claude Code with full MCP server configuration
- ✅ Enables semantic memory, Twitch integration, avatar control, and more
Run commands in specific workspaces:

```bash
# Run command in specific package
pnpm --filter <package-name> <command>

# Examples:
pnpm --filter @elevenlabs-streaming/client build
pnpm --filter @pump-chat/mcp-server dev
pnpm --filter @abide/app dev
```

Individual packages can be published to npm:

```bash
# Publish a specific package
pnpm --filter <package-name> publish
```

Packages within the monorepo reference each other using the `workspace:*` protocol:

- Within `elevenlabs-streaming`: `@elevenlabs-streaming/mcp-server` depends on `@elevenlabs-streaming/client`
- Within `pump-chat`: `@pump-chat/mcp-server` depends on `@pump-chat/client`

These dependencies are automatically linked by pnpm during installation.
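For illustration, a `workspace:*` dependency in a package manifest looks like the following sketch (the exact `package.json` contents are an assumption, not copied from the repo):

```json
{
  "name": "@elevenlabs-streaming/mcp-server",
  "dependencies": {
    "@elevenlabs-streaming/client": "workspace:*"
  }
}
```

When a package is published, pnpm rewrites `workspace:*` to the linked package's current version, so published tarballs never contain the workspace protocol.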
All MCP servers use environment variables for configuration. The project supports centralized `.env` file management.

```bash
# Twitch Configuration
TWITCH_USERNAME=rustybutterbot
TWITCH_OAUTH_TOKEN=oauth:your-twitch-token
TWITCH_CHANNEL=codingbutter

# OpenAI Configuration
OPENAI_API_KEY=your-openai-api-key

# ElevenLabs Configuration
ELEVENLABS_API_KEY=your-elevenlabs-api-key
ELEVENLABS_VOICE_ID=Au8OOcCmvsCaQpmULvvQ
ELEVENLABS_MODEL_ID=eleven_flash_v2
ELEVENLABS_OUTPUT_FORMAT=mp3_44100_64
ELEVENLABS_STABILITY=0.5
ELEVENLABS_SIMILARITY_BOOST=0.75
ELEVENLABS_STYLE=0.1

# Avatar & Streaming
AVATAR_SERVER_HOST=localhost
AVATAR_SERVER_PORT=3000
OBS_WEBSOCKET_URL=ws://172.25.208.1:4455

# Paths & Configuration
PROJECT_ROOT=/home/codingbutter/GitHub/rusty-butter
SEMANTIC_MEMORY_DB_PATH=${PROJECT_ROOT}/semantic_memory_db
CHAT_EMBEDDER_INTERVAL=30000
```

- `pnpm claude` - Launch Claude Code with environment loaded
- `pnpm claude:show-env` - Display all project environment variables
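As a sketch of how a server can consume these variables, the snippet below reads the ElevenLabs settings with fallbacks to the defaults listed above (the `envOr` helper and the config shape are illustrative, not the project's actual code):

```typescript
// Read an environment variable, falling back to a default when unset or empty.
// Variable names mirror the .env entries above; the helper itself is illustrative.
function envOr(name: string, fallback: string): string {
  const value = process.env[name];
  return value !== undefined && value !== "" ? value : fallback;
}

// Assemble a config object from the environment, using the documented defaults.
const elevenLabsConfig = {
  apiKey: envOr("ELEVENLABS_API_KEY", ""),
  voiceId: envOr("ELEVENLABS_VOICE_ID", "Au8OOcCmvsCaQpmULvvQ"),
  modelId: envOr("ELEVENLABS_MODEL_ID", "eleven_flash_v2"),
  stability: Number(envOr("ELEVENLABS_STABILITY", "0.5")),
};
```

Because every setting has a default, a server built this way only hard-requires the API keys.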
The project includes a powerful semantic memory system that provides "perfect recall" for conversations and chat history:
- **Vector Embeddings**: Uses OpenAI's text-embedding-3-small for semantic similarity
- **Mastra Integration**: Built on Mastra's LibSQL vector database for performance
- **Real-time Embedding**: Twitch chat messages are automatically embedded as they arrive
- **Context Preservation**: Claude Code conversations are preserved before context compaction
- **Semantic Search**: Search by meaning, not just keywords
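Search-by-meaning boils down to comparing embedding vectors instead of keywords. A minimal sketch of the ranking step follows; the real system delegates embedding to OpenAI and storage/ranking to Mastra's LibSQL store, so these function names are illustrative only:

```typescript
// Cosine similarity between two embedding vectors:
// ~1 means semantically close, ~0 means unrelated.
function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Rank stored embeddings against a query embedding, best match first.
function rankBySimilarity(
  query: number[],
  docs: { id: string; embedding: number[] }[]
): { id: string; score: number }[] {
  return docs
    .map((d) => ({ id: d.id, score: cosineSimilarity(query, d.embedding) }))
    .sort((x, y) => y.score - x.score);
}
```

This is why a query like "cookie talking about mastra" can match messages that never contain those exact words: nearby vectors score high regardless of keyword overlap.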
```bash
# Search semantic memory
mcp__semantic-memory__semantic_search "query about embeddings"

# Recall with context
mcp__semantic-memory__recall "chat-history" "cookie talking about mastra"

# Get memory statistics
mcp__semantic-memory__get_stats
```

The Twitch embedder is a real-time service that embeds Twitch chat messages into semantic memory:
```bash
# Check service status
systemctl --user status twitch-embedder

# View logs
journalctl --user -u twitch-embedder -f
```

Control OBS avatar expressions via MCP:
```bash
# Set avatar expression
mcp__rustybutter-avatar__setAvatarExpression "excited"

# List available expressions
mcp__rustybutter-avatar__listAvatarExpressions
```

- Never commit real API keys to version control
- Use `.env.example` as a template with placeholder values
- Real API keys stay in your local `.env` file
- Environment variables are only loaded when using `pnpm claude`