A serverless Matrix bot powered by OpenAI ChatGPT, deployed on Cloudflare Workers with KV and R2 storage.
English | 简体中文
- 🤖 AI-Powered Conversations: Chat with OpenAI models in Matrix rooms
- 🔌 Custom API Providers: Support for custom OpenAI-compatible APIs (Azure, OpenRouter, local models, etc.)
- 💾 Persistent Storage: Conversation history stored in Cloudflare R2
- ⚡ Serverless: Runs entirely on Cloudflare Workers with Durable Objects
- 🌐 Global Edge Network: Low latency responses from Cloudflare's global network
- 🔒 Access Control: Admin and whitelist support
- 📝 Rich Commands: Multiple commands for configuration and management
- Cloudflare Workers: Serverless compute
- Durable Objects: Matrix sync state management
- KV Storage: Configuration and session data
- R2 Storage: Conversation history and logs
- Matrix Protocol: Client-Server API integration
- OpenAI API: Chat completions (customizable endpoint)
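As a rough sketch of how these pieces fit together, the Worker sees them as bindings on its environment. The shape below is illustrative only: the `KV` binding name and the `MatrixSync` Durable Object class appear later in this README, while the R2 and Durable Object binding names and the exact secret list are assumptions.

```ts
// Illustrative sketch of the Worker environment, not the project's actual code.
// Types come from @cloudflare/workers-types.
export interface Env {
  KV: KVNamespace;                      // configuration and session data
  R2: R2Bucket;                         // conversation history and logs (binding name assumed)
  MATRIX_SYNC: DurableObjectNamespace;  // MatrixSync Durable Object (binding name assumed)

  // Secrets set via `wrangler secret put`
  MATRIX_USER_ID: string;
  MATRIX_PASSWORD: string;
  OPENAI_API_KEY: string;
  OPENAI_BASE_URL: string;
  BOT_ADMIN_USERS: string;
}
```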
- Cloudflare account with Workers enabled (free tier supported)
- Matrix account (any homeserver)
- OpenAI API key or compatible API endpoint
Note: This bot works on Cloudflare's free tier! It uses `new_sqlite_classes` for Durable Objects, which is compatible with the free plan.
# Clone from GitHub
git clone https://github.com/yourusername/matrix-chatgpt-bot.git
cd matrix-chatgpt-bot
# Install dependencies
npm install
# Run automated setup script
./setup.sh
The setup script will:
- Create KV namespaces
- Create R2 buckets
- Generate `wrangler.toml` from the template
- Guide you through setting secrets
git clone https://github.com/yourusername/matrix-chatgpt-bot.git
cd matrix-chatgpt-bot
npm install
cp wrangler.toml.example wrangler.toml
Login to Cloudflare:
wrangler login
Create KV namespace:
wrangler kv:namespace create "KV"
wrangler kv:namespace create "KV" --preview
Create R2 bucket:
wrangler r2 bucket create matrix-bot-storage
wrangler r2 bucket create matrix-bot-storage-preview
Update `wrangler.toml` with your namespace IDs.
wrangler secret put MATRIX_USER_ID
# Enter: @yourbotuser:matrix.org
wrangler secret put MATRIX_PASSWORD
# Enter: your_bot_password
wrangler secret put OPENAI_API_KEY
# Enter: sk-your-api-key
wrangler secret put OPENAI_BASE_URL
# Enter: https://api.openai.com/v1 (or custom URL)
wrangler secret put BOT_ADMIN_USERS
# Enter: @admin1:matrix.org,@admin2:matrix.org
wrangler deploy
# Login to Matrix
curl -X POST https://your-worker.workers.dev/login
# Start sync loop
curl https://your-worker.workers.dev/start
Mention the bot or use commands starting with `!`:
- `!help` - Show available commands
- `!gpt <message>` - Chat with GPT (anyone can use, no mention needed)
- `!reset` - Clear conversation history
- `!provider` - Show current AI provider
- `!provider list` - List all available providers
- `!provider set <name>` - Switch to a different provider
- `!model <name>` - Set AI model for current room
- `!addprovider <name> <baseURL> <apiKey> [models...]` - Add new AI provider
- `!delprovider <name>` - Remove AI provider
- `!seturl <baseURL>` - Set default OpenAI base URL
- `!stats` - Show bot statistics
There are three ways to interact with the bot:
1. Use `!gpt` commands (no mention needed):
   - `!gpt what is the capital of France?`
   - `!gpt tell me a joke`
2. Mention the bot:
   - `@botuser:matrix.org what is the capital of France?`
3. Use other bot commands:
   - `!help`
   - `!reset`
   - `!provider list`
The bot supports any OpenAI-compatible API. Here are some examples:
!addprovider azure https://your-resource.openai.azure.com/openai/deployments/your-deployment YOUR_AZURE_KEY gpt-4 gpt-35-turbo
!provider set azure
!addprovider openrouter https://openrouter.ai/api/v1 YOUR_OPENROUTER_KEY gpt-4 claude-3-opus
!provider set openrouter
!addprovider local http://localhost:8000/v1 none llama-2-7b mistral-7b
!provider set local
!addprovider together https://api.together.xyz/v1 YOUR_TOGETHER_KEY meta-llama/Llama-2-70b-chat-hf
!provider set together
Each room can have its own configuration stored in KV:
{
provider: "openai", // AI provider name
model: "gpt-4", // Model name
temperature: 0.7, // Response randomness (0-2)
maxTokens: 2000, // Max response length
systemPrompt: "..." // Custom system prompt
}
Global settings are stored in KV at `config:global`:
{
defaultProvider: "openai",
defaultModel: "gpt-4",
maxContextMessages: 20 // Number of messages to keep in context
}
The bot stores the following keys in KV:

- `sync:token` - Matrix sync token
- `auth:access_token` - Matrix access token
- `auth:user_id` - Bot user ID
- `auth:device_id` - Matrix device ID
- `room:config:{roomId}` - Room-specific configuration
- `user:settings:{userId}` - User preferences
- `provider:{name}` - AI provider configuration
- `config:global` - Global bot configuration
- `config:admins` - List of admin users
- `config:whitelist` - List of allowed users
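As an illustration of how these keys are used, a room's configuration could be read and updated along these lines. This is a minimal sketch reusing the `Env` interface sketched earlier; the helper names and `RoomConfig` type are hypothetical, mirroring the room configuration example above.

```ts
// Sketch only: reading and writing the room:config:{roomId} key listed above.
interface RoomConfig {
  provider?: string;
  model?: string;
  temperature?: number;
  maxTokens?: number;
  systemPrompt?: string;
}

async function getRoomConfig(env: Env, roomId: string): Promise<RoomConfig> {
  // get(..., "json") parses the stored JSON and returns null if the key is missing
  return (await env.KV.get<RoomConfig>(`room:config:${roomId}`, "json")) ?? {};
}

async function setRoomModel(env: Env, roomId: string, model: string): Promise<void> {
  const config = await getRoomConfig(env, roomId);
  // Keep in mind the KV limit of roughly one write per second per key
  await env.KV.put(`room:config:${roomId}`, JSON.stringify({ ...config, model }));
}
```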
R2 objects are organized under the following paths:

- `conversations/{roomId}/history.json` - Conversation history
- `logs/{date}/{roomId}/{timestamp}.json` - Message logs
- `images/{messageId}.{ext}` - Generated images
- `uploads/{userId}/{fileId}` - User uploads
- `config/backup-{timestamp}.json` - Configuration backups
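For example, appending a message to a room's history under the `conversations/` prefix could look roughly like the sketch below. It is not the bot's actual code: the `ChatMessage` shape and the `R2` binding name are assumptions.

```ts
// Sketch only: persisting conversation history at conversations/{roomId}/history.json.
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

async function appendToHistory(env: Env, roomId: string, message: ChatMessage): Promise<void> {
  const key = `conversations/${roomId}/history.json`;
  const existing = await env.R2.get(key);
  const history: ChatMessage[] = existing ? await existing.json<ChatMessage[]>() : [];
  history.push(message);
  await env.R2.put(key, JSON.stringify(history), {
    httpMetadata: { contentType: "application/json" },
  });
}
```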
npm run dev
npm run build
See TEST_CHECKLIST.md for a complete step-by-step guide.
Quick test:
- Invite the bot in Matrix: `/invite @your-bot-user-id:oi6.uk`
- Wait 2-3 minutes for the bot to accept
- Try the `!gpt` command: `!gpt hello!`
- Or mention the bot: `@your-bot-user-id:oi6.uk hello!`
- Or use help: `!help`
Important: Bot responds to:
- `!gpt <message>` - Easiest way, no mention needed
- `@bot-user-id message` - Traditional mention
- `!command` - Other bot commands
The Worker exposes these HTTP endpoints:

- `GET /` - Health check
- `GET /health` - Detailed health status
- `POST /login` - Authenticate with Matrix
- `GET /start` - Start sync loop
- `GET /stop` - Stop sync loop
- `GET /status` - Get sync status
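A fetch handler serving these routes might look roughly like the sketch below. The declared helpers are hypothetical stand-ins for the bot's real login/sync logic, and `Env` is the interface sketched earlier; this is not the project's actual implementation.

```ts
// Sketch only: routing the HTTP endpoints listed above.
declare function loginToMatrix(env: Env): Promise<Response>;
declare function startSyncLoop(env: Env): Promise<Response>;
declare function stopSyncLoop(env: Env): Promise<Response>;
declare function getSyncStatus(env: Env): Promise<Response>;

export default {
  async fetch(request: Request, env: Env): Promise<Response> {
    const { pathname } = new URL(request.url);

    if (pathname === "/") return new Response("ok");                         // health check
    if (pathname === "/health") return Response.json({ status: "healthy" }); // detailed status
    if (pathname === "/login" && request.method === "POST") return loginToMatrix(env);
    if (pathname === "/start") return startSyncLoop(env);
    if (pathname === "/stop") return stopSyncLoop(env);
    if (pathname === "/status") return getSyncStatus(env);

    return new Response("Not Found", { status: 404 });
  },
};
```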
If you get an error about the `new_classes` migration:
In order to use Durable Objects with a free plan, you must create a namespace using a `new_sqlite_classes` migration.
Make sure your `wrangler.toml` uses `new_sqlite_classes` instead of `new_classes`:
[[migrations]]
tag = "v1"
new_sqlite_classes = ["MatrixSync"] # Use this for free plan
The KV binding name in `wrangler.toml` must be `"KV"` (uppercase):
[[kv_namespaces]]
binding = "KV" # Must match the code
id = "your_id_here"
The namespace title (shown in dashboard) can be anything, but the binding must be "KV".
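The binding name matters because the Worker code reaches the namespace through it, for example (illustrative, reusing the `Env` sketch from earlier):

```ts
// env.KV only exists because wrangler.toml binds the namespace as "KV"
async function getAccessToken(env: Env): Promise<string | null> {
  return env.KV.get("auth:access_token");
}
```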
- Check if sync is running: `curl https://your-worker.workers.dev/status`
- Check logs in the Cloudflare dashboard
- Verify authentication: check KV for `auth:access_token`
- Make sure the bot is invited to the room
- Verify API key is correct
- Check the base URL format (it should not end with `/`)
- Test the API endpoint directly with curl
- Check Cloudflare Workers logs
- Restart sync: `curl https://your-worker.workers.dev/start`
- Check if authentication has expired
- Re-login: `curl -X POST https://your-worker.workers.dev/login`
- KV Write Limits: KV has a limit of 1 write per second per key
- Request Timeout: Worker requests time out after 30 seconds on the standard plan
- Memory: Workers have 128MB memory limit
- R2 Operations: Free tier includes 1M requests/month
- All API keys stored as Cloudflare secrets
- Admin commands restricted to configured admins
- Optional user whitelist for access control
- No credentials logged or exposed
MIT
Contributions welcome! Please open an issue or PR.
Inspired by: