LLMBot

A Matrix chatbot written in Rust that provides AI-powered responses using any OpenAI-compatible API (Ollama Cloud, Gemini, OpenAI, etc.). Features end-to-end encryption, SAS emoji verification, mention-only responses, and admin-controlled configuration.

Building

Requires Rust (install via rustup):

cargo build --release

The binary will be at target/release/llmbot.

For development (faster compilation, slower runtime):

cargo build

Configuration

Copy the example config and edit it:

cp config.example.toml config.toml

Config Sections

[matrix]

| Key | Description |
| --- | --- |
| homeserver | Your Matrix homeserver URL (e.g. https://matrix.example.com) |
| username | Bot account username |
| password | Bot account password |
| data_dir | Directory for session data and crypto keys (default: data) |

[ai]

| Key | Description |
| --- | --- |
| api_url | OpenAI-compatible API endpoint (e.g. https://ollama.com/v1) |
| api_key | API key (leave empty if not required) |
| model | Model name (e.g. llama3, gemini-pro, gpt-4o) |
| system_message | Default system prompt for all rooms |
| max_tokens | Max tokens for an AI response (default: 4096) |
| temperature | Response creativity, 0.0-1.0 (default: 0.7) |

[bot]

| Key | Description |
| --- | --- |
| admin | Admin Matrix user ID — the only user who can run !llmbot commands (e.g. @you:server.com) |
| display_name | Bot display name in Matrix (default: LLMBot) |
| allowed_users | List of Matrix user IDs allowed to chat (not configure) |
| allowed_rooms | List of room IDs where anyone can chat |
| allowed_servers | List of server names whose users can all chat |

[database]

| Key | Description |
| --- | --- |
| path | SQLite database file path (default: llmbot.db) |
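Putting the sections above together, a complete config.toml might look like the sketch below. Every value is a placeholder to adapt; only the section and key names come from the tables above.

```toml
# Illustrative config.toml — all values are placeholders.

[matrix]
homeserver = "https://matrix.example.com"
username = "llmbot"
password = "secret"
data_dir = "data"

[ai]
api_url = "https://ollama.com/v1"
api_key = ""            # leave empty if the endpoint requires no key
model = "llama3"
system_message = "You are a helpful assistant."
max_tokens = 4096
temperature = 0.7

[bot]
admin = "@you:server.com"
display_name = "LLMBot"
allowed_users = ["@friend:other.org"]
allowed_rooms = []
allowed_servers = []

[database]
path = "llmbot.db"
```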

Access Control

There are three access levels:

  • Admin (single user set in bot.admin): Can use !llmbot commands and chat with the bot.
  • Allowed (users/rooms/servers lists): Can chat with the bot but cannot use !llmbot commands.
  • Everyone else: Gets "Sorry, you can't message me."
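The three tiers can be sketched as a single decision function. This is an illustrative sketch, not the bot's actual code: the names BotConfig, AccessLevel, and check_access are invented for this example.

```rust
// Hypothetical sketch of the three access tiers described above.

#[derive(Debug, PartialEq)]
enum AccessLevel {
    Admin,   // can chat and run !llmbot commands
    Allowed, // can chat, but not configure
    Denied,  // gets "Sorry, you can't message me."
}

struct BotConfig {
    admin: String,
    allowed_users: Vec<String>,
    allowed_rooms: Vec<String>,
    allowed_servers: Vec<String>,
}

fn check_access(cfg: &BotConfig, user_id: &str, room_id: &str) -> AccessLevel {
    if user_id == cfg.admin {
        return AccessLevel::Admin;
    }
    // A Matrix user ID looks like @localpart:server.name,
    // so the server is everything after the last ':'.
    let server = user_id.rsplit(':').next().unwrap_or("");
    if cfg.allowed_users.iter().any(|u| u == user_id)
        || cfg.allowed_rooms.iter().any(|r| r == room_id)
        || cfg.allowed_servers.iter().any(|s| s == server)
    {
        return AccessLevel::Allowed;
    }
    AccessLevel::Denied
}

fn main() {
    let cfg = BotConfig {
        admin: "@you:server.com".into(),
        allowed_users: vec!["@friend:other.org".into()],
        allowed_rooms: vec![],
        allowed_servers: vec!["trusted.net".into()],
    };
    assert_eq!(check_access(&cfg, "@you:server.com", "!r:x"), AccessLevel::Admin);
    assert_eq!(check_access(&cfg, "@alice:trusted.net", "!r:x"), AccessLevel::Allowed);
    assert_eq!(check_access(&cfg, "@mallory:evil.net", "!r:x"), AccessLevel::Denied);
    println!("ok");
}
```

Note that the admin check wins even if the admin also appears in an allow list, matching the "single user set in bot.admin" rule above.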

Running

# With default config.toml in current directory
cargo run --release

# With a specific config file
cargo run --release -- /path/to/config.toml

First Run

On first launch, the bot will:

  1. Log in with your username/password
  2. Save the session to data/session.json
  3. Join any rooms it's invited to

On subsequent launches, it restores the saved session — no new device is created, no re-verification needed.

Verification

After the first run, verify the bot from another Matrix client (e.g. Element):

  1. Go to Settings > Sessions (or Security & Privacy)
  2. Find the LLMBot session
  3. Click Verify and choose emoji verification
  4. The bot auto-accepts and auto-confirms — check the terminal output for the emojis
  5. Confirm the emojis match on your client

This only needs to be done once. The verification persists across restarts.

Resetting

If you need to start fresh (new device, new verification):

rm -rf data/

Then run the bot again. A new device will be created, and you will need to verify it again.

Messaging the Bot

The bot only responds when mentioned — use @botusername or the display name in your message.

@llmbot is the earth flat?

How It Works

Each message to the bot is independent — it has no conversation memory on its own. However, it follows reply chains. When you reply to one of the bot's messages, it walks up the chain of replies to reconstruct the conversation, so it knows what was said before.

This means you can have a back-and-forth conversation:

  1. Mention the bot with a question — it replies.
  2. Reply to its reply — it reads the chain and responds with context.
  3. Reply to that reply — the chain grows, and so does the context.

As long as you keep replying to its messages (rather than sending a new standalone message), the bot sees the full thread. A new message with no reply is a fresh start with no history.

You can also reply to any message in a room and mention the bot — it will read that message as context:

> The earth is 6,000 years old
@llmbot is this accurate?
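The reply-chain walk described above can be sketched independently of the Matrix SDK. The rebuild_thread function below is hypothetical, not the bot's actual implementation; it only illustrates the idea of following in-reply-to links upward and replaying the chain oldest-first.

```rust
use std::collections::HashMap;

// Illustrative sketch: events maps an event ID to (message body,
// optional in-reply-to event ID). Starting from the newest event,
// walk up the reply chain, then reverse so the oldest message comes first.
fn rebuild_thread(
    events: &HashMap<&str, (&str, Option<&str>)>,
    start: &str,
) -> Vec<String> {
    let mut chain = Vec::new();
    let mut cursor = Some(start);
    while let Some(id) = cursor {
        match events.get(id) {
            Some((body, parent)) => {
                chain.push(body.to_string());
                cursor = *parent;
            }
            None => break, // parent event not available — stop here
        }
    }
    chain.reverse(); // oldest first, the order a chat model expects
    chain
}

fn main() {
    let mut events = HashMap::new();
    events.insert("$1", ("is the earth flat?", None));
    events.insert("$2", ("No, it is an oblate spheroid.", Some("$1")));
    events.insert("$3", ("how do we know?", Some("$2")));
    let thread = rebuild_thread(&events, "$3");
    assert_eq!(thread, vec![
        "is the earth flat?",
        "No, it is an oblate spheroid.",
        "how do we know?",
    ]);
    println!("{:?}", thread);
}
```

A standalone message corresponds to an event whose parent is None: the walk stops immediately, which is why a new message with no reply starts with no history.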

Admin Commands

All commands are admin-only and use the !llmbot prefix.

!llmbot help

Shows all available commands.

!llmbot botinfo

Displays bot configuration: display name, AI model, API URL, version.

!llmbot chat <message>

Sends a message directly to the AI (without needing to @mention).

!llmbot chat what is the speed of light?

!llmbot systemmessage [message]

View or set the system message for the current room.

# View current system message
!llmbot systemmessage

# Set a new system message for this room
!llmbot systemmessage You are a pirate. Answer everything like a pirate.

Room-specific system messages override the global one from config.toml.
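The override rule amounts to a simple precedence check. The helper below is hypothetical, not the bot's API; it just makes the room-over-global rule explicit.

```rust
// Hypothetical sketch: a room-specific system message, when set,
// takes precedence over the global one from config.toml.
fn effective_system_message<'a>(room_override: Option<&'a str>, global: &'a str) -> &'a str {
    room_override.unwrap_or(global)
}

fn main() {
    let global = "You are a helpful assistant.";
    assert_eq!(effective_system_message(None, global), global);
    assert_eq!(
        effective_system_message(Some("You are a pirate."), global),
        "You are a pirate."
    );
    println!("ok");
}
```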

!llmbot roomsettings [setting] [value]

View or change per-room settings.

# View all settings for this room
!llmbot roomsettings

# View a specific setting
!llmbot roomsettings always_reply

# Change a setting
!llmbot roomsettings always_reply false

Available settings:

| Setting | Type | Default | Description |
| --- | --- | --- | --- |
| system_message | string | global default | System prompt for this room |
| always_reply | bool | true | Reply to all mentions |

Docker

docker compose up -d

Logs

By default, the bot logs at info level. For more detail:

RUST_LOG=debug cargo run

Files

| Path | Description |
| --- | --- |
| config.toml | Your configuration (not committed to git) |
| config.example.toml | Example configuration template |
| data/ | Session data and crypto keys (not committed to git) |
| data/session.json | Saved Matrix session |
| llmbot.db | SQLite database for room settings |
