Universal MCP knowledge server for LLM agents, powered by local RAG.
Use Canon to provide domain-specific best practices and playbooks across software engineering, marketing, video editing, and other knowledge areas.
Typical workflows:
- Find the most suitable guide for a task
- Retrieve concise best-practice snippets
- Read full guides for deeper execution context
Install in Cursor
Go to: Settings -> Cursor Settings -> MCP -> Add new global MCP server
The recommended approach is to paste the following configuration into your ~/.cursor/mcp.json file. You may also install in a specific project by creating .cursor/mcp.json in your project folder. See Cursor MCP docs for more info.
```json
{
  "mcpServers": {
    "canon": {
      "command": "uvx",
      "args": ["mcp-canon"]
    }
  }
}
```
To use a custom database, set `CANON_DB_PATH`:

```json
{
  "mcpServers": {
    "canon": {
      "command": "uvx",
      "args": ["mcp-canon"],
      "env": {
        "CANON_DB_PATH": "/path/to/my-db"
      }
    }
  }
}
```

To connect to a Canon HTTP server instead, use the URL form:

```json
{
  "mcpServers": {
    "canon": {
      "url": "http://localhost:8080/mcp"
    }
  }
}
```

Install in Claude Code
Run this command. See Claude Code MCP docs for more info.
```bash
claude mcp add --scope user canon -- uvx mcp-canon
```

To use a custom database, set `CANON_DB_PATH`:

```bash
claude mcp add --scope user -e CANON_DB_PATH=/path/to/my-db canon -- uvx mcp-canon
```

To connect to a Canon HTTP server instead:

```bash
claude mcp add --scope user --transport http canon http://localhost:8080/mcp
```

Remove `--scope user` to install for the current project only.
Install in Opencode
Add this to your Opencode configuration file. See Opencode MCP docs for more info.
```json
{
  "mcp": {
    "canon": {
      "type": "local",
      "command": ["uvx", "mcp-canon"],
      "enabled": true
    }
  }
}
```

To use a custom database, set `CANON_DB_PATH`:

```json
{
  "mcp": {
    "canon": {
      "type": "local",
      "command": ["uvx", "mcp-canon"],
      "enabled": true,
      "environment": {
        "CANON_DB_PATH": "/path/to/my-db"
      }
    }
  }
}
```

To connect to a Canon HTTP server instead:

```json
{
  "mcp": {
    "canon": {
      "type": "remote",
      "url": "http://localhost:8080/mcp",
      "enabled": true
    }
  }
}
```

Install in Gemini CLI
Run this command. See Gemini CLI MCP docs for more info.
```bash
gemini mcp add --scope user canon uvx mcp-canon
```

To use a custom database, set `CANON_DB_PATH`:

```bash
gemini mcp add --scope user -e CANON_DB_PATH=/path/to/my-db canon uvx mcp-canon
```

To connect to a Canon HTTP server instead:

```bash
gemini mcp add --scope user --transport http canon http://localhost:8080/mcp
```

Remove `--scope user` to install for the current project only.
Install in Google Antigravity
Go to the agent panel and open: ... -> MCP Servers -> Manage MCP Servers -> View raw config.
Add this to your mcp_config.json file. See Google Antigravity MCP docs for more info.
```json
{
  "mcpServers": {
    "canon": {
      "command": "uvx",
      "args": ["mcp-canon"]
    }
  }
}
```

To use a custom database, set `CANON_DB_PATH`:

```json
{
  "mcpServers": {
    "canon": {
      "command": "uvx",
      "args": ["mcp-canon"],
      "env": {
        "CANON_DB_PATH": "/path/to/my-db"
      }
    }
  }
}
```

Complete workflow from installation to running with your own domain guides.
Step 1: Install with indexing support

```bash
pip install "mcp-canon[indexing]"
```

Step 2: Create your library structure

```
my-library/
├── engineering/
│   └── python-fastapi-guide/
│       ├── INDEX.md   # Required: metadata
│       └── GUIDE.md   # Content
├── marketing/
│   └── launch-playbook/
│       ├── INDEX.md
│       └── GUIDE.md
└── video-editing/
    └── shorts-workflow/
        └── INDEX.md   # Can reference external URL
```
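Each guide's INDEX.md carries the frontmatter that `canon validate` checks and the indexer reads. A minimal sketch, with illustrative field names only (the actual schema is whatever `canon validate` enforces, not these keys):

```yaml
# Hypothetical INDEX.md frontmatter -- the field names below are
# illustrative assumptions, not Canon's documented schema.
title: Python FastAPI Guide
domain: engineering
description: Best practices for structuring FastAPI services
# Optional: reference an external URL instead of a local GUIDE.md
url: https://example.com/fastapi-guide
```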
Step 3: Create guides

Each guide folder contains an INDEX.md with the required metadata and, for local guides, a GUIDE.md with the content.

Step 4: Index your library

```bash
# Index to custom location
canon index --library ./my-library --output /path/to/my-db

# Validate frontmatter before indexing (optional)
canon validate --library ./my-library
```

For remote access or multi-client scenarios, run Canon as an HTTP server. This is useful when multiple agents or teams share one cross-domain knowledge base.
```bash
pip install "mcp-canon[http]"
```

```bash
# Default port 8080
canon serve

# Custom port and host
canon serve --port 3000 --host 0.0.0.0

# With custom database
CANON_DB_PATH=/path/to/db canon serve --port 8080
```

Then point MCP clients at the endpoint:

```json
{
  "mcpServers": {
    "canon": {
      "url": "http://localhost:8080/mcp"
    }
  }
}
```

Environment variables:

| Variable | Description | Default |
|---|---|---|
| `CANON_DB_PATH` | Path to custom database | Bundled DB |
| `CANON_EMBEDDING_MODEL` | Fastembed model name (supported models) | `nomic-ai/nomic-embed-text-v1.5-Q` |
| `CANON_EMBEDDING_DIM` | Embedding vector dimensions (must match model) | 768 |
| `CANON_FASTEMBED_THREADS` | ONNX runtime threads for FastEmbed (lower = less RAM, slower) | auto |
| `CANON_FASTEMBED_BATCH_SIZE` | Embedding batch size during indexing (lower = less RAM, slower) | 256 |
| `CANON_FASTEMBED_PARALLEL` | FastEmbed data-parallel workers (>1 increases RAM usage) | disabled |
| `CANON_LOG_LEVEL` | Log level (DEBUG, INFO, WARNING, ERROR) | INFO |
| `CANON_LOG_JSON` | Output logs in JSON format | false |
Note: Changing `CANON_EMBEDDING_MODEL` or `CANON_EMBEDDING_DIM` requires a full reindex: `canon index --library ./library`
The internal constants EMBEDDING_MODEL_NAME and EMBEDDING_DIM are configured via:

- `CANON_EMBEDDING_MODEL`
- `CANON_EMBEDDING_DIM`

Example (using BAAI/bge-small-en-v1.5, 384 dims):

```bash
CANON_EMBEDDING_MODEL=BAAI/bge-small-en-v1.5 \
CANON_EMBEDDING_DIM=384 \
canon index --library ./library --output ./my-db
```

Where to find available models:

- FastEmbed supported models: https://qdrant.github.io/fastembed/examples/Supported_Models/
- FastEmbed model card and usage notes: https://qdrant.github.io/fastembed/

Important:

- `CANON_EMBEDDING_DIM` must match the selected model's output size.
- After changing the model or dimension, rebuild the index before running search or server commands.
Canon exposes the following MCP tools:

| Tool | Description |
|---|---|
| `search_best_practices` | Semantic search for best practices in any domain (optionally scoped by `guide_id`) |
| `search_suitable_guides` | Find guides that match a task description across domains |
| `read_full_guide` | Get complete guide content for full context |
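Under the hood, these tools are invoked via MCP's JSON-RPC `tools/call` method. A minimal sketch of such a request, built with the standard library; the `arguments` keys here are illustrative assumptions, since the real input schema comes from the server's `tools/list` response:

```python
import json

# JSON-RPC 2.0 request an MCP client would send to invoke a Canon tool.
# The "arguments" payload is illustrative -- the actual parameter names
# are defined by the schema the server advertises via tools/list.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "search_best_practices",
        "arguments": {"query": "structuring FastAPI service layers"},
    },
}
body = json.dumps(request)
```

Most MCP clients (Cursor, Claude Code, etc.) construct this request for you; it is shown only to make the tool interface concrete.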
```bash
# Indexing
canon index --library ./library        # Index guides from any domain (creates new DB)
canon index --library ./lib --append   # Add to existing database
canon validate --library ./library     # Validate frontmatter

# Server
canon serve --port 8080                # Start HTTP server (requires [http])

# Info
canon list                             # List indexed guides
canon info                             # Show database info
```

MIT