A unified MCP server for Godot projects with Ollama-compatible endpoints
Give AI assistants structured access to your Godot project through the MCP protocol and Ollama-compatible APIs on a single port. Semantic search is available via Ollama embedding models.
Built specifically to work alongside AI Assistant Hub.
- ✅ Runs on localhost only (127.0.0.1)
- ✅ Meant for single-user, local development
- ⚠️ No authentication - anyone on your machine can access the server
- ⚠️ External tools execute with YOUR permissions - only use trusted tools
- ⚠️ Understand the risks - while file access is path-sandboxed, the AI can still call any enabled/available tool
Safe usage:
- localhost only ✓
- Trusted tools only ✓
- Development machine ✓
Unsafe usage:
- Network exposure ✗
- Shared servers ✗
- Untrusted tools ✗
- Production environments ✗
- 🎯 Unified Server Architecture - Single process, single port (3333)
- 🔌 Ollama-Compatible Endpoints - Works with AI Assistant Hub, VS Code extensions, Rider plugins
- 🤖 Automatic Tool Calling - Transparent tool execution for Ollama-based clients
- 📚 Dual Protocol Support - MCP protocol for Claude Desktop/Cursor + Ollama API for everything else
Ollama:
# macOS
brew install ollama
# Linux
curl -fsSL https://ollama.com/install.sh | sh
# Windows
# Download from https://ollama.com/download
Go 1.22+ (if building from source):
go version # Check version
Local:
ollama serve
Remote (Ollama server):
- Point at your remote instance
- Configure `ollama_url` in mcp.json
- Example: `"ollama_url": "http://192.168.1.100:11434"`
# Chat model (recommended for tool calling)
ollama pull qwen3 # Large
ollama pull qwen2.5-coder:14b # Best balance
# Embedding model (for semantic search)
ollama pull nomic-embed-text
Verify:
ollama list
# Create addon directory in your Godot project
mkdir -p /path/to/your/godot/project/addons/gd-scope
cd /path/to/your/godot/project/addons/gd-scope
# Install dependencies
go mod tidy
# Build
go build -o gd-scope .
| Tool | What it does |
|---|---|
| `read_file` | Read any project file |
| `list_files` | List files recursively |
| `project_info` | Parse project.godot |
| `list_scenes` | Find all .tscn files |
| `read_scene` | Parse scene hierarchy |
| `list_scripts` | Find all .gd/.cs files |
| `docs_versions` | List Godot doc versions |
| `docs_list` | List pages for a version |
| `docs_get` | Get documentation page |
| `docs_search` | Full-text doc search |
| Tool | What it does |
|---|---|
| `index_project` | Embed project files |
| `semantic_search` | Natural language search |
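Semantic search of this kind typically ranks indexed file chunks by embedding similarity. As an illustration only (not gd-scope's actual implementation), cosine similarity between an embedded query and embedded file chunks works like this:

```python
import math

def cosine_similarity(a, b):
    # Cosine similarity between two embedding vectors.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def rank_chunks(query_vec, chunks):
    # chunks: list of (path, embedding) pairs; most similar first.
    return sorted(chunks, key=lambda c: cosine_similarity(query_vec, c[1]), reverse=True)

# Toy 2-D vectors standing in for nomic-embed-text output
# (real embeddings are hundreds of dimensions).
chunks = [("player.gd", [1.0, 0.0]), ("enemy.gd", [0.0, 1.0])]
best = rank_chunks([0.9, 0.1], chunks)[0][0]
print(best)  # player.gd
```

`index_project` embeds the files once; `semantic_search` then only has to embed the query and rank.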
Add your own tools:
- Python/Rust/Node - JSON stdin/stdout
- GDScript - Full Godot engine API access
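The JSON stdin/stdout contract for external tools can be sketched in Python. The field names here (`arguments`, `result`) are illustrative, not the server's exact schema:

```python
import json
import sys

def handle(request):
    # Hypothetical tool logic: echo a greeting built from the arguments.
    args = request.get("arguments", {})
    return {"result": f"hello, {args.get('name', 'world')}"}

def main():
    # The server pipes one JSON request to stdin and reads
    # one JSON response from stdout.
    request = json.load(sys.stdin)
    json.dump(handle(request), sys.stdout)

# A real tool script would end with: main()
```

Drop a script like this into the configured `tools_dir` and it runs with your user's permissions, which is why the security notes above insist on trusted tools only.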
Minimal config (auto-detects project root):
{
"ollama_url": "http://127.0.0.1:11434"
}
Full config:
{
"_comment_security": "Server binds to localhost only. Port configurable.",
"addr": "127.0.0.1:3333",
"_comment_ollama": "Can point to local or remote Ollama instance",
"ollama_url": "http://127.0.0.1:11434",
"project_root": ".",
"docs_dir": "docs",
"tools_dir": "tools",
"external_timeout_seconds": 30,
"godot_bin": "godot",
"embed_model": "nomic-embed-text",
"default_model": "qwen3"
}
On the Ollama host (e.g., desktop):
export OLLAMA_HOST=0.0.0.0:11434
ollama serve
In gd-scope mcp.json:
{
"ollama_url": "http://192.168.1.100:11434"
}
Ollama-compatible endpoints:
- `POST /api/chat` - Chat with automatic tool calling
- `POST /api/generate` - Single completion (proxied)
- `GET /api/tags` - List models (proxied)
MCP endpoints:
- `GET /mcp/v1/tools` - List available tools
- `POST /mcp/v1/tools/{name}` - Call a specific tool
Health:
- `GET /health` - Server health check
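A minimal client for the `/api/chat` endpoint might look like the sketch below. The model name and prompt are placeholders; the request shape follows Ollama's chat API, which gd-scope mirrors while handling tool calls transparently:

```python
import json
from urllib import request

def build_chat_request(base_url, model, prompt):
    # Assemble an Ollama-style /api/chat request. Because gd-scope
    # executes tools automatically, the client payload stays plain.
    body = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    }
    return request.Request(
        base_url + "/api/chat",
        data=json.dumps(body).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )

req = build_chat_request("http://127.0.0.1:3333", "qwen3", "List my project's scenes")
# With the server running: reply = json.load(request.urlopen(req))
print(req.full_url)  # http://127.0.0.1:3333/api/chat
```

Any Ollama client that lets you override the base URL can point at port 3333 the same way.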
- QUICKSTART.md - 5-minute setup guide
- AI-ASSISTANT-HUB.md - AI Assistant Hub integration
- OLLAMA-SETUP.md - Ollama installation and models
- TOOL-CREATION.md - Creating custom tools
- API-REFERENCE.md - Complete API documentation
- Go 1.22+ (for building)
- Ollama (for LLM and embeddings)
- Godot 4.x (only if using GDScript tools)
- Min. 4-8GB VRAM (for running Ollama models)
MIT License - see LICENSE file for details.
- Built for the Godot community
- Uses MCP Go SDK
- Designed to work with AI Assistant Hub
- Ollama integration for local AI