A production-ready, self-hosted AI knowledge platform integrating Obsidian, AnythingLLM, Ollama, and MCP agents to enable intelligent note augmentation, RAG, web search, and contextual AI — all fully local and privacy-preserving.
This system is a live deployment running on Unraid, designed to make my Obsidian vault AI-interactive.
AnythingLLM serves as the centralized AI frontend, orchestrating agents and LLM responses.
Ollama provides local inference with open-source models, ensuring data never leaves the server.
Through MCP agents, AnythingLLM connects directly to Obsidian and the web to enhance, summarize, and link knowledge intelligently.
- Unraid — Host and Docker orchestration
- Obsidian (Dockerized) — Central Markdown knowledge base
- AnythingLLM — AI orchestration and frontend interface
- Ollama — Local LLM runtime (offline model serving)
- MCP Server / Agents — Bridge connecting AnythingLLM to Obsidian, web tools, and RAG workflows
- Fully operational deployment on Unraid integrating Obsidian, AnythingLLM, and Ollama
- Obsidian MCP Server integration enabling AI-driven note access, summarization, and augmentation
- Configured web search, web scraping, and RAG workflows to merge live data with internal notes
- Automated semantic linking and structured documentation using local LLM processing
- Private by design: all operations and model inference occur locally with no third-party dependencies
User Device (Browser / Mobile)
        │
        ▼
AnythingLLM (UI & Agent Orchestrator)
        │
        ▼
MCP Agents → Obsidian Vault (search, read, augment)
        │
        ▼
Ollama (Local LLM Inference)
        │
        ▼
Web Search / Scraper Tools → RAG Context Integration
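The inference hop in the diagram above can be sketched against Ollama's local HTTP API. The sketch below assumes a stock Ollama install on its default port (11434) and only builds and sends the standard `/api/generate` payload; the `mistral` model name is just an example of a model you might have pulled.

```python
import json
from urllib import request

OLLAMA_URL = "http://localhost:11434/api/generate"  # default Ollama port; adjust for your host


def build_generate_request(model: str, prompt: str) -> dict:
    """Build the JSON body Ollama's /api/generate endpoint expects."""
    return {"model": model, "prompt": prompt, "stream": False}


def ask_ollama(model: str, prompt: str) -> str:
    """POST a prompt to the local Ollama server and return the completed text."""
    data = json.dumps(build_generate_request(model, prompt)).encode()
    req = request.Request(OLLAMA_URL, data=data,
                         headers={"Content-Type": "application/json"})
    with request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]


# Example payload; actually sending it requires a running Ollama
# instance with the model pulled (e.g. `ollama pull mistral`):
payload = build_generate_request("mistral", "Summarize my latest note in one sentence.")
```

AnythingLLM talks to Ollama through this same API under the hood, so a quick script like this is a handy smoke test that the inference layer is reachable.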
- MCP servers registered via `anythingllm_mcp_servers.json`
- AnythingLLM automatically starts MCP services when needed
- Ollama hosts and manages open-source models locally for offline AI capability
- Fully self-contained within local Unraid infrastructure
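For reference, a registration in `anythingllm_mcp_servers.json` follows the standard `mcpServers` schema; the sketch below is illustrative only — the server name, launch command, and environment variable are placeholder assumptions, so match them to the actual MCP server you run and to the fields your AnythingLLM version documents.

```json
{
  "mcpServers": {
    "obsidian": {
      "command": "npx",
      "args": ["-y", "obsidian-mcp-server"],
      "env": {
        "OBSIDIAN_VAULT_PATH": "/data/vault"
      }
    }
  }
}
```

Once the entry is saved, AnythingLLM spawns the listed command on demand and exposes its tools to agents.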
- Access routed through the Identity & Access Management (IAM) Suite:
- Authentik for centralized authentication
- Vaultwarden for credentials
- 2FAuth for TOTP MFA
- Nginx Reverse Proxy with Cloudflare DNS and Wildcard SSL
- Remote administration via Tailscale VPN (WireGuard-based), with SSH access disabled
- All containers deployed via Docker on Unraid
- Volumes mapped for persistent data (Obsidian vaults, LLM models, logs)
- AnythingLLM configured with custom MCP server JSON entries
- Ollama running multiple local LLM models (e.g., Mistral, Llama 3, Phi-3)
- Reverse proxy configured using IAM Suite for secure subdomain access (link)
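The container wiring described above can be sketched as a minimal Compose fragment. The image names are the published ones (`ollama/ollama`, `mintplexlabs/anythingllm`), but the host paths, port choices, and environment values here are illustrative assumptions and will differ from the actual Unraid templates.

```yaml
services:
  ollama:
    image: ollama/ollama
    volumes:
      - /mnt/user/appdata/ollama:/root/.ollama     # persistent model storage (path illustrative)
    ports:
      - "11434:11434"                              # Ollama API

  anythingllm:
    image: mintplexlabs/anythingllm
    depends_on:
      - ollama
    volumes:
      - /mnt/user/appdata/anythingllm:/app/server/storage
    ports:
      - "3001:3001"                                # AnythingLLM UI
```

With the stack up, models can be pulled into the Ollama container, e.g. `docker exec -it ollama ollama pull mistral`, and AnythingLLM pointed at `http://ollama:11434` as its LLM provider.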
- Local AI-assisted research, writing, and note organization
- RAG-based contextual search across personal documents
- Homelab deployment showcasing local LLM + automation integration
- Private AI workspace with full user control over data and access
Licensed under the MIT License — feel free to adapt or build upon this setup.
ai • self-hosted • knowledge-management • obsidian • anythingllm • ollama • mcp • rag • docker • privacy • homelab • llm • unraid