Self-hosted AI infrastructure: Cursor AI assistant + Conversational AI (OpenWebUI) + RAG + Intelligent Automation (n8n)
A production-ready, self-hosted AI stack that combines:
- 🤖 AI Services: OpenWebUI interface, Qdrant vector DB, n8n automation, Ollama LLM
- 💻 Cursor Integration: MCP-Qdrant for AI-enhanced coding with context awareness
- 📚 Knowledge Base: RAG with document upload in OpenWebUI
- 🔄 Intelligent Workflows: n8n automation platform
- 🚀 One-command deployment: ./init.sh and everything works
✅ Cursor AI Enhancement via Model Context Protocol (MCP)
✅ OpenWebUI with RAG for document Q&A
✅ Vector Search with Qdrant (1024-dim embeddings)
✅ Production-ready Docker Compose stack
✅ Automated Workflows with n8n orchestration
✅ Fork-friendly - Clone once, everything works
# 1. Clone repository
git clone https://github.com/FlowTech-Lab/FlowTech-AI.git
cd FlowTech-AI
# 2. Initialize stack (requires sudo for permissions)
sudo ./init.sh
# ✅ Stack ready! Services available at:
# - OpenWebUI: http://localhost:8081
# - Cursor MCP: http://localhost:8000
# - n8n: http://localhost:5678
# - Qdrant: http://localhost:6333

Full guide: See QUICKSTART.md
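Once init.sh finishes, a quick smoke test can confirm the stack is actually up. This is a minimal sketch, assuming the default ports listed above and that the stack is managed by Docker Compose from the repo directory:

```bash
# All containers should be running/healthy
docker compose ps

# Qdrant answers on its REST API root with version info
curl -s http://localhost:6333

# OpenWebUI and n8n should return an HTTP status line
curl -sI http://localhost:8081 | head -n 1
curl -sI http://localhost:5678 | head -n 1
```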
| Service | Port | Description | Status |
|---|---|---|---|
| OpenWebUI | 8081 | AI chat interface with RAG | ✅ Production |
| MCP-Qdrant | 8000 | Cursor code context (cursor-context) | ✅ Production |
| MCP-Knowledge | 8001 | Cursor notes search (cursor-knowledge) | ✅ Production |
| n8n | 5678 | Workflow automation | ✅ Production |
| Qdrant | 6333 | Vector database | ✅ Production |
| Samba | 445 | Notes share (SMB) | ✅ Production |
| PostgreSQL | 5432 | Metadata storage | ✅ Production |
| Redis | 6379 | Cache & queues | ✅ Production |
| SearxNG | 8082 | Web search engine | ✅ Production |
| Langfuse | 3300 | LLM observability | ✅ Production |
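If a service is not up, its logs are the first place to look. A sketch using Docker Compose; the openwebui service name appears later in this README, other service names depend on your docker-compose.yml:

```bash
# Tail logs for a single service
docker compose logs -f openwebui

# Or review recent logs across the whole stack
docker compose logs --tail=100
```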
# 1. Copy MCP config template
cp cursor-mcp-config.json ~/.cursor/mcp.json
# 2. Edit IP (change to your server IP)
nano ~/.cursor/mcp.json
# Replace 192.168.0.246 with your actual IP
# 3. Restart Cursor
# 4. Test
@qdrant store "FlowTech-AI is awesome!"
@qdrant find awesome

What you get:
- Store code snippets, notes, and context in Qdrant
- Retrieve information semantically during coding
- AI-enhanced development with persistent memory
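For step 2 above, swapping in your server IP can also be scripted instead of editing the file by hand. A sketch: 192.168.0.246 is the placeholder from the template, YOUR_SERVER_IP is whatever your server actually uses:

```bash
# Replace the template placeholder with your server's IP (GNU sed; on macOS use: sed -i '')
sed -i 's/192\.168\.0\.246/YOUR_SERVER_IP/g' ~/.cursor/mcp.json

# Double-check the result
grep -n 'YOUR_SERVER_IP' ~/.cursor/mcp.json
```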
Sync Markdown notes to Cursor:
# Initial sync
./scripts/sync-notes.sh
# Install hourly auto-sync
./scripts/install-cron.sh

Your Notes/ folder will be automatically synced to Qdrant and searchable in Cursor!
📖 See the Notes Sync Guide for details.
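To confirm the hourly job was actually registered, you can check your crontab. A sketch, assuming install-cron.sh writes to the invoking user's crontab rather than a system cron.d entry:

```bash
# The sync script should show up in your crontab
crontab -l | grep sync-notes
```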
Edit notes from Windows/Mac/Linux:
# Windows
\\YOUR_SERVER_IP\notes
# Mac/Linux
smb://YOUR_SERVER_IP/notes
# Credentials (generated in .env during init.sh)
Username: admin
Password: Check your .env file (SAMBA_PASSWORD)

Open the share with Obsidian or any text editor to manage your notes!
📖 See the Samba Windows Guide for connection help.
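On a Linux client you can also mount the share directly instead of browsing it. A sketch: it requires cifs-utils, and /mnt/notes is an arbitrary mount point:

```bash
# Mount the notes share; credentials come from the .env generated by init.sh
sudo mkdir -p /mnt/notes
sudo mount -t cifs //YOUR_SERVER_IP/notes /mnt/notes \
  -o username=admin,password='YOUR_SAMBA_PASSWORD',uid=$(id -u),gid=$(id -g)
```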
- Open http://localhost:8081
- Create a new chat
- Click "Knowledge" button
- Upload your documents (PDF, MD, TXT, DOCX, etc.)
- Ask questions about your documents!
Example:
User: What does the API documentation say about authentication?
AI: Based on the uploaded API docs, authentication uses JWT tokens...
Supported formats:
- 📄 PDF, DOCX, TXT, MD
- 💻 Code files (PY, JS, TS, etc.)
- 🌐 Websites (via URL)
FlowTech-AI Stack

+------------+     +--------------+     +-------------+
| Cursor IDE | --> |  MCP-Qdrant  | --> |             |
| (Dev Tool) |     |  (Port 8000) |     |             |
+------------+     +--------------+     |             |
                                        |   Qdrant    |
+------------+     +--------------+     |  Vector DB  |
|  Browser   | --> |  OpenWebUI   | --> |             |
|            |     |  (Port 8081) |     |             |
+------------+     +--------------+     |             |
                                        |             |
+------------+     +--------------+     |             |
|  n8n Web   | --> |     n8n      | --> |             |
| Interface  |     |  (Port 5678) |     +-------------+
+------------+     +--------------+

Supporting services:

+---------+     +------------+     +---------+
|  Redis  |     | PostgreSQL |     | SearxNG |
+---------+     +------------+     +---------+
Data Flow:
- Cursor: Store/retrieve code context via MCP → Qdrant
- OpenWebUI: Upload docs → RAG → Qdrant → AI answers
- n8n: Automate workflows, integrate external APIs
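To see what actually lands in Qdrant from these flows, its REST API can be queried directly. The endpoints below are standard Qdrant; the collection names depend on your setup:

```bash
# List all collections
curl -s http://localhost:6333/collections

# Inspect one collection's config and point count (use a name from the list above)
curl -s http://localhost:6333/collections/COLLECTION_NAME
```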
- Open http://localhost:5678
- Login with credentials from .env file
- Import workflow from SRC/FlowTech-AI-Complete-Workflow.json
- Customize for your needs
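The workflow can also be imported without the UI via n8n's CLI. A sketch, assuming the compose service is named n8n and copying the file into the container first:

```bash
# Copy the workflow into the n8n container, then import it with the n8n CLI
docker compose cp SRC/FlowTech-AI-Complete-Workflow.json n8n:/tmp/workflow.json
docker compose exec n8n n8n import:workflow --input=/tmp/workflow.json
```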
Default: bge-m3:567m (Ollama) - Multilingual, 1024 dimensions
Change model:
# Edit .env
RAG_EMBEDDING_MODEL=bge-large:latest # or another model
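# Pull the new embedding model into Ollama before restarting
# (assumes the Ollama service in docker-compose.yml is named "ollama")
docker compose exec ollama ollama pull bge-large:latest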
# Restart
docker compose restart openwebui

Want to sync personal notes? Check the Notes Templates in Notes/_Templates/:
- vm-template.md - For virtual machines
- server-template.md - For servers
- domain-template.md - For domains
For automated sync, see companion repo: Flow-Notes-AI
- QUICKSTART.md - Complete setup guide
- Notes/_Templates/ - Obsidian note templates
- docs/ - Architecture & advanced topics
Contributions are welcome! Please:
- Fork the repo
- Create a feature branch
- Test your changes with ./init.sh
- Submit a pull request
MIT License - See LICENSE for details
- OpenWebUI - Amazing AI interface
- Cursor - Best AI-powered IDE
- n8n - Powerful automation platform
- Qdrant - High-performance vector database
- Ollama - Local LLM inference
- GitHub: https://github.com/FlowTech-Lab/FlowTech-AI
- Documentation: docs/
- Issues: https://github.com/FlowTech-Lab/FlowTech-AI/issues
Made with ❤️ by the FlowTech-Lab community