100% local, zero vendor lock-in. PostgreSQL 17 + pgvector + FastAPI + Ollama (qwen3-embedding:8b). No API keys, no hosted dependencies, completely free to run on your own machine.
MindBase is the memory substrate of the AIRIS Suite - providing persistent, semantic conversation history across all AI coding assistants.
| Component | Purpose | Who It's For |
|---|---|---|
| airis-agent | 🧠 Intelligence layer for all editors (confidence checks, deep research, self-review) | All developers using Claude Code, Cursor, Windsurf, Codex, Gemini CLI |
| airis-mcp-gateway | 🚪 Unified MCP proxy with 90% token reduction via lazy loading | Claude Code users who want faster startup |
| mindbase (this repo) | 💾 Local cross-session memory with semantic search | Developers who want persistent conversation history |
| airis-workspace | 🏗️ Docker-first monorepo manager | Teams building monorepos |
| airiscode | 🖥️ Terminal-first autonomous coding agent | CLI-first developers |
- airis-mcp-supabase-selfhost - Self-hosted Supabase MCP with RLS support
- mindbase (this repo) - Memory search & storage tools (`mindbase_search`, `mindbase_store`)
MindBase comes pre-configured with AIRIS MCP Gateway. No additional setup required.
```bash
# Install the Gateway (includes MindBase)
brew install agiletec-inc/tap/airis-mcp-gateway

# Start the gateway
airis-mcp-gateway up

# Add to Claude Code
claude mcp add --transport http airis-mcp-gateway http://api.gateway.localhost:9400/api/v1/mcp
```

If you need to run MindBase independently:

```bash
git clone https://github.com/agiletec-inc/mindbase.git ~/github/mindbase
cd ~/github/mindbase && make up
```

What you get with the full suite:
- ✅ Confidence-gated workflows (prevents wrong-direction coding)
- ✅ Deep research with evidence synthesis
- ✅ 94% token reduction via repository indexing
- ✅ Cross-session memory across all editors
- ✅ Self-review and post-implementation validation
MindBase is the durable conversation memory service that the AIRIS MCP Gateway taps into. AIRIS acts as the gateway and tool orchestrator, while MindBase focuses on storing and retrieving conversations with semantic recall. That separation keeps responsibilities crisp:
- AIRIS MCP Gateway: registers any MCP server (including MindBase), handles the "tool roster" problem by lazily streaming tool descriptions to the LLM, and keeps overall context windows under control.
- MindBase: provides the `mindbase_search` and `mindbase_store` MCP tools plus an HTTP API. It runs as a Mac-friendly Docker stack so your editors, agents, and AIRIS can persist or query conversations without touching the cloud.
Because AIRIS only loads tool instructions when the model actually chooses MindBase, you avoid the hot-load issue where twenty richly documented tools explode the prompt budget. MindBase simply exposes concise capabilities; AIRIS decides when and how to surface them.
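The lazy-loading idea can be sketched in a few lines of Python. This is illustrative only, not the actual gateway code, and `LazyToolRegistry` is a name invented here: the roster sent to the model carries only one-line summaries, while a tool's full spec is resolved the first time the model selects it.

```python
# Illustrative sketch of lazy tool-description loading (hypothetical class,
# not the AIRIS MCP Gateway implementation).

class LazyToolRegistry:
    def __init__(self):
        self._summaries = {}  # name -> one-line summary (always in the prompt)
        self._loaders = {}    # name -> callable returning the full spec
        self._cache = {}      # name -> resolved spec, loaded at most once

    def register(self, name, summary, loader):
        self._summaries[name] = summary
        self._loaders[name] = loader

    def roster(self):
        """Compact roster advertised to the LLM up front."""
        return dict(self._summaries)

    def resolve(self, name):
        """Full description, fetched only when the model picks the tool."""
        if name not in self._cache:
            self._cache[name] = self._loaders[name]()
        return self._cache[name]


registry = LazyToolRegistry()
registry.register(
    "mindbase_search",
    "Semantic search over stored conversations",
    lambda: {"params": {"query": "str", "limit": "int", "threshold": "float"}},
)

print(registry.roster())
print(registry.resolve("mindbase_search"))
```

Only the one-line summary costs prompt tokens until the tool is actually used; the detailed parameter schema stays out of the context window entirely for tools the model never touches.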
- Unified timeline: Collectors pull logs from editors, desktop clients, terminal agents, and any bespoke transcripts. Everything lands in one ordered ledger so you can replay how a project evolved across assistants.
- Project & topic intelligence: Stored metadata keeps conversations grouped by project, topic, and source. You can trace a task from brainstorming in Claude Desktop to implementation inside Cursor without manual tagging.
- Semantic memory: Messages are embedded locally through Ollama's qwen3-embedding:8b model and stored in pgvector. MindBase becomes the long-term recall layer for AIRIS tools, MCP servers, or any downstream automation.
- Local-first privacy: Conversations reside only inside your Dockerized PostgreSQL volume. Nothing writes to `~/Library/Application Support` or remote services, so your chat history never leaks into cloud sync folders.
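The recall step can be illustrated without a database. The sketch below mimics what pgvector's cosine-similarity comparison does, with toy 3-dimensional vectors standing in for the real 1024-dimensional qwen3-embedding vectors; the 0.75 cutoff plays the same role as the search API's `threshold` parameter.

```python
# Toy illustration of semantic recall: cosine similarity over stored
# embeddings, keeping only rows above a threshold. Real vectors are
# 1024-dim and the comparison happens inside pgvector, not in Python.
import math

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

stored = {
    "fix staging deploy": [0.9, 0.1, 0.0],
    "plan team offsite":  [0.0, 0.2, 0.9],
}
query = [0.8, 0.2, 0.1]  # pretend embedding of "why is the deploy stuck?"

hits = {
    title: round(cosine_similarity(query, vec), 3)
    for title, vec in stored.items()
    if cosine_similarity(query, vec) >= 0.75  # same role as the API threshold
}
print(hits)  # only the deploy conversation clears the 0.75 cutoff
```

Semantically related text produces nearby vectors, so the deploy conversation scores high against the deploy question while the unrelated offsite thread falls below the threshold.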
```
┌──────────────────────────────────────────────────────────────────────────┐
│ Menubar App (Electron) ✨ NEW: Auto-Collection                           │
│  - File system watcher (FSEvents) for conversation directories           │
│  - Auto-triggers collectors on new conversation detection                │
│  - Toggle: ✅ Auto-Collection Enabled                                    │
│  - Health monitoring with status indicators (🟢🟡🔴)                     │
└───────────────┬──────────────────────────────────────────────────────────┘
                │ Auto-runs Python collectors
                ▼
┌──────────────────────────────────────────────────────────────────────────┐
│ Collectors (Python)                                                      │
│  - Cursor / Windsurf / VS Code logs                                      │
│  - Claude Desktop / Claude Code exports                                  │
│  - ChatGPT / Gemini / terminal agent transcripts                         │
│  - Custom ingestion scripts (`collectors/`)                              │
└───────────────┬──────────────────────────────────────────────────────────┘
                │ JSON payloads → /conversations/store
                ▼
┌──────────────────────────────────────────────────────────────────────────┐
│ MindBase FastAPI (`apps/api`)                                            │
│  - POST /conversations/store                                             │
│  - POST /conversations/search                                            │
│  - GET  /health                                                          │
└───────────────┬──────────────────────────────────────────────────────────┘
                │ SQL + vector writes
                ▼
┌──────────────────────────────────────────────────────────────────────────┐
│ PostgreSQL 17 + pgvector (Docker volume `postgres_data_dev`)             │
│  - Structured metadata (projects, topics, sources, timestamps)           │
│  - Embedding vectors (1024-dim)                                          │
└───────────────┬──────────────────────────────────────────────────────────┘
                │ local embedding calls
                ▼
┌──────────────────────────────────────────────────────────────────────────┐
│ Ollama (qwen3-embedding:8b)                                              │
│  - Fully on-device                                                       │
│  - Runs free of charge                                                   │
└───────────────┬──────────────────────────────────────────────────────────┘
                │ MCP tool bridge
                ▼
┌──────────────────────────────────────────────────────────────────────────┐
│ AIRIS MCP Gateway                                                        │
│  - Registers MindBase tools                                              │
│  - Streams tool descriptions lazily to LLMs                              │
│  - Keeps prompt context efficient                                        │
└──────────────────────────────────────────────────────────────────────────┘
```
```bash
# 1. Clone (adjust the destination if needed)
git clone https://github.com/kazukinakai/mindbase.git ~/github/mindbase
cd ~/github/mindbase

# 2. Copy environment defaults
cp .env.example .env

# 3. Boot Postgres + API + (optional) Ollama container
make up

# 4. Download the embedding model once (~5–10 minutes)
make model-pull

# 5. Apply database migrations
make migrate

# 6. Check service health
make health   # API lives at http://localhost:18002

# 7. (Optional) Run the raw derivation worker
make worker
```

All conversation data lives in the PostgreSQL volume declared in `docker-compose.yml` (`postgres_data_dev`). Remove that volume when you want a clean slate; nothing is written to random App Support folders.
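Automation scripts can gate on the health endpoint before pushing conversations. A minimal sketch, assuming `/health` returns JSON with a `status` field (the exact response shape may differ from this guess):

```python
# Hypothetical readiness check against the local MindBase API.
# Assumes GET /health returns JSON like {"status": "ok"}; adjust the
# accepted values if the real payload differs.
import json
import urllib.error
import urllib.request

API = "http://localhost:18002"

def is_healthy(timeout: float = 2.0) -> bool:
    """True when GET /health answers with an ok-ish status, False otherwise."""
    try:
        with urllib.request.urlopen(f"{API}/health", timeout=timeout) as resp:
            body = json.loads(resp.read())
    except (urllib.error.URLError, OSError, ValueError):
        return False  # stack not running, unreachable, or non-JSON reply
    return str(body.get("status", "")).lower() in {"ok", "healthy"}

print("healthy" if is_healthy() else "stack not reachable (run `make up` first)")
```

Because connection failures are caught, the function degrades to `False` instead of raising when the Docker stack is down, which makes it safe to call from cron jobs or collector wrappers.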
A lightweight Electron menu bar app that automatically collects AI conversations from Claude Code, Cursor, Windsurf, and ChatGPT. It monitors conversation directories and triggers collectors on file changes.
Features:
- Auto-Collection Toggle: Enable/disable via menu bar (✅ Auto-Collection Enabled)
- File System Watcher: Monitors `~/.claude/`, `~/.cursor/`, `~/Library/Application Support/Windsurf/`, etc.
- Auto-Collector Execution: Runs Python collectors when new conversations are detected
- Health Monitoring: Shows API, database, and Ollama status (🟢🟡🔴)
- Quick Commands: One-click `make up/down/logs/worker`
Setup:

```bash
cd apps/menubar
pnpm install   # first run only
pnpm dev       # or from root: pnpm dev:menubar
```

Look for the MindBase icon in your macOS menu bar. Click "Auto-Collection Disabled" to toggle it on. The watcher will start monitoring conversation directories and automatically run collectors when new files are detected.

Use Settings… to update the API base URL, workspace root, repository path, and custom collector definitions (changes sync to the API via the `/settings` endpoint).
Store a conversation:

```bash
curl -X POST http://localhost:18002/conversations/store \
  -H "Content-Type: application/json" \
  -d '{
    "source": "cursor",
    "title": "Fix flaky CI pipeline",
    "project": "platform",
    "topic": "deployments",
    "occurred_at": "2025-10-30T08:45:00Z",
    "content": {
      "messages": [
        {"role": "user", "content": "Why is the staging deploy stuck?"},
        {"role": "assistant", "content": "Investigating build logs..."}
      ]
    },
    "metadata": {
      "editor": "Cursor",
      "branch": "fix-staging-deploy"
    }
  }'
```

Run a semantic search:
```bash
curl -X POST http://localhost:18002/conversations/search \
  -H "Content-Type: application/json" \
  -d '{
    "query": "autonomous PM agent reflection pattern",
    "limit": 8,
    "threshold": 0.75,
    "source": "all",
    "project": "superclaude"
  }'
```

Responses include similarity scores, timestamps, metadata, and the original messages so AIRIS (or any client) can immediately cite the relevant history.
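The same store call can be scripted with nothing but the Python standard library. The helper below is a hypothetical convenience, not part of the MindBase codebase; it only constructs the request object, and actually sending it requires the stack from `make up` to be listening on port 18002.

```python
# Hypothetical helper that builds a POST /conversations/store request.
# Nothing is sent over the network until urlopen() is called.
import json
import urllib.request

API = "http://localhost:18002"

def store_request(source, title, messages, **metadata):
    """Assemble the JSON payload and wrap it in a ready-to-send Request."""
    payload = {
        "source": source,
        "title": title,
        "content": {"messages": messages},
        "metadata": metadata,
    }
    return urllib.request.Request(
        f"{API}/conversations/store",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = store_request(
    "cursor",
    "Fix flaky CI pipeline",
    [{"role": "user", "content": "Why is the staging deploy stuck?"}],
    editor="Cursor",
)
print(req.full_url)                    # http://localhost:18002/conversations/store
print(json.loads(req.data)["source"])  # cursor
# urllib.request.urlopen(req)  # uncomment with the stack running
```

Keeping payload assembly separate from transport makes the helper easy to unit-test offline and to reuse from collectors or batch import scripts.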
- AIRIS MCP Gateway: Run `make install` inside the AIRIS repo (or `superclaude install`) to expose `mindbase_search` and `mindbase_store`. The gateway advertises only the selected tool to the LLM, keeping prompts minimal while MindBase delivers embeddings and payloads on demand.
- Collectors (`collectors/`): Python scripts and templates that read local caches (Claude, Cursor, ChatGPT, Windsurf, etc.), normalize them, and push JSON to `/conversations/store`. Extend them to cover any other assistant or internal agent you run.
- Processors & Generators (`libs/`): TypeScript utilities that transform stored conversations into knowledge packs, retrospectives, or other artifacts. They demonstrate how MindBase can power downstream workflows.
- Schema & migrations (`supabase/`): SQL migrations for PostgreSQL live here so the same schema can be applied locally (via Docker) or to a managed Postgres instance. There is no hosted Supabase dependency; the directory simply keeps schema history tidy.
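A custom collector can be as small as one normalization function. The sketch below assumes a made-up transcript format (a JSON file with `title` and `messages` keys); the real collectors in `collectors/` each handle their editor's own on-disk layout, so treat this as a template rather than a drop-in.

```python
# Skeleton for a custom collector: turn a raw transcript file into the
# /conversations/store payload shape. The input format here is invented
# for illustration.
import json
import tempfile
from datetime import datetime, timezone
from pathlib import Path

def normalize(transcript_path: Path, source: str, project: str) -> dict:
    """Map one transcript file onto the store-endpoint payload."""
    raw = json.loads(transcript_path.read_text())
    return {
        "source": source,
        "title": raw.get("title", transcript_path.stem),
        "project": project,
        "occurred_at": datetime.now(timezone.utc).isoformat(),
        "content": {"messages": raw["messages"]},
        "metadata": {"collector": "custom", "path": str(transcript_path)},
    }

# Demo against a throwaway transcript file.
with tempfile.TemporaryDirectory() as d:
    p = Path(d) / "chat.json"
    p.write_text(json.dumps({
        "title": "Refactor auth",
        "messages": [{"role": "user", "content": "Split the login handler"}],
    }))
    payload = normalize(p, source="my-agent", project="platform")
    print(payload["title"], payload["source"])  # Refactor auth my-agent
```

From here a real collector would POST each payload to `/conversations/store` and remember which files it has already ingested so repeated runs stay idempotent.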
- Conversations, embeddings, and metadata stay inside the Docker-managed Postgres volume. Stop the stack with `make down` and the data persists on disk in that volume.
- All embeddings are generated locally through Ollama, so you never leak prompts or code to third-party APIs.
- If you need off-device backups, dump the database (`pg_dump`) or replicate the Docker volume; there is no hidden shadow copy under `~/Library/Application Support`.
- More collectors for enterprise chat / ticketing tools.
- Automated summarization and recap views for long-running projects.
- MCP-side incremental recall so AIRIS can page in only the slices of memory that the LLM asks for.
- Fine-grained retention and redaction policies per source.
- Export pipelines that turn curated threads into blogs, books, or playbooks.
- Fork the repo and create a feature branch.
- Follow the style guides in `docs/`, `AGENTS.md`, and `CLAUDE.md`.
- Run the validations before opening a PR:

```bash
make lint
make health
pnpm lint
```
- Open the PR with a Conventional Commit title, describe the change, list the commands you ran, and attach API traces or screenshots when relevant.
Explore other tools in the AIRIS ecosystem:
- airis-mcp-gateway - Unified MCP hub with 90% token reduction
- airis-agent - Intelligence layer for AI coding (confidence checks, deep research)
- airis-mcp-supabase-selfhost - Self-hosted Supabase MCP with RLS support
- airis-workspace - Docker-first monorepo manager
- cmd-ime - macOS IME switcher (Cmd key toggle)
- neural - Local LLM translation tool (DeepL alternative)
- airiscode - Terminal-first autonomous coding agent
If you find MindBase helpful, consider supporting its development:
Your support helps maintain and improve all AIRIS projects!
MindBase is released under the MIT License.
Built with ❤️ by the Agiletec team