Standalone semantic memory and RAG service for AI coding assistants.
Linggen Memory (ling-mem) is a local-first memory engine that provides semantic search, code indexing, vector storage (via LanceDB), and an MCP server. It works as a standalone service for any AI agent — Claude Code, Codex, Cursor, or your own tools.
- Semantic Code Search: Index your codebase and search by meaning, not just keywords.
- Design Memory: Store architectural decisions, ADRs, and tribal knowledge in `.linggen/memory/` as Markdown. AI retrieves them via semantic search.
- System Map: Obsidian-like dependency graph visualization of file relationships.
- Shared Library & Skills: Pre-defined skills (Software Architect, Senior Developer, etc.) for consistent AI behavior.
- MCP Server: Model Context Protocol endpoint at `/mcp/sse` for MCP-enabled IDEs.
- Local-First: All indexing and vector search happens on your machine. Nothing leaves it.
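Design Memory entries are ordinary Markdown files, so you can also seed them by hand before the AI ever touches the project. A minimal sketch — the file name and ADR content below are illustrative, not a required schema:

```shell
# Design memories live as plain Markdown under .linggen/memory/.
# The file name and body here are examples, not a mandated format.
mkdir -p .linggen/memory
cat > .linggen/memory/adr-001-database-choice.md <<'EOF'
# ADR 001: Database choice

We chose PostgreSQL over MongoDB because our data is relational
and we need transactional guarantees.
EOF
```

Once indexed, a prompt like "search memory for our database decision" can retrieve this file semantically.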
```shell
curl -fsSL https://linggen.dev/install-mem.sh | bash
```

Or install a specific version:

```shell
curl -fsSL https://linggen.dev/install-mem.sh | bash -s -- --version v0.7.0
```

To update later, run `ling-mem update`.

```shell
# Start the server (default port 8787)
ling-mem serve

# Start as a background daemon
ling-mem serve --daemon

# Index a codebase
ling-mem index .

# Index with options
ling-mem index /path/to/project --mode full --name my-project

# Check status
ling-mem status

# Stop daemon
ling-mem stop

# Self-update to latest
ling-mem update
```

Open http://localhost:8787 in your browser for the built-in Web UI with graph visualization, search, and memory management.
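When scripting against the daemon, it can help to wait until the server is answering before you index. A minimal sketch, assuming the default port 8787; `wait_for_lingmem` is a hypothetical helper, not a ling-mem command:

```shell
# Hypothetical helper (not part of ling-mem): poll the documented
# /api/status endpoint until the server responds, or give up.
wait_for_lingmem() {
  local tries="${1:-30}"
  for _ in $(seq "$tries"); do
    if curl -fsS "http://localhost:8787/api/status" >/dev/null 2>&1; then
      echo "ling-mem is up"
      return 0
    fi
    sleep 1
  done
  echo "ling-mem did not start" >&2
  return 1
}

# Usage: ling-mem serve --daemon && wait_for_lingmem && ling-mem index .
```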
Linggen Memory works as a standalone service that any AI can connect to.
Install the memory skill:
```shell
# The linggen skill connects your AI to the memory server
ling init --global
```

Example prompts:
"Search Linggen memory for architectural decisions about our database choice."
"Index this project with Linggen and find all authentication-related code."
Add to your MCP config:
```json
{
  "mcpServers": {
    "linggen-memory": {
      "url": "http://localhost:8787/mcp/sse"
    }
  }
}
```

All functionality is available via REST:
```shell
# Search
curl "http://localhost:8787/api/search?q=authentication&limit=5"

# List indexed sources
curl "http://localhost:8787/api/resources"

# Server status
curl "http://localhost:8787/api/status"
```

If you use Linggen Agent (ling), it manages ling-mem for you:
```shell
ling memory start    # starts the ling-mem daemon
ling memory stop     # stops it
ling memory status   # checks status
ling memory index .  # indexes via ling-mem
```

You can also install ling-mem via:

```shell
ling install --memory
```

Related projects:

| Project | Description |
|---|---|
| linggen-memory | Semantic memory engine, RAG backend, MCP server (this repo) |
| linggen-agent | Multi-agent coding assistant with TUI and Web UI |
| linggen-vscode | VS Code extension for graph view and MCP setup |
Linggen Memory is a Rust workspace with 11 crates:
| Crate | Purpose |
|---|---|
| `api` | HTTP API server, MCP endpoint, embedded Web UI (`ling-mem` binary) |
| `core` | Core domain types |
| `ingestion` | File/codebase indexing pipeline |
| `embeddings` | Local embedding model management |
| `storage` | LanceDB vector storage |
| `context` | Context assembly for prompts |
| `enhancement` | Prompt enhancement |
| `architect` | Architecture analysis |
| `intent` | Intent detection |
| `llm` | LLM provider abstraction |
| `mcp-server` | Legacy stdio MCP server (deprecated) |
Frontend: React 19 + TypeScript + Vite + Tailwind v4 with CodeMirror, Cytoscape graph, and Mermaid diagrams.
```shell
# Build frontend
cd frontend && npm ci && npm run build && cd ..

# Build backend (embeds frontend via rust-embed)
cd backend && cargo build --release --bin ling-mem
```

The binary is at `backend/target/release/ling-mem`.
```shell
# Build for current platform
./scripts/build.sh v0.7.0 --skip-linux

# Full release (build + sign + upload to GitHub)
./scripts/release.sh v0.7.0
```

See RELEASES.md for the full release process.
Linggen is open-source under the MIT License.
- Free for individuals: All personal and open-source use.
- Commercial support: For teams (5+ users), see our Pricing Page.
MIT (c) 2026 Linggen