Local-first MCP memory server MVP built with TypeScript, SQLite, and stdio.
- Stores persistent memories in a local SQLite database.
- Exposes `write_memory`, `read_memory`, `forget_memory`, and `export_memory` as MCP tools.
- Supports keyword retrieval by default.
- Supports optional semantic indexing and candidate extraction through an OpenAI-compatible API.
- Tracks append-only memory events and provenance links.
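As a rough illustration, a client invokes one of these tools with a standard MCP `tools/call` request. The envelope below follows the MCP JSON-RPC shape; the argument names (`text`, and `allowExternalModel`, which appears later in this README) are illustrative, not the server's documented schema:

```typescript
// Hypothetical JSON-RPC payload for calling the write_memory tool.
// The "tools/call" envelope is standard MCP; the `arguments` keys are
// assumptions for illustration, not the server's actual schema.
const writeMemoryRequest = {
  jsonrpc: "2.0" as const,
  id: 1,
  method: "tools/call",
  params: {
    name: "write_memory",
    arguments: {
      text: "Prefers dark mode in all editors",
      allowExternalModel: false, // keep inference local
    },
  },
};

console.log(JSON.stringify(writeMemoryRequest, null, 2));
```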
Requirements:
- Node 20+
- npm
Install dependencies:
```
npm install
```

Start the MCP server:

```
npm run dev
```

Start the HTTP MCP server:

```
npm run dev:http
```

Build and run the compiled server:
```
npm run build
npm start
npm run start:http
```

Configuration environment variables:

| Variable | Allowed values | Default |
| --- | --- | --- |
| `MEMORY_DB_PATH` | | `./.anchormem/memory.db` |
| `REMOTE_INFERENCE_MODE` | `off`, `opt-in`, `always` | `off` |
| `OPENAI_COMPAT_BASE_URL` | e.g. `https://api.openai.com/v1` | |
| `OPENAI_COMPAT_API_KEY` | | |
| `OPENAI_COMPAT_EMBEDDING_MODEL` | | |
| `OPENAI_COMPAT_EXTRACTION_MODEL` | | |
| `OPENAI_COMPAT_TIMEOUT_MS` | | `15000` |
| `ANCHORMEM_HTTP_HOST` | | `127.0.0.1` |
| `ANCHORMEM_HTTP_PORT` | | `3000` |
| `ANCHORMEM_HTTP_MODE` | `stateless`, `stateful`, `both` | `both` |
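How the defaults above resolve from the environment can be sketched as follows; the function and field names are illustrative, not the server's actual code:

```typescript
// Sketch of resolving configuration from process.env with the documented
// defaults. Names (resolveConfig, Config) are hypothetical.
type RemoteInferenceMode = "off" | "opt-in" | "always";

interface Config {
  dbPath: string;
  remoteInferenceMode: RemoteInferenceMode;
  timeoutMs: number;
  httpHost: string;
  httpPort: number;
}

function resolveConfig(env: Record<string, string | undefined>): Config {
  return {
    dbPath: env.MEMORY_DB_PATH ?? "./.anchormem/memory.db",
    remoteInferenceMode:
      (env.REMOTE_INFERENCE_MODE ?? "off") as RemoteInferenceMode,
    timeoutMs: Number(env.OPENAI_COMPAT_TIMEOUT_MS ?? 15000),
    httpHost: env.ANCHORMEM_HTTP_HOST ?? "127.0.0.1",
    httpPort: Number(env.ANCHORMEM_HTTP_PORT ?? 3000),
  };
}
```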
Remote inference only runs when both conditions are true:
- `REMOTE_INFERENCE_MODE` is `opt-in` or `always`
- the MCP request includes `allowExternalModel: true`
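The two-condition gate can be sketched as a single predicate; the function name is illustrative, not the server's actual code:

```typescript
// Sketch of the remote-inference gate: both the server-level mode and the
// per-request opt-in must allow it.
type RemoteInferenceMode = "off" | "opt-in" | "always";

function remoteInferenceAllowed(
  mode: RemoteInferenceMode,
  allowExternalModel: boolean,
): boolean {
  return (mode === "opt-in" || mode === "always") && allowExternalModel;
}
```

Note that even with `REMOTE_INFERENCE_MODE=always`, a request that omits `allowExternalModel: true` stays local.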
Sensitive content detection warns on:
- email-like strings
- phone-like strings
- API key-like strings
- SSN-like strings
If sensitive content is detected and the request did not explicitly opt into external inference, the server will not send that text to an external model.
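The four "like" checks above could be implemented with pattern matching along these lines; the regexes here are rough approximations for illustration, and the server's actual patterns may differ:

```typescript
// Sketch of sensitive-content detection. These patterns are illustrative
// approximations of "email-like", "phone-like", "API key-like", and
// "SSN-like" strings, not the server's actual regexes.
const SENSITIVE_PATTERNS: Record<string, RegExp> = {
  email: /[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}/,
  phone: /\b\d{3}[-.\s]\d{3}[-.\s]\d{4}\b/,
  apiKey: /\b(?:sk|pk)[-_][A-Za-z0-9]{16,}\b/,
  ssn: /\b\d{3}-\d{2}-\d{4}\b/,
};

// Returns the kinds of sensitive content found in the text.
function detectSensitive(text: string): string[] {
  return Object.entries(SENSITIVE_PATTERNS)
    .filter(([, pattern]) => pattern.test(text))
    .map(([kind]) => kind);
}
```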
Example environment for an OpenAI-compatible endpoint:
```
export REMOTE_INFERENCE_MODE=opt-in
export OPENAI_COMPAT_BASE_URL=https://api.openai.com/v1
export OPENAI_COMPAT_API_KEY=your_api_key
export OPENAI_COMPAT_EMBEDDING_MODEL=text-embedding-3-small
export OPENAI_COMPAT_EXTRACTION_MODEL=gpt-4o-mini
```

Automated tests keep using stub providers by default. Real API calls are only exercised through manual smoke runs.
Example stdio config:
```json
{
  "mcpServers": {
    "AnchorMem": {
      "command": "node",
      "args": ["--import", "tsx", "src/server.ts"],
      "cwd": "/Users/rikxiao/source/personal_mem_os",
      "env": {
        "MEMORY_DB_PATH": "/Users/rikxiao/source/personal_mem_os/.anchormem/memory.db",
        "REMOTE_INFERENCE_MODE": "off"
      }
    }
  }
}
```

Stdio smoke with real credentials:

```
npm run smoke:provider -- --transport=stdio
```

HTTP smoke with real credentials:

```
npm run dev:http
npm run smoke:provider -- --transport=http --url=http://127.0.0.1:3000/mcp
```

The smoke flow:
- writes a memory with `allowExternalModel: true`
- requests extraction candidates
- performs a semantic read
- fails if vector indexing or semantic retrieval did not occur
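The final pass/fail step of the smoke flow amounts to a check like the sketch below; the result shape and names are illustrative, not the actual script's types:

```typescript
// Sketch of the smoke flow's terminal check: fail the run unless both
// vector indexing and semantic retrieval happened. SmokeResult is a
// hypothetical shape for illustration.
interface SmokeResult {
  vectorIndexed: boolean;
  semanticHits: number;
}

function assertSmokePassed(result: SmokeResult): void {
  if (!result.vectorIndexed) {
    throw new Error("smoke failed: memory was not vector-indexed");
  }
  if (result.semanticHits === 0) {
    throw new Error("smoke failed: semantic read returned no hits");
  }
}
```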
See /Users/rikxiao/source/personal_mem_os/docs/http-transport.md for route details and stateful/stateless behavior.
See /Users/rikxiao/source/personal_mem_os/docs/sync-architecture.md for the next-phase sync architecture and conflict rules.
The database includes:
- `memories`
- `memory_links`
- `memory_events`
- `memory_embeddings`
- `memory_fts`
The data file defaults to `.anchormem/memory.db` relative to the current working directory when the server starts.
Available npm scripts:

- `npm run dev`
- `npm run dev:http`
- `npm run build`
- `npm run start`
- `npm run start:http`
- `npm run smoke:provider`
- `npm run test`
- `npm run typecheck`
This MVP does not include:
- multi-device sync
- UI
- SQLCipher
- cloud storage
- CRDT conflict resolution
- automatic background summarization