
⚡ Neural Vault Core

Infinite Long-Term Memory for AI Agents

Cyber-Draco Legacy · Created by Claude Opus 4.6 (Anthropic) · Engineered by getobyte


MIT License · Python 3.10+ · MCP Server · Docker Ready


One memory vault. Every AI tool. Persistent across sessions.

Neural Vault Core is an MCP memory server that gives any AI agent — Claude Code, Cursor, OpenCode, Ollama, ChatGPT, or anything with MCP support — persistent long-term memory. Store a memory from Claude Code, recall it from Cursor. Save context on your homelab, access it from your laptop. Every AI tool shares the same brain.


Why Neural Vault Core?

  • AI agents forget everything between sessions. Every conversation starts from zero. Context is lost, decisions are repeated, knowledge evaporates.
  • Context windows are expensive and finite. You can't paste your entire project history into every prompt.
  • You need ONE memory vault accessible from ANY AI tool. Not a different memory per IDE.
  • Neural Vault Core solves this. A single, fast, searchable memory server that every AI tool on your network connects to.

Features

  • 4 MCP Tools — store, retrieve, search, delete (optimized for minimal token overhead)
  • Auto-versioning — every update saves the previous version (last 5 edit history kept per memory)
  • Permanent storage — memories never expire, never auto-delete. They stay until YOU delete them.
  • Full-text search — SQLite FTS5 with phrase search and LIKE fallback
  • Namespaces — organize memories by project, context, or category
  • Bearer token auth — constant-time comparison on SSE transport
  • Docker-ready — one command to deploy on your homelab LAN
  • CLI included — nvc command-line tool with 12 subcommands for direct management
  • Zero bloat — only 2 dependencies: fastmcp and python-dotenv
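The FTS5-with-LIKE-fallback pattern mentioned above can be sketched in a few lines. This is illustrative only: the `memories` and `memories_fts` table names are assumptions, not the project's actual schema.

```python
import sqlite3

def search(conn: sqlite3.Connection, query: str) -> list[str]:
    """Search content via FTS5; degrade to a LIKE substring match if FTS5 fails."""
    try:
        rows = conn.execute(
            "SELECT key FROM memories_fts WHERE memories_fts MATCH ?", (query,)
        ).fetchall()
    except sqlite3.OperationalError:
        # FTS5 not compiled into this SQLite build (or missing table):
        # fall back to a plain substring scan
        rows = conn.execute(
            "SELECT key FROM memories WHERE content LIKE ?", (f"%{query}%",)
        ).fetchall()
    return [r[0] for r in rows]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE memories (key TEXT PRIMARY KEY, content TEXT)")
try:
    conn.execute("CREATE VIRTUAL TABLE memories_fts USING fts5(key, content)")
    conn.execute("INSERT INTO memories_fts VALUES ('stack', 'Fastify and PostgreSQL')")
except sqlite3.OperationalError:
    pass  # no FTS5 in this build; search() will take the LIKE path
conn.execute("INSERT INTO memories VALUES ('stack', 'Fastify and PostgreSQL')")
print(search(conn, "PostgreSQL"))  # ['stack']
```

Either path returns the same result here, which is the point of the fallback: search keeps working even on SQLite builds without FTS5.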

Installation (Step by Step)

Best deployed on a homelab or server for always-on access from any device on your network. Can also run locally on a single machine.

Step 1 — Clone the repo

git clone https://github.com/getobyte/NeuralVaultCore.git
cd NeuralVaultCore

Step 2 — Generate your API key

You need one secret key. All your AI tools will use this same key to connect.

On Windows (PowerShell):

python -c "import secrets; print('nvc_' + secrets.token_hex(24))"

On Linux/macOS:

python3 -c "import secrets; print('nvc_' + secrets.token_hex(24))"

This prints something like: nvc_a3f8b2c1d4e5f67890abcdef12345678abcdef123456

Copy it. You'll need it in the next step and later for your IDE config.
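If you want to sanity-check a key before pasting it everywhere, the format is just a 4-char prefix plus 48 hex characters. The `looks_valid` helper below is illustrative, not part of NVC:

```python
import re
import secrets

def make_key() -> str:
    # 24 random bytes render as 48 hex characters, prefixed for easy identification
    return "nvc_" + secrets.token_hex(24)

def looks_valid(key: str) -> bool:
    # hypothetical check: "nvc_" prefix followed by exactly 48 lowercase hex chars
    return re.fullmatch(r"nvc_[0-9a-f]{48}", key) is not None

key = make_key()
print(len(key))          # 52 (4-char prefix + 48 hex chars)
print(looks_valid(key))  # True
```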

Step 3 — Create your .env file

cp .env.example .env

Open .env in any text editor and replace the placeholder with your key:

NVC_API_KEY=nvc_a3f8b2c1d4e5f67890abcdef12345678abcdef123456

Step 4 — Start the server

With Docker (recommended):

docker compose up -d --build

That's it. Server is running on port 9999.

Verify it's working:

docker compose ps

You should see neural-vault-core with status Up and (healthy).

Without Docker (local):

python install.py

The installer creates a venv, installs deps, initializes the DB, and sets everything up.

Then run:

python server.py --transport sse --host 0.0.0.0 --port 9999

Step 5 — Find your server IP (for LAN access)

If you're running on a homelab/server and connecting from other machines:

Windows:

ipconfig

Look for IPv4 Address — something like 192.168.1.50

Linux:

hostname -I

If running locally and connecting from the same machine, use 127.0.0.1.


Connect Your AI Tools

Neural Vault Core is configured once as a global MCP server. Every IDE, every CLI, every AI tool connects to the same vault using the same key.

You need two things for every tool:

  • URL: http://YOUR_IP:9999/sse (replace YOUR_IP with your server's IP from Step 5)
  • Key: The nvc_... key you generated in Step 2

Claude Code (in VSCode)

  1. Open VSCode
  2. Press Ctrl+Shift+P (or Cmd+Shift+P on Mac)
  3. Type "Preferences: Open User Settings (JSON)" and select it
  4. Find or create the "mcpServers" section and add:
{
  "mcpServers": {
    "neural-vault-core": {
      "url": "http://192.168.1.50:9999/sse",
      "headers": {
        "Authorization": "Bearer nvc_YOUR_KEY_HERE"
      }
    }
  }
}
  5. Replace 192.168.1.50 with your server IP
  6. Replace nvc_YOUR_KEY_HERE with your actual key from Step 2
  7. Restart VSCode

To test, open Claude Code and type: "Store a memory with key test and content hello world". If you see OK store test [default] 11ch, it works.


Claude Code (CLI / Terminal)

Edit ~/.claude.json (or create it):

Windows path: %USERPROFILE%\.claude.json

{
  "mcpServers": {
    "neural-vault-core": {
      "url": "http://192.168.1.50:9999/sse",
      "headers": {
        "Authorization": "Bearer nvc_YOUR_KEY_HERE"
      }
    }
  }
}

Restart Claude Code CLI. Every new session now has access to your memory vault.


Cursor

  1. Open Cursor
  2. Go to Settings → MCP Servers → Add Server
  3. Add this config:
{
  "mcpServers": {
    "neural-vault-core": {
      "url": "http://192.168.1.50:9999/sse",
      "headers": {
        "Authorization": "Bearer nvc_YOUR_KEY_HERE"
      }
    }
  }
}
  4. Replace IP and key
  5. Restart Cursor

OpenCode

Add to your opencode.json:

{
  "mcpServers": {
    "neural-vault-core": {
      "url": "http://192.168.1.50:9999/sse",
      "headers": {
        "Authorization": "Bearer nvc_YOUR_KEY_HERE"
      }
    }
  }
}

Any Other MCP Client

Any MCP-compatible tool can connect using:

URL:    http://YOUR_IP:9999/sse
Header: Authorization: Bearer nvc_YOUR_KEY_HERE
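If your tool of choice needs the URL and header wired up manually, this is the request every client effectively makes. A minimal sketch that builds (but does not send) the request; the IP and key are placeholders from Steps 2 and 5:

```python
import urllib.request

SERVER = "http://192.168.1.50:9999/sse"  # your server IP from Step 5
API_KEY = "nvc_YOUR_KEY_HERE"            # your key from Step 2

# Construct the authorized request without opening a connection
req = urllib.request.Request(SERVER, headers={"Authorization": f"Bearer {API_KEY}"})
print(req.get_header("Authorization"))  # Bearer nvc_YOUR_KEY_HERE
```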

Local-only Mode (stdio, no auth needed)

If you're running NVC on the same machine as your IDE (no Docker, no network), you can use stdio transport instead. No API key needed — local process = implicit trust.

Windows:

{
  "mcpServers": {
    "neural-vault-core": {
      "command": "C:\\path\\to\\NeuralVaultCore\\venv\\Scripts\\python.exe",
      "args": ["C:\\path\\to\\NeuralVaultCore\\server.py"]
    }
  }
}

Linux/macOS:

{
  "mcpServers": {
    "neural-vault-core": {
      "command": "/path/to/NeuralVaultCore/venv/bin/python",
      "args": ["/path/to/NeuralVaultCore/server.py"]
    }
  }
}

How It Works

Once connected, your AI agent has 4 tools:

Tool               What it does
store_memory       Save or update a memory. Previous content auto-saved as version history.
retrieve_memory    Get a specific memory by its exact key.
search_memories    Search all memories. Empty query = list everything.
delete_memory      Permanently delete a memory and its version history.

Memories are permanent. They never expire, never auto-delete. Only delete_memory removes them.

Versioning is edit history. When you update a memory, the old content is kept (last 5 edits). The memory itself stays forever.
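The "last 5 edits, memory lives forever" behavior can be illustrated with a small sketch. This is hypothetical code, not NVC's actual storage engine:

```python
from collections import deque

class VersionedStore:
    """Illustrative: current value lives forever, history is a bounded ring."""
    def __init__(self, max_versions: int = 5):
        self._current: dict[str, str] = {}
        self._history: dict[str, deque[str]] = {}
        self._max = max_versions

    def store(self, key: str, content: str) -> None:
        if key in self._current:
            # push the outgoing content into a bounded history before overwriting;
            # deque(maxlen=5) silently drops the oldest edit when full
            self._history.setdefault(key, deque(maxlen=self._max)).append(self._current[key])
        self._current[key] = content

    def versions(self, key: str) -> list[str]:
        return list(self._history.get(key, []))

s = VersionedStore()
for i in range(8):
    s.store("note", f"draft {i}")
print(s.versions("note"))  # ['draft 2', 'draft 3', 'draft 4', 'draft 5', 'draft 6']
```

Note that "note" itself still holds "draft 7" and never expires; only the history ring is bounded.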

Example workflow:

You're working in Claude Code on a project:

"Store a memory with key webrapid-stack, namespace projects, content: Fastify v5, TypeScript, Prisma, PostgreSQL, React, Tailwind, shadcn/ui"

Next week, different project, different tab:

"What's my stack for webrapid?"

The AI searches your vault, finds it, knows your stack. No re-explaining. Ever.


CLI Reference

Neural Vault Core includes a full CLI (nvc) for direct management from your terminal. The CLI covers everything, including versioning, stats, and export features that aren't exposed to AI agents.

nvc store my-key "Content here" --tags "tag1,tag2" --title "Title" --ns project
nvc get my-key
nvc search "query" --ns project
nvc list --ns project
nvc delete my-key
nvc delete my-key -y              # skip confirmation
nvc versions my-key               # view edit history
nvc restore my-key 3              # restore version 3
nvc namespaces                    # list all namespaces
nvc export output.json            # export all memories
nvc import input.json             # import from JSON
nvc stats                         # storage statistics
nvc migrate path/to/json/dir      # migrate from ContextKeep

Pipe content from stdin:

cat notes.txt | nvc store my-notes - --title "My Notes"

Architecture

NeuralVaultCore/
├── core/
│   ├── config.py       ← Centralized config (env loading, defaults, validation)
│   ├── models.py       ← Typed dataclasses (Memory, Version, StorageStats)
│   ├── auth.py         ← API key generation, validation, FastMCP middleware
│   └── storage.py      ← Abstract protocol + SQLite implementation
├── server.py           ← MCP server (thin wiring — tools delegate to storage)
├── nvc.py              ← CLI tool (12 subcommands)
├── install.py          ← Installer wizard
├── Dockerfile
├── docker-compose.yml
└── .env.example

Design principles: clean separation (server.py has zero business logic), typed everywhere (dataclasses, not dicts), abstract storage (swap SQLite for PostgreSQL by implementing one class), clean content (store never modifies what you send).


Security

  • API keys: secrets.token_hex(24) with nvc_ prefix (24 random bytes, rendered as 48 hex chars)
  • Validation: secrets.compare_digest — constant-time, prevents timing attacks
  • Auth middleware: runs on every MCP operation over SSE transport
  • stdio: skips auth (local process = implicit trust)
  • SQL: all queries parameterized — zero injection surface
  • Docker: runs as non-root user (UID 1000)
  • Server refuses to start in SSE mode without an API key
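The constant-time comparison above is standard-library Python. A minimal illustration of why it's used for key validation:

```python
import secrets

def check_key(provided: str, expected: str) -> bool:
    # compare_digest takes the same time regardless of where the strings
    # first differ, so response timing leaks nothing about the real key
    return secrets.compare_digest(provided, expected)

real = "nvc_" + secrets.token_hex(24)
print(check_key(real, real))         # True
print(check_key("nvc_wrong", real))  # False
```

A naive `provided == expected` can short-circuit on the first mismatched character, which is the timing side channel this avoids.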

Storage Limits

Field             Max Size
Key               256 chars
Title             512 chars
Content           1 MB
Tags              1,024 chars
Versions kept     5 per key (edit history, not memory lifetime)
Search results    50 per query
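The limits above can be enforced with a simple pre-flight check before storing. An illustrative sketch, not NVC's actual validation code (it treats the 1 MB content limit as a character count for simplicity):

```python
# Hypothetical limits table mirroring the README; names are illustrative
LIMITS = {"key": 256, "title": 512, "content": 1024 * 1024, "tags": 1024}

def validate(field: str, value: str) -> None:
    """Raise ValueError if a field exceeds its maximum size."""
    if len(value) > LIMITS[field]:
        raise ValueError(f"{field} exceeds {LIMITS[field]} chars")

validate("key", "webrapid-stack")  # fine, no exception
try:
    validate("key", "x" * 300)
except ValueError as e:
    print(e)  # key exceeds 256 chars
```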

Ports

Port    Purpose
9999    MCP SSE server (default)
9998    Reserved for future use

Credits


Cyber-Draco Legacy

Neural Vault Core was conceived by getobyte and brought to life by Claude Opus 4.6 (Anthropic).

The entire codebase — architecture, storage engine, auth system, CLI, Docker setup — was written by an AI, engineered and directed by a human. This is what happens when you let the machine cook. 🔥


License

MIT License — Copyright (c) 2025-2026 getobyte