
A 100% local memory layer for chatbots and agents with an MCP server for Claude, GPT, Gemini, and local models. It auto-saves conversations, ingests documents and markdown vaults, and provides hybrid retrieval (vector + keyword + graph), enterprise security (OAuth2, API keys, rate limiting, audit logs), and integrations (Slack import, Notion/GDrive folder indexing).

Your Memory, Any LLM - Auto-saving conversational memory with MCP support.

┌─────────────────────────────────────────────────────────────┐
│                       EASYMEMORY                            │
├─────────────────────────────────────────────────────────────┤
│                                                             │
│   ┌─────────┐  ┌─────────┐  ┌─────────┐  ┌─────────┐       │
│   │ Claude  │  │   GPT   │  │ Gemini  │  │  Local  │       │
│   └────┬────┘  └────┬────┘  └────┬────┘  └────┬────┘       │
│        │            │            │            │             │
│        └────────────┴─────┬──────┴────────────┘             │
│                           │                                 │
│                    ┌──────▼──────┐                          │
│                    │ MCP Server  │                          │
│                    └──────┬──────┘                          │
│                           │                                 │
│                    ┌──────▼──────┐                          │
│                    │   Memory    │                          │
│                    │   Store     │                          │
│                    └─────────────┘                          │
│                                                             │
└─────────────────────────────────────────────────────────────┘

✨ Features

  • 🔄 Auto-Save: Every conversation automatically saved
  • 🔍 Smart Search: Semantic search across all memories
  • 🧩 Hybrid Retrieval+: Graph + Vector + Keyword + Built-in Local Knowledge Index (no external libs)
  • 📄 Document Support: PDF, DOCX, TXT, Markdown
  • 🔌 MCP Server: Works with Claude, GPT, any MCP-compatible LLM
  • 💾 100% Local: Your data stays on your machine
  • 🏢 Enterprise Security: OAuth2 (Client Credentials), API keys, rate limiting, audit logs
  • 🔗 Integrations: Slack JSON import + Notion/GDrive folder indexing
  • 🚀 Easy Setup: One command to start

📦 Installation

# Clone the repo
git clone https://github.com/JustVugg/easymemory.git
cd easymemory

# Install in development mode
pip install -e .

🚀 Quick Start

Option 1: MCP Server (for Claude Desktop, GPT, etc.)

# Start the MCP server
easymemory-server --port 8100

Then configure your LLM client to connect to http://localhost:8100/mcp.

Health checks:

  • http://localhost:8100/healthz
  • http://localhost:8100/readyz
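A client can gate its startup on these endpoints. A minimal liveness check using only the standard library (a sketch; it assumes the server answers HTTP 200 on /healthz when alive):

```python
import urllib.error
import urllib.request

def is_up(url: str, timeout: float = 2.0) -> bool:
    """Return True if the endpoint answers HTTP 200 within the timeout."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False

# Example: is_up("http://localhost:8100/healthz")
```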

Option 2: Interactive Agent

# With Ollama
easymemory-agent --provider ollama --model llama3.1:8b

# With OpenAI
easymemory-agent --provider openai --model gpt-4

Option 3: Use in Python

import asyncio
from easymemory.agent import EasyMemoryAgent

async def main():
    async with EasyMemoryAgent(
        llm_provider="ollama",
        model="llama3.1:8b"
    ) as agent:
        # Chat - automatically saves everything!
        response = await agent.chat("Hello! Remember that I love Python.")
        print(response)
        
        # Later...
        response = await agent.chat("What do I love?")
        print(response)  # "You mentioned that you love Python!"

asyncio.run(main())

🛠️ MCP Tools

When running as MCP server, these tools are available:

| Tool | Description |
| --- | --- |
| memory_add | Save a note/fact/idea |
| memory_search | Search memories (all, conversations, documents, notes, knowledge, hybrid) |
| memory_add_file | Import PDF, DOCX, TXT, MD |
| memory_index_path | Index a local markdown/txt vault into the internal knowledge index |
| memory_list | List saved memories |
| memory_delete | Delete a specific memory |
| memory_stats | Show statistics |

Example for local vault indexing (no external tools):

{"tool":"memory_index_path","path":"/path/to/vault","recursive":true,"pattern":"*.md"}
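Calls to the other tools follow the same JSON shape. In the sketch below, the field names beyond `tool` are assumptions chosen for illustration, not a documented schema:

```python
import json

# Hypothetical payloads mirroring the memory_index_path example above;
# the exact parameter names per tool may differ.
add_call = {"tool": "memory_add", "text": "Prefer Python for scripting."}
search_call = {"tool": "memory_search", "query": "scripting preferences",
               "search_type": "hybrid"}

print(json.dumps(search_call))
```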

📁 Data Storage

All data is stored locally in:

  • Linux/Mac: ~/.easymemory/data/
  • Windows: C:\Users\<you>\.easymemory\data\
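A helper can resolve the same location in scripts, honoring the EASYMEMORY_DATA_DIR override listed under Configuration (a sketch of the documented layout, not the package's internal resolver):

```python
import os
from pathlib import Path

def data_dir() -> Path:
    """Resolve the EasyMemory data directory, with the documented env override."""
    override = os.environ.get("EASYMEMORY_DATA_DIR")
    return Path(override) if override else Path.home() / ".easymemory" / "data"
```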

🔧 Configuration

Claude Desktop

Add to claude_desktop_config.json:

{
  "mcpServers": {
    "easymemory": {
      "url": "http://localhost:8100/mcp"
    }
  }
}

Environment Variables

# Optional: Custom data directory
export EASYMEMORY_DATA_DIR=/path/to/data

# Optional: Embedding model (default: BAAI/bge-m3)
export EASYMEMORY_EMBED_MODEL=BAAI/bge-m3

# Optional: MCP server config
export EASYMEMORY_HOST=0.0.0.0
export EASYMEMORY_PORT=8100
export EASYMEMORY_LOG_LEVEL=info

# Optional: LLM resilience
export EASYMEMORY_LLM_TIMEOUT=120
export EASYMEMORY_LLM_MAX_RETRIES=2
export EASYMEMORY_EXTRACT_TIMEOUT=60
export EASYMEMORY_EXTRACT_MAX_RETRIES=1

# Enterprise auth/security
export EASYMEMORY_OAUTH_SECRET=change-me-in-prod
export EASYMEMORY_OAUTH_CLIENTS='{"app-prod":{"secret":"supersecret","tenant_id":"team-prod","roles":["reader","writer"]}}'
export EASYMEMORY_ADMIN_TOKEN=another-secret
export EASYMEMORY_RATE_LIMIT_PER_MIN=180
export EASYMEMORY_IMPORT_ROOTS=/srv/knowledge,/home/user/vault
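A deployment script can read these variables with the same defaults. This is a sketch of the documented variables and their defaults, not the server's internal config loader:

```python
import json
import os

def load_settings(env=os.environ) -> dict:
    """Collect the server-related settings above, applying the documented defaults."""
    return {
        "host": env.get("EASYMEMORY_HOST", "0.0.0.0"),
        "port": int(env.get("EASYMEMORY_PORT", "8100")),
        "log_level": env.get("EASYMEMORY_LOG_LEVEL", "info"),
        "rate_limit_per_min": int(env.get("EASYMEMORY_RATE_LIMIT_PER_MIN", "180")),
        # EASYMEMORY_OAUTH_CLIENTS is a JSON object keyed by client_id
        "oauth_clients": json.loads(env.get("EASYMEMORY_OAUTH_CLIENTS", "{}")),
        # EASYMEMORY_IMPORT_ROOTS is a comma-separated list of paths
        "import_roots": [p for p in env.get("EASYMEMORY_IMPORT_ROOTS", "").split(",") if p],
    }
```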

🏭 Startup / Production Checklist

# 1) Install package + deps
pip install -e .

# 2) Run quick sanity checks
python3 -m compileall src
PYTHONPATH=src python3 -m unittest discover -s tests -v

# 3) Start MCP server
easymemory-server --host 0.0.0.0 --port 8100 --log-level info

# 4) Index your local knowledge vault (optional)
easymemory index --path /path/to/vault --pattern "*.md"
# (alias)
easymemory-index --path /path/to/vault --pattern "*.md"

# 5) Run LoCoMo-style benchmark
easymemory-locomo --provider ollama --model gpt-oss:120b-cloud

# 6) Run "proof" benchmark (single-hop, multi-hop, adversarial)
easymemory prove --profiles 80 --seed 7
# (alias)
easymemory-prove --profiles 80 --seed 7

Recommended deployment checks:

  • Use /healthz for liveness probes
  • Use /readyz for readiness probes
  • Persist ~/.easymemory/data on durable volume
  • Set provider/model via environment for reproducible runs
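For the readiness probe, a small poll loop can block a deploy step until the server reports ready (a sketch; it assumes /readyz answers HTTP 200 once ready, per the endpoints above):

```python
import time
import urllib.error
import urllib.request

def wait_ready(base_url: str, timeout_s: float = 30.0, interval_s: float = 1.0) -> bool:
    """Poll /readyz until it returns HTTP 200 or the deadline passes."""
    url = base_url.rstrip("/") + "/readyz"
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        try:
            with urllib.request.urlopen(url, timeout=2) as resp:
                if resp.status == 200:
                    return True
        except (urllib.error.URLError, OSError):
            pass
        time.sleep(interval_s)
    return False
```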

🔐 OAuth2 / Enterprise API

EasyMemory uses the OAuth2 Client Credentials flow.

# 1) token
curl -X POST http://localhost:8100/oauth/token \
  -d "grant_type=client_credentials" \
  -d "client_id=app-prod" \
  -d "client_secret=supersecret" \
  -d "scope=memory:read memory:write"

# 2) query enterprise API
curl -X POST http://localhost:8100/v1/search \
  -H "Authorization: Bearer <ACCESS_TOKEN>" \
  -H "Content-Type: application/json" \
  -d '{"query":"project notes","n_results":10,"search_type":"hybrid"}'
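The same two-step flow in Python, using only the standard library. The request shapes mirror the curl examples above; the access token must still be extracted from the /oauth/token JSON response before the second call:

```python
import json
import urllib.parse
import urllib.request

BASE = "http://localhost:8100"

def token_request(client_id: str, client_secret: str,
                  scope: str = "memory:read memory:write") -> urllib.request.Request:
    """Build the client-credentials token request (form-encoded POST)."""
    form = urllib.parse.urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "scope": scope,
    }).encode()
    return urllib.request.Request(BASE + "/oauth/token", data=form)

def search_request(access_token: str, query: str, n_results: int = 10,
                   search_type: str = "hybrid") -> urllib.request.Request:
    """Build the /v1/search call carrying the bearer token."""
    body = json.dumps({"query": query, "n_results": n_results,
                       "search_type": search_type}).encode()
    req = urllib.request.Request(BASE + "/v1/search", data=body)
    req.add_header("Authorization", f"Bearer {access_token}")
    req.add_header("Content-Type", "application/json")
    return req

# Send either request with urllib.request.urlopen(req) once the server is up.
```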

Enterprise endpoints:

  • POST /v1/notes
  • POST /v1/search
  • POST /v1/index
  • GET /v1/stats
  • POST /v1/integrations/slack/import
  • POST /v1/integrations/notion/import
  • POST /v1/integrations/gdrive/import
  • POST /admin/api-keys (header X-Admin-Token)
  • GET /admin/api-keys (header X-Admin-Token)
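The admin endpoints authenticate with the X-Admin-Token header rather than a bearer token. A minimal request builder, sketched from the endpoint list above:

```python
import urllib.request

def admin_request(base_url: str, admin_token: str,
                  method: str = "GET") -> urllib.request.Request:
    """Build an /admin/api-keys call authenticated via X-Admin-Token."""
    req = urllib.request.Request(base_url.rstrip("/") + "/admin/api-keys",
                                 method=method)
    req.add_header("X-Admin-Token", admin_token)
    return req

# List keys: urllib.request.urlopen(admin_request("http://localhost:8100", token))
```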

✅ GitHub Ready

  • CI workflow included: .github/workflows/ci.yml (Python 3.10/3.11/3.12)
  • License included: LICENSE (MIT)
  • Clean repo defaults: .gitignore for caches/build/local data

📊 Architecture

easymemory/
├── src/easymemory/
│   ├── core/
│   │   ├── memory_store.py   # ChromaDB + embeddings
│   │   └── ingestion.py      # Document processing
│   ├── server.py             # MCP server facade
│   ├── web_ui.py             # Gradio Web UI
│   ├── integration.py        # MCP tools implementation
│   ├── agent.py              # Orchestrator agent
│   └── main.py               # CLI entry points
├── pyproject.toml
└── README.md

📄 License

MIT License - Use it however you want!


Made with 🧠 by the EasyMemory Team
