A powerful Model Context Protocol (MCP) server that provides persistent, intelligent memory using Elasticsearch with hierarchical categorization and semantic search capabilities.
## Features

### 🏷️ Hierarchical Memory Categorization
- 5 category types: `identity`, `active_context`, `active_project`, `technical_knowledge`, `archived`
- Automatic category detection with confidence scoring
- Manual reclassification support

### 🤖 Intelligent Auto-Detection
- Accumulative scoring system (0.7-0.95 confidence range)
- 23+ specialized keyword patterns
- Context-aware categorization

### 📦 Batch Review System
- Review uncategorized memories in batches
- Approve/reject/reclassify workflows
- 10x faster than individual categorization

### 🔄 Backward Compatible Fallback
- Seamlessly loads v5 uncategorized memories
- No data loss during upgrades
- Graceful degradation

### 🚀 Optimized Context Loading
- Hierarchical priority loading (~30-40 memories vs. 117)
- 60-70% token reduction
- Smart relevance ranking

### 💾 Persistent Memory
- Vector embeddings for semantic search
- Session management with checkpoints
- Conversation snapshots
## Installation

Install directly from PyPI:

```bash
pip install elasticsearch-memory-mcp
```
### Requirements

- Python 3.8+
- Elasticsearch 8.0+

```bash
# Using Docker (recommended)
docker run -d -p 9200:9200 -e "discovery.type=single-node" elasticsearch:8.0.0

# Or install locally:
# https://www.elastic.co/guide/en/elasticsearch/reference/current/install-elasticsearch.html
```
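Once the node is up, you can sanity-check connectivity before configuring the MCP server. This is a small convenience helper of our own, not part of this package:

```python
import json
from urllib import request

def es_is_up(url: str = "http://localhost:9200") -> bool:
    """Return True if an Elasticsearch node answers at `url`."""
    try:
        with request.urlopen(url, timeout=3) as resp:
            info = json.load(resp)
        # The root endpoint of a healthy node includes a "version" object.
        return "version" in info
    except (OSError, ValueError):
        return False
```

If this returns `False`, check that the Docker container is running and that port 9200 is not blocked.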
## Configuration

### Claude Desktop

Add to `~/.config/Claude/claude_desktop_config.json`:

```json
{
  "mcpServers": {
    "elasticsearch-memory": {
      "command": "uvx",
      "args": ["elasticsearch-memory-mcp"],
      "env": {
        "ELASTICSEARCH_URL": "http://localhost:9200"
      }
    }
  }
}
```

> **Note:** If you don't have `uvx`, install it with `pip install uvx`, or use `python -m elasticsearch_memory_mcp` instead.
### Claude Code (CLI)

```bash
claude mcp add elasticsearch-memory uvx elasticsearch-memory-mcp \
  -e ELASTICSEARCH_URL=http://localhost:9200
```
## Development Setup

If you want to contribute or modify the code:

```bash
# Clone the repository
git clone https://github.com/fredac100/elasticsearch-memory-mcp.git
cd elasticsearch-memory-mcp

# Create a virtual environment
python3 -m venv venv
source venv/bin/activate

# Install in development mode
pip install -e .
```

Then configure MCP to point at your local installation:

```json
{
  "mcpServers": {
    "elasticsearch-memory": {
      "command": "/path/to/venv/bin/python",
      "args": ["-m", "mcp_server"],
      "env": {
        "ELASTICSEARCH_URL": "http://localhost:9200"
      }
    }
  }
}
```
## Tools

### Save a memory

Saves a new memory with automatic categorization:

```json
{
  "content": "Fred prefers direct, brutal communication style",
  "type": "user_profile",
  "importance": 9,
  "tags": ["communication", "preference"]
}
```
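For reference, a payload like the one above can be assembled and validated before it is handed to the tool. `build_memory` is an illustrative helper of our own (assuming a 1-10 importance scale), not part of the server's API:

```python
def build_memory(content: str, memory_type: str = "fact",
                 importance: int = 5, tags=None) -> dict:
    """Assemble a save-memory payload (assumes a 1-10 importance scale)."""
    if not 1 <= importance <= 10:
        raise ValueError("importance must be between 1 and 10")
    return {
        "content": content,
        "type": memory_type,
        "importance": importance,
        "tags": list(tags or []),
    }

payload = build_memory(
    "Fred prefers direct, brutal communication style",
    memory_type="user_profile",
    importance=9,
    tags=["communication", "preference"],
)
```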
### Load context

Loads hierarchical context with:

- Identity memories (who you are)
- Active context (current work)
- Active projects (ongoing work)
- Technical knowledge (relevant facts)
### review_uncategorized_batch

Reviews uncategorized memories in batches:

```json
{
  "batch_size": 10,
  "min_confidence": 0.6
}
```

Returns suggestions with auto-detected categories and confidence scores.
### apply_batch_categorization

Applies categorizations in batch after review:

```jsonc
{
  "approve": ["id1", "id2"],          // Auto-categorize
  "reject": ["id3"],                  // Skip
  "reclassify": {"id4": "archived"}   // Force category
}
```
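The three decision lists map onto a single pass over the reviewed batch. A minimal sketch of these semantics, assuming each reviewed memory is a dict with an `id` and a `suggested_category` (illustrative field names, not the server's schema):

```python
def apply_decisions(memories, approve=(), reject=(), reclassify=None):
    """Apply batch-review decisions; return {id: final_category}.

    - approve: keep the auto-detected suggestion
    - reject: skip (memory stays uncategorized)
    - reclassify: force an explicit category
    """
    reclassify = reclassify or {}
    result = {}
    for mem in memories:
        mid = mem["id"]
        if mid in reclassify:
            result[mid] = reclassify[mid]
        elif mid in approve:
            result[mid] = mem["suggested_category"]
        # rejected / undecided ids are left out and stay uncategorized
    return result
```

A rejected memory simply remains in the uncategorized pool, so it reappears in the next review batch.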
### Search memories

Semantic search with filters:

```json
{
  "query": "SAE project details",
  "limit": 5,
  "category": "active_project"
}
```
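Under the hood, a filtered semantic search of this shape can be expressed as an Elasticsearch `script_score` query over the stored vectors. This is a sketch of one plausible query body; the field names `embeddings` and `memory_category` are taken from the architecture diagram below, but the server's actual query construction may differ:

```python
def build_search_body(query_vector, category=None, limit=5):
    """Build a cosine-similarity search body with an optional category filter."""
    filters = []
    if category:
        filters.append({"term": {"memory_category": category}})
    return {
        "size": limit,
        "query": {
            "script_score": {
                "query": {"bool": {"filter": filters}} if filters else {"match_all": {}},
                "script": {
                    # cosineSimilarity ranges over [-1, 1]; +1 keeps scores non-negative
                    "source": "cosineSimilarity(params.qv, 'embeddings') + 1.0",
                    "params": {"qv": query_vector},
                },
            }
        },
    }
```

The `query` string itself would first be embedded (e.g. with the same Sentence Transformers model used at save time) to produce `query_vector`.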
### Auto-categorize

Batch auto-categorizes uncategorized memories:

```json
{
  "max_to_process": 50,
  "min_confidence": 0.75
}
```
## Architecture

```
┌─────────────────┐
│   Claude (MCP)  │
└────────┬────────┘
         │
         ▼
┌─────────────────────────────┐
│      MCP Server (v6.2)      │
│  ┌───────────────────────┐  │
│  │ Auto-Detection        │  │
│  │ - Keyword matching    │  │
│  │ - Confidence score    │  │
│  └───────────────────────┘  │
│                             │
│  ┌───────────────────────┐  │
│  │ Batch Review          │  │
│  │ - Review workflow     │  │
│  │ - Bulk operations     │  │
│  └───────────────────────┘  │
└────────────┬────────────────┘
             │
             ▼
┌──────────────────────────────┐
│        Elasticsearch         │
│  ┌────────────────────────┐  │
│  │ memories (index)       │  │
│  │ - embeddings (vector)  │  │
│  │ - memory_category      │  │
│  │ - category_confidence  │  │
│  └────────────────────────┘  │
└──────────────────────────────┘
```
## Memory Categories

| Category | Description | Examples |
|---|---|---|
| `identity` | Core identity, values, preferences | "Fred prefers brutal honesty" |
| `active_context` | Current work, recent conversations | "Working on SAE implementation" |
| `active_project` | Ongoing projects | "Mirror architecture design" |
| `technical_knowledge` | Facts, configs, tools | "Elasticsearch index settings" |
| `archived` | Completed, deprecated, old migrations | "Refactored old auth system" |
### Auto-Detection Examples

```
"Fred prefers brutal communication"     → identity (0.9)
"SAE system refactoring completed"      → archived (0.85)
"Next steps: implement the dashboard"   → active_context (0.8)
```

Accumulative scoring combines multiple matches within the same memory:

```
"Fred prefers brutal communication. First time using this style."
→ Match 1: "Fred prefers" (+0.9)
→ Match 2: "first time"   (+0.8)
→ Total: 0.95 (normalized)
```
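The accumulative scorer can be sketched as follows. The keyword table, the diminishing-returns accumulation rule, and the 0.95 cap are illustrative assumptions; the real server ships 23+ specialized patterns:

```python
# Hypothetical pattern table: keyword -> weight, per category.
PATTERNS = {
    "identity": [("fred prefers", 0.9), ("first time", 0.8)],
    "archived": [("refactoring", 0.5), ("completed", 0.85)],
    "active_context": [("next steps", 0.8)],
}

def detect_category(content: str, cap: float = 0.95):
    """Return (category, confidence) using accumulative keyword scoring."""
    text = content.lower()
    best = ("uncategorized", 0.0)
    for category, patterns in PATTERNS.items():
        score = 0.0
        for keyword, weight in patterns:
            if keyword in text:
                # Each extra match closes part of the remaining gap to 1.0,
                # so confidence grows but never runs away.
                score = score + (1.0 - score) * weight if score else weight
        score = min(score, cap)  # normalize into the 0.7-0.95 reporting range
        if score > best[1]:
            best = (category, score)
    return best
```

Run against the worked example above, two matches (0.9, then +0.8 of the remaining gap) exceed the cap and normalize to 0.95.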
## Migration from v5

The v6.2 system includes an automatic fallback for v5 memories:

- Uncategorized memories → loaded via type/tags fallback
- Visual separation → categorized vs. fallback sections
- Batch review → categorize old memories efficiently

```python
# Review and categorize v5 memories
review_uncategorized_batch(batch_size=20)
apply_batch_categorization(approve=[...])
```
## Performance

- Load initial context: ~10-15s (includes embedding model load)
- Save memory: <1s
- Search: <500ms
- Batch review (10 items): ~2s
- Auto-categorize (50 items): ~5s
## Testing

```bash
# Run the quick test
python test_quick.py

# Expected output:
# ✅ Elasticsearch connected
# ✅ Context loaded
# ✅ Identity memories found
# ✅ Projects separated from fallback
```
## What's New

- ✅ Improved auto-detection (0.4 → 0.9 confidence)
- ✅ 23 new specialized keywords
- ✅ Batch review tools (`review_uncategorized_batch`, `apply_batch_categorization`)
- ✅ Visual separation (categorized vs. fallback)
- ✅ Accumulative confidence scoring
- ✅ Fallback mechanism for uncategorized memories
- ✅ Backward compatibility with v5
- ✅ Memory categorization system
- ✅ Hierarchical context loading
- ✅ Auto-detection with confidence
## Contributing

Contributions are welcome! Please:

1. Fork the repository
2. Create a feature branch (`git checkout -b feature/amazing-feature`)
3. Commit your changes (`git commit -m 'Add amazing feature'`)
4. Push to the branch (`git push origin feature/amazing-feature`)
5. Open a Pull Request
## License

This project is licensed under the MIT License - see the LICENSE file for details.

## Acknowledgments

- Built with the Model Context Protocol (MCP)
- Powered by Elasticsearch
- Embeddings by Sentence Transformers

## Support

- Issues: GitHub Issues
- Discussions: GitHub Discussions

Made with ❤️ for the Claude ecosystem