A graph-based memory system for LLMs with intelligent retrieval. MemoGraph provides a powerful solution to the LLM memory problem by combining knowledge graphs, hybrid retrieval, and semantic search.
- 🤖 Smart Auto-Organization Engine: Automatically extract structured information from memories using LLMs
  - Topics, subtopics, and recurring themes
  - People with roles and organizations
  - Action items with assignees and deadlines
  - Decisions, questions, and sentiment analysis
  - Risks, ideas, and timeline events
- Graph-Based Memory: Navigate knowledge using bidirectional wikilinks and backlinks
- Hybrid Retrieval: Combines keyword matching, graph traversal, and optional vector embeddings
- Markdown-Native: Human-readable markdown files with YAML frontmatter
- Memory Types: Support for episodic, semantic, procedural, and fact-based memories
- Smart Indexing: Efficient caching system that only re-indexes changed files
- CLI & Python API: Use via command line or integrate into your Python applications
- Multiple LLM Providers: Works with Ollama, Claude, and OpenAI
- Context Compression: Intelligent token budgeting for optimal context windows
- Salience Scoring: Memory importance ranking for better retrieval
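Conceptually, hybrid retrieval blends several signals into a single ranking, with salience acting as an importance multiplier. The sketch below shows one plausible way to combine keyword hits, graph proximity, and embedding similarity; the weights and formula are illustrative assumptions, not MemoGraph's actual scoring:

```python
def hybrid_score(keyword_hits, graph_distance, cosine_sim, salience,
                 w_kw=0.4, w_graph=0.3, w_vec=0.3):
    """Blend keyword, graph-proximity, and embedding signals, scaled by salience.

    Hypothetical formula for illustration -- not MemoGraph's actual scoring.
    """
    kw = min(keyword_hits / 3.0, 1.0)        # saturate after a few keyword hits
    graph = 1.0 / (1.0 + graph_distance)     # closer in the graph scores higher
    vec = max(cosine_sim, 0.0)               # clamp negative similarity to zero
    base = w_kw * kw + w_graph * graph + w_vec * vec
    return base * (0.5 + 0.5 * salience)     # salience scales, never zeroes out

# A memory one hop from a seed note, with two keyword hits and a decent
# embedding match, boosted by high salience:
score = hybrid_score(keyword_hits=2, graph_distance=1, cosine_sim=0.7, salience=0.9)
```

The key design idea is that no single signal dominates: a note with no keyword overlap can still surface through graph proximity or embedding similarity.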
```bash
pip install memograph
```

Install with optional dependencies:

```bash
# For OpenAI support
pip install memograph[openai]

# For Anthropic Claude support
pip install memograph[anthropic]

# For Ollama support
pip install memograph[ollama]

# For embedding support
pip install memograph[embeddings]

# Install everything
pip install memograph[all]
```

```python
from memograph import MemoryKernel, MemoryType

# Initialize the kernel attached to your vault path
kernel = MemoryKernel("~/my-vault")

# Ingest all notes in the vault
stats = kernel.ingest()
print(f"Indexed {stats['indexed']} memories.")

# Programmatically add a new memory
kernel.remember(
    title="Meeting Note",
    content="Decided to use BFS graph traversal for retrieval.",
    memory_type=MemoryType.EPISODIC,
    tags=["design", "retrieval"]
)

# Retrieve context for an LLM query
context = kernel.context_window(
    query="how does retrieval work?",
    tags=["retrieval"],
    depth=2,
    top_k=8
)
print(context)
```

MemoGraph includes a full-featured MCP server for seamless integration with AI assistants like Cline and Claude Desktop.
| Category | Tools | Description |
|---|---|---|
| Search | search_vault, query_with_context | Semantic search and context retrieval |
| Create | create_memory, import_document | Add memories and import documents |
| Read | list_memories, get_memory, get_vault_info | Browse and retrieve memories |
| Update | update_memory | Modify existing memories |
| Delete | delete_memory | Remove memories by ID |
| Analytics | get_vault_stats | Vault statistics and insights |
| Discovery | list_available_tools | List all available tools |
| Autonomous | auto_hook_query, auto_hook_response, configure_autonomous_mode, get_autonomous_config | Autonomous memory management |
| Graph | relate_memories, search_by_graph, find_path | Graph-native linking and traversal |
| Bulk | bulk_create | Create multiple memories in one call |
Add to your ~/.cline/mcp_settings.json:
```json
{
  "mcp": {
    "servers": {
      "memograph": {
        "command": "python",
        "args": ["-m", "memograph.mcp.run_server"],
        "env": {
          "MEMOGRAPH_VAULT": "/path/to/your/vault"
        }
      }
    }
  }
}
```

Add to your claude_desktop_config.json:
```json
{
  "mcpServers": {
    "memograph": {
      "command": "python",
      "args": ["-m", "memograph.mcp.run_server", "--vault", "/path/to/your/vault"]
    }
  }
}
```

NEW: MemoGraph is now available in the official MCP Registry!
```bash
# Install via MCP CLI (if available)
mcp install io.github.indhar01/memograph
```

Or manually configure in your MCP client:

```json
{
  "mcpServers": {
    "memograph": {
      "command": "python",
      "args": ["-m", "memograph.mcp.run_server"],
      "env": {
        "MEMOGRAPH_VAULT": "~/my-vault"
      }
    }
  }
}
```

Benefits of the MCP Registry:
- ✅ Official registry backed by Anthropic, GitHub, and Microsoft
- ✅ Automatic version updates from PyPI
- ✅ Discoverable by all MCP-compatible clients
- ✅ Verified and trusted installation
See MCP_REGISTRY_GUIDE.md for the complete submission and configuration guide.
Once configured, use natural language with your AI assistant:
"Search my vault for memories about Python"
"Create a memory titled 'Project Ideas' with content '...'"
"Update memory abc-123 to have salience 0.9"
"Delete memory xyz-456"
"What tools are available?"
"Get vault statistics"
See CONFIG_REFERENCE.md for the complete MCP configuration guide.
MemoGraph comes with a powerful CLI for managing your vault and chatting with it.
Index your markdown files into the graph database:
```bash
memograph --vault ~/my-vault ingest
```

Force re-indexing of all files:

```bash
memograph --vault ~/my-vault ingest --force
```

Quickly add a memory from the command line:

```bash
memograph --vault ~/my-vault remember \
  --title "Team Sync" \
  --content "Discussed Q3 goals." \
  --tags planning q3
```

Generate context for a query:

```bash
memograph --vault ~/my-vault context \
  --query "What did we decide about the database?" \
  --tags architecture \
  --depth 2 \
  --top-k 5
```

Start an interactive chat session with your vault context:

```bash
memograph --vault ~/my-vault ask --chat --provider ollama --model llama3
```

Or ask a single question:

```bash
memograph --vault ~/my-vault ask \
  --query "Summarize our design decisions" \
  --provider claude \
  --model claude-3-5-sonnet-20240620
```

Check your environment and connections to LLM providers:

```bash
memograph --vault ~/my-vault doctor
```

MemoGraph supports different types of memories, inspired by cognitive science:
- Episodic: Personal experiences and events (e.g., meeting notes)
- Semantic: Facts and general knowledge (e.g., documentation)
- Procedural: How-to knowledge and processes (e.g., tutorials)
- Fact: Discrete factual information (e.g., configuration values)
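All four types live in the same link graph, and retrieval reaches them by walking that graph breadth-first with a depth limit. A toy sketch of that traversal idea, using a plain adjacency dict (illustrative only, not MemoGraph's internals):

```python
from collections import deque

def bfs_neighborhood(links, seeds, depth):
    """Collect every note reachable from the seed notes within `depth` hops."""
    seen = set(seeds)
    frontier = deque((note, 0) for note in seeds)
    while frontier:
        note, dist = frontier.popleft()
        if dist == depth:
            continue  # don't expand past the depth limit
        for neighbor in links.get(note, []):
            if neighbor not in seen:
                seen.add(neighbor)
                frontier.append((neighbor, dist + 1))
    return seen

# Toy wikilink graph: retrieval -> bfs -> graph -> storage
links = {
    "retrieval": ["bfs", "salience"],
    "bfs": ["graph"],
    "graph": ["storage"],
}
nodes = bfs_neighborhood(links, seeds=["retrieval"], depth=2)
# depth=2 reaches "graph" (2 hops) but not "storage" (3 hops)
```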
The library uses BFS (Breadth-First Search) to traverse your knowledge graph:

```python
# Retrieve nodes with depth=2 (2 hops from seed nodes)
nodes = kernel.retrieve_nodes(
    query="graph algorithms",
    depth=2,   # Traverse up to 2 levels deep
    top_k=10   # Return the top 10 relevant memories
)
```

Each memory has a salience score (0.0–1.0) that represents its importance:
```markdown
---
title: "Critical Architecture Decision"
salience: 0.9
memory_type: semantic
---
We decided to use PostgreSQL for better ACID guarantees...
```

```
MemoGraph/
├── memograph/              # Main package
│   ├── core/               # Core functionality
│   │   ├── kernel.py       # Memory kernel
│   │   ├── graph.py        # Graph implementation
│   │   ├── retriever.py    # Hybrid retrieval
│   │   ├── indexer.py      # File indexing
│   │   └── parser.py       # Markdown parsing
│   ├── adapters/           # LLM and embedding adapters
│   │   ├── embeddings/     # Embedding providers
│   │   ├── frameworks/     # Framework integrations
│   │   └── llm/            # LLM providers
│   ├── storage/            # Storage and caching
│   ├── mcp/                # MCP server implementation
│   └── cli.py              # CLI implementation
├── tests/                  # Test suite
├── examples/               # Example usage
└── scripts/                # Utility scripts
```
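The parser layer splits each note's YAML frontmatter from its markdown body. As a self-contained illustration of that step, here is a minimal splitter for flat `key: value` frontmatter (stdlib only; MemoGraph's actual parser handles full YAML):

```python
def split_frontmatter(text):
    """Split a markdown note into (metadata dict, body).

    Toy version: handles only flat `key: value` pairs, and all
    values stay strings. Not memograph.core.parser.
    """
    if not text.startswith("---\n"):
        return {}, text  # no frontmatter block
    header, sep, body = text[4:].partition("\n---\n")
    if not sep:
        return {}, text  # opening delimiter never closed
    meta = {}
    for line in header.splitlines():
        key, colon, value = line.partition(":")
        if colon:
            meta[key.strip()] = value.strip().strip('"')
    return meta, body.lstrip("\n")

note = '---\ntitle: "Critical Architecture Decision"\nsalience: 0.9\n---\nWe chose PostgreSQL.\n'
meta, body = split_frontmatter(note)
```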
We welcome contributions! Please see our Contributing Guide for details.
- Clone the repository:

  ```bash
  git clone https://github.com/Indhar01/MemoGraph.git
  cd MemoGraph
  ```

- Install in development mode:

  ```bash
  pip install -e ".[all,dev]"
  ```

- Install pre-commit hooks:

  ```bash
  pre-commit install
  ```

- Run tests:

  ```bash
  pytest
  ```
We maintain high code quality standards:
- Linting: Ruff for fast Python linting
- Formatting: Ruff formatter for consistent code style
- Type Checking: MyPy for static type analysis
- Testing: Pytest with comprehensive test coverage
- Pre-commit Hooks: Automated checks before each commit
- MCP Registry Guide - Publishing to official MCP Registry
- AGENTS.md - Guide for AI agents working with this codebase
- Contributing Guide - How to contribute to the project
- Code of Conduct - Community guidelines
- Security Policy - Security reporting and best practices
- Changelog - Version history and changes
See our Security Policy for reporting vulnerabilities.
This project is licensed under the MIT License - see the LICENSE file for details.
Inspired by the need for better memory management in LLM applications. Built with:
- Graph-based knowledge representation
- Hybrid retrieval strategies
- Cognitive science principles
- Issues: GitHub Issues
- Discussions: GitHub Discussions
Current Version: 0.1.0 (Alpha - Marketplace Ready)
This project is in active development with a focus on code quality and stability:
- ✅ Core functionality is stable and tested
- ✅ All linter checks passing (Ruff)
- ✅ Type checking configured (MyPy)
- ✅ Pre-commit hooks enabled
- ✅ Comprehensive test suite

⚠️ The API may change in minor versions until v1.0.0.
Recent Improvements:
- Published to the official MCP Registry (io.github.indhar01/memograph)
- Enhanced code quality with Ruff linting and formatting
- Added comprehensive type checking with MyPy
- Improved project structure and organization
- Updated MCP server with 19 tools including autonomous features and graph operations
- Added AGENTS.md for AI assistant integration
- Created comprehensive MCP Registry submission guide
Made with ❤️ for better LLM memory management