MemoryOS is designed to provide a memory operating system for personalized AI agents, enabling more coherent, personalized, and context-aware interactions. Drawing on memory-management principles from operating systems, it adopts a hierarchical storage architecture with four core modules (Storage, Updating, Retrieval, and Generation) to achieve comprehensive and efficient memory management. On the LoCoMo benchmark, MemoryOS achieves average improvements of 49.11% in F1 and 46.18% in BLEU-1 scores.
- Paper: https://arxiv.org/abs/2506.06326
- Website: https://baijia.online/memoryos/
- Documentation: https://bai-lab.github.io/MemoryOS/docs
- YouTube Video: [MemoryOS MCP + RAG Agent That Can Remember Anything](https://www.youtube.com/watch?v=WHQu8fpEOaU)
- 🏆 **Top Performance in Memory Management**: SOTA results on long-term memory benchmarks, boosting F1 by 49.11% and BLEU-1 by 46.18% on the LoCoMo benchmark.
- 🧠 **Plug-and-Play Memory Management Architecture**: Enables seamless integration of pluggable memory modules, including storage engines, update strategies, and retrieval algorithms.
- ✨ **Create Agent Workflows with Ease (MemoryOS-MCP)**: Inject long-term memory capabilities into various AI applications by calling the modular tools provided by the MCP Server.
- 🌐 **Universal LLM Support**: MemoryOS seamlessly integrates with a wide range of LLMs (e.g., OpenAI, Deepseek, Qwen).
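Because MemoryOS talks to model providers through an OpenAI-compatible interface, switching LLMs is typically just a matter of changing the API key, base URL, and model name passed to `Memoryos(...)` (the `openai_api_key`, `openai_base_url`, and `llm_model` parameters appear in the Quick Start demo below). The sketch below assembles such configurations; the Deepseek and Qwen endpoint URLs and model names are illustrative assumptions, so check your provider's documentation:

```python
# Hypothetical provider presets; the base URLs and model names below are
# assumptions for illustration, not values confirmed by MemoryOS docs.
PROVIDER_PRESETS = {
    "openai":   {"openai_base_url": "https://api.openai.com/v1",
                 "llm_model": "gpt-4o-mini"},
    "deepseek": {"openai_base_url": "https://api.deepseek.com/v1",
                 "llm_model": "deepseek-chat"},
    "qwen":     {"openai_base_url": "https://dashscope.aliyuncs.com/compatible-mode/v1",
                 "llm_model": "qwen-plus"},
}

def build_config(provider: str, api_key: str, user_id: str = "demo_user") -> dict:
    """Assemble keyword arguments that could be splatted into Memoryos(...)."""
    preset = PROVIDER_PRESETS[provider]
    return {"user_id": user_id, "openai_api_key": api_key, **preset}

config = build_config("deepseek", "YOUR_API_KEY")
print(config["llm_model"])  # deepseek-chat
```

The resulting dict can be passed as `Memoryos(**config, ...)` together with the storage and capacity parameters shown in the demo.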
- [new] 🔥🔥 [2025-07-09]: 📊 Evaluation of MemoryOS on the LoCoMo dataset is now publicly available 👉 Reproduce.
- [new] 🔥🔥 [2025-07-08]: 🏆 New Config Parameter
  - New configuration parameter: `similarity_threshold`. For the configuration file, see the 📖 Documentation page.
- [new] 🔥🔥 [2025-07-07]: 🚀 5× Faster
  - The MemoryOS (PyPI) implementation has been upgraded: roughly 5× lower latency through parallelization optimizations.
- [new] 🔥🔥 [2025-07-07]: ✨ R1 Models Now Supported
  - MemoryOS supports configuring and using reasoning models such as Deepseek-R1 and Qwen3.
- [new] 🔥🔥 [2025-07-07]: ✨ MemoryOS Playground Launched
  - The Playground of the MemoryOS Platform has been launched! 👉 MemoryOS Platform. If you need an invitation code, please feel free to reach out 👉 Contact Us.
- [new] 🔥 [2025-06-15]: 🛠️ Open-sourced MemoryOS-MCP released! Now configurable on agent clients for seamless integration and customization. 👉 MemoryOS-MCP.
- [2025-05-30]: 📄 Paper "Memory OS of AI Agent" is available on arXiv: https://arxiv.org/abs/2506.06326.
- [2025-05-30]: Initial version of MemoryOS launched! Featuring short-term, mid-term, and long-term persona memory with automated user profile and knowledge updating.
| Type | Name | Open Source | Support | Configuration | Description |
|---|---|---|---|---|---|
| Agent Client | Claude Desktop | ❌ | ✅ | claude_desktop_config.json | Anthropic official client |
| Agent Client | Cline | ✅ | ✅ | VS Code settings | VS Code extension |
| Agent Client | Cursor | ❌ | ✅ | Settings panel | AI code editor |
| Model Provider | OpenAI | ❌ | ✅ | OPENAI_API_KEY | GPT-4, GPT-3.5, etc. |
| Model Provider | Anthropic | ❌ | ✅ | ANTHROPIC_API_KEY | Claude series |
| Model Provider | Deepseek-R1 | ✅ | ✅ | DEEPSEEK_API_KEY | Chinese large model |
| Model Provider | Qwen/Qwen3 | ✅ | ✅ | QWEN_API_KEY | Alibaba Qwen |
| Model Provider | vLLM | ✅ | ✅ | Local deployment | Local model inference |
| Model Provider | Llama_factory | ✅ | ✅ | Local deployment | Local fine-tuning and deployment |
- ✨ Features
- 🔥 News
- 🔍 Support Lists
- 📁 Project Structure
- 🎯 Quick Start
- ☑️ Todo List
- 🔬 How to Reproduce the Results in the Paper
- 📖 Documentation
- 🌟 Cite

```
memoryos/
├── __init__.py     # Initializes the MemoryOS package
├── __pycache__/    # Python cache directory (auto-generated)
├── long_term.py    # Manages long-term persona memory (user profile, knowledge)
├── memoryos.py     # Main class for MemoryOS, orchestrating all components
├── mid_term.py     # Manages mid-term memory, consolidating short-term interactions
├── prompts.py      # Contains prompts used for LLM interactions (e.g., summarization, analysis)
├── retriever.py    # Retrieves relevant information from all memory layers
├── short_term.py   # Manages short-term memory for recent interactions
├── updater.py      # Processes memory updates, including promoting information between layers
└── utils.py        # Utility functions used across the library
```
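To illustrate the idea behind `short_term.py`, `mid_term.py`, and `updater.py`, here is a deliberately simplified, self-contained sketch of capacity-driven promotion between memory layers. It is not MemoryOS code (MemoryOS additionally uses LLM-based consolidation and a heat threshold before promotion), but it shows the OS-inspired eviction/promotion pattern the module layout implies:

```python
from collections import deque

class TinyHierarchicalMemory:
    """Toy model of MemoryOS-style layered memory (illustrative only)."""

    def __init__(self, short_term_capacity: int = 7):
        self.short_term = deque()              # recent raw dialogue turns
        self.short_term_capacity = short_term_capacity
        self.mid_term = []                     # turns promoted out of short-term

    def add_turn(self, user_input: str, agent_response: str) -> None:
        self.short_term.append((user_input, agent_response))
        if len(self.short_term) > self.short_term_capacity:
            self._promote()                    # updater-style promotion

    def _promote(self) -> None:
        # Evict the oldest turn from short-term memory; in MemoryOS the
        # updater would consolidate/summarize it before storing mid-term.
        oldest = self.short_term.popleft()
        self.mid_term.append(oldest)

mem = TinyHierarchicalMemory(short_term_capacity=3)
for i in range(5):
    mem.add_turn(f"user turn {i}", f"assistant turn {i}")
print(len(mem.short_term), len(mem.mid_term))  # 3 2
```

The real `short_term_capacity` and `mid_term_heat_threshold` parameters in the Quick Start demo below control the analogous behavior in MemoryOS itself.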
- Python >= 3.10

```shell
conda create -n MemoryOS python=3.10
conda activate MemoryOS
```

Install from PyPI:

```shell
pip install memoryos-pro -i https://pypi.org/simple
```

Or install from source:

```shell
git clone https://github.com/BAI-LAB/MemoryOS.git
cd MemoryOS/memoryos-pypi
pip install -r requirements.txt
```
```python
from memoryos import Memoryos

# --- Basic Configuration ---
USER_ID = "demo_user"
ASSISTANT_ID = "demo_assistant"
API_KEY = "YOUR_OPENAI_API_KEY"  # Replace with your key
BASE_URL = ""  # Optional: if using a custom OpenAI endpoint
DATA_STORAGE_PATH = "./simple_demo_data"
LLM_MODEL = "gpt-4o-mini"

def simple_demo():
    print("MemoryOS Simple Demo")

    # 1. Initialize MemoryOS
    print("Initializing MemoryOS...")
    try:
        memo = Memoryos(
            user_id=USER_ID,
            openai_api_key=API_KEY,
            openai_base_url=BASE_URL,
            data_storage_path=DATA_STORAGE_PATH,
            llm_model=LLM_MODEL,
            assistant_id=ASSISTANT_ID,
            short_term_capacity=7,
            mid_term_heat_threshold=5,
            retrieval_queue_capacity=7,
            long_term_knowledge_capacity=100
        )
        print("MemoryOS initialized successfully!\n")
    except Exception as e:
        print(f"Error: {e}")
        return

    # 2. Add some basic memories
    print("Adding some memories...")
    memo.add_memory(
        user_input="Hi! I'm Tom, I work as a data scientist in San Francisco.",
        agent_response="Hello Tom! Nice to meet you. Data science is such an exciting field. What kind of data do you work with?"
    )

    # 3. Ask a question that requires the stored context
    test_query = "What do you remember about my job?"
    print(f"User: {test_query}")
    response = memo.get_response(query=test_query)
    print(f"Assistant: {response}")

if __name__ == "__main__":
    simple_demo()
```
The MCP Server exposes three core memory tools:

- **Add memory**: Saves the content of conversations between the user and the AI assistant into the memory system, building a persistent dialogue history and contextual record.
- **Retrieve memory**: Retrieves related historical dialogues, user preferences, and knowledge from the memory system based on a query, helping the AI assistant understand the user's needs and background.
- **Get user profile**: Returns a user profile generated from analysis of historical dialogues, including the user's personality traits, interest preferences, and relevant knowledge background.
```shell
cd memoryos-mcp
pip install -r requirements.txt
```

Edit `config.json`:

```json
{
  "user_id": "user ID",
  "openai_api_key": "OpenAI API key",
  "openai_base_url": "https://api.openai.com/v1",
  "data_storage_path": "./memoryos_data",
  "assistant_id": "assistant_id",
  "llm_model": "gpt-4o-mini"
}
```

Start the server and run the test suite:

```shell
python server_new.py --config config.json
python test_comprehensive.py
```
Copy the `mcp.json` file over, and make sure the file path is correct. In particular, change the `command` field to point to the Python interpreter of your virtual environment:

```json
"command": "/root/miniconda3/envs/memos/bin/python"
```
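For reference, an MCP client configuration entry for this server might look like the sketch below. The server name (`memoryos`), script and config paths, and interpreter path are illustrative assumptions and must be adapted to your checkout and environment:

```json
{
  "mcpServers": {
    "memoryos": {
      "command": "/root/miniconda3/envs/memos/bin/python",
      "args": [
        "/path/to/MemoryOS/memoryos-mcp/server_new.py",
        "--config",
        "/path/to/MemoryOS/memoryos-mcp/config.json"
      ]
    }
  }
}
```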
```shell
cd eval
# Configure API keys and other settings in the code first
python3 main_loco_parse.py
python3 evalution_loco.py
```
MemoryOS is continuously evolving! Here's what's coming:

- [Ongoing 🚀] Parallelization acceleration of MemoryOS-MCP
- [Ongoing 🚀] Support for embedding models
- [Ongoing 🚀] Support for vector databases and graph databases
- [Ongoing 🚀] Support for faiss-cpu
- [Ongoing 🚀] Docker deployment methods
- [Ongoing 🚀] Integrated benchmarks: standardized benchmark suite with cross-model comparison against Mem0, Zep, and OpenAI
- 🏗️ Enabling seamless memory exchange and integration across diverse systems
- 🔧 More tool operations in MCP: integration with more tools such as modify and delete
- [🎯 Completed] Parallelization acceleration of PyPI: parallel memory retrieval and model inference to reduce latency
- [🎯 Completed, internal testing] MemoryOS Platform: browser-based memory visualization and analytics platform
Have ideas or suggestions? Contributions are welcome! Please feel free to submit issues or pull requests! 🚀
More detailed documentation is coming soon 🚀 and will be published on the Documentation page.
If you find this project useful, please consider citing our paper:
```bibtex
@misc{kang2025memoryosaiagent,
  title={Memory OS of AI Agent},
  author={Jiazheng Kang and Mingming Ji and Zhe Zhao and Ting Bai},
  year={2025},
  eprint={2506.06326},
  archivePrefix={arXiv},
  primaryClass={cs.AI},
  url={https://arxiv.org/abs/2506.06326},
}
```