OpenMemory is a local memory layer for LLM conversations, with a GUI-first workflow.
It lets you keep long-term context in a file store, browse commits/history, and move sessions across models from one interface.
- Python 3.11+
- pip
- (Optional) `gemini-cli` for in-GUI Gemini chat
- (Optional) `ollama` for local models in GUI
- (Optional) Claude Desktop (launch from GUI link)
```bash
git clone <your-repo-url>
cd openmem
python -m venv .venv
source .venv/bin/activate
pip install -e .
```

Create `.env` in the project root:

```
MEMORY_STORE_PATH=/home/<you>/path/to/llm-memory
GEMINI_API_KEY=your_gemini_api_key
```

Notes:

- `MEMORY_STORE_PATH` is where OpenMemory stores sessions/commits.
- `GEMINI_API_KEY` is required when using `gemini-cli` from the GUI.
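The `.env` file is plain `KEY=VALUE` lines. As a minimal sketch of how such a file can be read (the helper name is hypothetical; OpenMemory's actual loader may differ, and in practice a library such as python-dotenv does the same job):

```python
import os

def load_env_file(path=".env"):
    """Hypothetical helper: read KEY=VALUE lines from a .env file
    into os.environ, skipping blank lines and # comments."""
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            os.environ[key.strip()] = value.strip()
```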
Start the GUI:

```bash
memory-gui
```

Open in browser: http://127.0.0.1:8765
- In Settings, choose or browse your memory store path.
- Create a new session (for example: French Revolution Chat).
- Link a model to that session:
  - `gemini-cli` (pinned to `gemini-2.5-flash`)
  - `ollama:<model>` (auto-discovered from `ollama list`)
  - `claude-desktop` (GUI opens Claude Desktop)
- Chat in the GUI (Gemini/Ollama), or in Claude Desktop if Claude is linked.
- Commit staged context (commit metadata is auto-generated by default).
- View the timeline in History and per-commit data in Commit Details.
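The commit step above persists staged context plus generated metadata into the memory store. The on-disk format is internal to OpenMemory; purely as an illustrative assumption (field names, layout, and the helper below are not the real schema), a file-based commit record might look like:

```python
import json
import time
import uuid
from pathlib import Path

def write_commit(store: Path, session: str, messages, summary: str):
    """Illustrative only: persist one commit as a JSON file under
    <store>/<session>/commits/. The schema here is an assumption,
    not OpenMemory's actual format."""
    commit = {
        "id": uuid.uuid4().hex[:12],   # short random commit id
        "session": session,
        "timestamp": time.time(),
        "summary": summary,            # auto-generated in OpenMemory
        "messages": messages,          # the staged context
    }
    commits_dir = store / session / "commits"
    commits_dir.mkdir(parents=True, exist_ok=True)
    path = commits_dir / f"{commit['id']}.json"
    path.write_text(json.dumps(commit, indent=2))
    return path
```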
- Memory store path management
- Session creation/switching
- One-model-per-session linking
- Chat panel + staged context flow
- Auto-generated commit message/summary
- Commit history and detail browsing
- Tag/retrieval/search support through backend memory engine
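The backend memory engine's tag/retrieval/search API is not documented here; as a hedged sketch of the idea only (function and field names are hypothetical), tag-based retrieval over commit metadata amounts to a filter like:

```python
def search_by_tag(commits, tag):
    """Hypothetical sketch: return the commit records whose metadata
    includes the given tag. Assumes each commit is a dict with an
    optional "tags" list; not the real backend API."""
    return [c for c in commits if tag in c.get("tags", [])]
```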
- Restart `memory-gui`.
- Ensure WSL can call Windows tools (`powershell.exe`).
- If needed, set `MEMORY_GUI_PICKER_CMD` manually.
- Confirm `GEMINI_API_KEY` is present in `.env`.
- Restart `memory-gui` after editing `.env`.
- Confirm the `gemini` command works in your shell.
- Ensure `ollama` is installed and running.
- Run `ollama list` in a terminal and verify models exist.
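Several of the checks above boil down to "is the tool on PATH?". A small helper (my own sketch, not part of OpenMemory) using Python's `shutil.which` can report the optional CLIs in one pass:

```python
import shutil

def check_tools(tools=("gemini", "ollama", "powershell.exe")):
    """Report which optional CLIs from the troubleshooting list are
    reachable on PATH. Returns {tool_name: True/False}."""
    return {tool: shutil.which(tool) is not None for tool in tools}

for name, found in check_tools().items():
    print(f"{name}: {'ok' if found else 'NOT FOUND on PATH'}")
```

A `False` entry means the corresponding GUI feature (Gemini chat, local Ollama models, the WSL file picker) will not work until that tool is installed or added to PATH.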
