OpenMemory

OpenMemory is a local memory layer for LLM conversations, with a GUI-first workflow.

It lets you keep long-term context in a file store, browse commits/history, and move sessions across models from one interface.


Setup

1. Prerequisites

  • Python 3.11+
  • pip
  • (Optional) gemini-cli for in-GUI Gemini chat
  • (Optional) ollama for running local models in the GUI
  • (Optional) Claude Desktop (launched from a GUI link)

2. Install

git clone <your-repo-url>
cd openmem
python -m venv .venv
source .venv/bin/activate
pip install -e .

3. Configure .env

Create .env in project root:

MEMORY_STORE_PATH=/home/<you>/path/to/llm-memory
GEMINI_API_KEY=your_gemini_api_key

Notes:

  • MEMORY_STORE_PATH is where OpenMemory stores sessions and commits.
  • GEMINI_API_KEY is required when using gemini-cli from the GUI.
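OpenMemory reads these settings at startup. If you want to sanity-check your .env, simple KEY=VALUE parsing along these lines is all that's involved (parse_env and load_env are illustrative helpers, not OpenMemory's actual loader):

```python
def parse_env(lines):
    """Parse simple KEY=VALUE lines like the .env above.

    Illustrative only: no quoting, export keywords, or variable expansion.
    Blank lines and #-comments are skipped.
    """
    env = {}
    for raw in lines:
        line = raw.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        env[key.strip()] = value.strip()
    return env


def load_env(path=".env"):
    """Read a .env file into a dict of settings."""
    with open(path) as f:
        return parse_env(f)
```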

4. Start GUI

memory-gui

Open in browser:

http://127.0.0.1:8765
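If the page doesn't load, you can check from a script whether anything is answering on that port. A minimal reachability sketch (gui_is_up is our illustrative helper, not part of OpenMemory):

```python
from urllib.request import urlopen
from urllib.error import URLError


def gui_is_up(url="http://127.0.0.1:8765", timeout=2):
    """Return True if an HTTP server answers at `url` within `timeout` seconds."""
    try:
        with urlopen(url, timeout=timeout):
            return True
    except (URLError, OSError):
        # Connection refused, DNS failure, or timeout: treat all as "not up".
        return False
```

If this returns False right after `memory-gui` starts, check the terminal running `memory-gui` for startup errors.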

First-Time GUI Flow

  1. In Settings, choose or browse your memory store path.
  2. Create a new session (for example: French Revolution Chat).
  3. Link a model to that session:
  • gemini-cli (pinned to gemini-2.5-flash)
  • ollama:<model> (auto-discovered from ollama list)
  • claude-desktop (GUI opens Claude Desktop)
  4. Chat in the GUI (Gemini/Ollama), or in Claude Desktop if Claude is linked.
  5. Commit staged context (commit metadata is auto-generated by default).
  6. View the timeline in History and per-commit data in Commit Details.

What the GUI Includes

  • Memory store path management
  • Session creation/switching
  • One-model-per-session linking
  • Chat panel + staged context flow
  • Auto-generated commit message/summary
  • Commit history and detail browsing
  • Tag/retrieval/search support through the backend memory engine

Troubleshooting (GUI)

Browse button does nothing on WSL

  • Restart memory-gui.
  • Ensure WSL can call Windows tools (powershell.exe).
  • If needed, set MEMORY_GUI_PICKER_CMD manually.

Gemini linked but chat fails

  • Confirm GEMINI_API_KEY is present in .env.
  • Restart memory-gui after editing .env.
  • Confirm gemini command works in your shell.

No Ollama models in dropdown

  • Ensure ollama is installed and running.
  • Run ollama list in terminal and verify models exist.
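If `ollama list` shows models but the dropdown is still empty, you can confirm the Ollama server itself is answering. A minimal sketch against Ollama's local REST API (`/api/tags`, which backs `ollama list`); the helper names here are ours, not OpenMemory's:

```python
import json
from urllib.request import urlopen


def model_names(payload):
    """Extract model names from a parsed /api/tags response body."""
    return [m["name"] for m in payload.get("models", [])]


def list_ollama_models(base="http://127.0.0.1:11434"):
    """Ask the local Ollama server which models are installed
    (the same list that `ollama list` prints)."""
    with urlopen(f"{base}/api/tags", timeout=3) as resp:
        return model_names(json.load(resp))
```

An empty list here means Ollama is running but has no models pulled; a connection error means the server isn't running at all.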

About

Democratizing Context Memory for LLMs. Use the LLM of your choice on the memory of your choice.
