29 changes: 29 additions & 0 deletions README.md
@@ -159,6 +159,35 @@ uv run agent-memory mcp
uv run agent-memory mcp --mode sse --port 9000 --no-worker
```

### MCP config via uvx (recommended)

Use this in your MCP tool configuration (e.g., Claude Desktop mcp.json):

```json
{
"mcpServers": {
"memory": {
"command": "uvx",
"args": ["--from", "agent-memory-server", "agent-memory", "mcp"],
"env": {
"DISABLE_AUTH": "true",
"REDIS_URL": "redis://localhost:6379",
"OPENAI_API_KEY": "<your-openai-key>"
}
}
}
}
```

Notes:
- API keys: Set either `OPENAI_API_KEY` (default models use OpenAI) or switch to Anthropic by setting `ANTHROPIC_API_KEY` and `GENERATION_MODEL` to an Anthropic model (e.g., `claude-3-5-haiku-20241022`).

- Make sure your MCP host can find `uvx` (on its PATH or by using an absolute command path).
- macOS: `brew install uv`
- If not on PATH, set `"command"` to the absolute path (e.g., `/opt/homebrew/bin/uvx` on Apple Silicon, `/usr/local/bin/uvx` on Intel macOS). On Linux, `~/.local/bin/uvx` is common. See https://docs.astral.sh/uv/getting-started/
- For production, remove `DISABLE_AUTH` and configure proper authentication.
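
An MCP host resolves `"command": "uvx"` with a plain PATH lookup, so you can preview what it will find with a short Python check (illustrative only; any `which`-style lookup works):

```python
# Resolve `uvx` the way an MCP host would: a PATH lookup.
# Prints the absolute path, or a hint to configure one explicitly.
import shutil

path = shutil.which("uvx")
print(path or 'uvx not found on PATH; set "command" to an absolute path')
```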


## Documentation

📚 **[Full Documentation](https://redis.github.io/agent-memory-server/)** - Complete guides, API reference, and examples
2 changes: 1 addition & 1 deletion agent_memory_server/__init__.py
@@ -1,3 +1,3 @@
"""Redis Agent Memory Server - A memory system for conversational AI."""

__version__ = "0.12.4"
__version__ = "0.12.5"
1 change: 1 addition & 0 deletions agent_memory_server/config.py
@@ -235,6 +235,7 @@ class Settings(BaseSettings):
# Cloud
## Cloud region
region_name: str | None = None

## AWS Cloud credentials
aws_access_key_id: str | None = None
aws_secret_access_key: str | None = None
22 changes: 20 additions & 2 deletions agent_memory_server/extraction.py
@@ -5,8 +5,8 @@
import ulid
from tenacity.asyncio import AsyncRetrying
from tenacity.stop import stop_after_attempt
from transformers import AutoModelForTokenClassification, AutoTokenizer, pipeline

# Lazy-import transformers in get_ner_model to avoid heavy deps at startup
from agent_memory_server.config import settings
from agent_memory_server.filters import DiscreteMemoryExtracted, MemoryType
from agent_memory_server.llms import (
@@ -61,9 +61,27 @@ def get_ner_model() -> Any:
"""
global _ner_model, _ner_tokenizer
if _ner_model is None:
# Lazy import to avoid importing heavy ML frameworks at process startup
try:
from transformers import (
AutoModelForTokenClassification,
AutoTokenizer,
pipeline as hf_pipeline,
)
except Exception as e:
logger.warning(
"Transformers not available or failed to import; NER disabled: %s", e
)
raise

_ner_tokenizer = AutoTokenizer.from_pretrained(settings.ner_model)
_ner_model = AutoModelForTokenClassification.from_pretrained(settings.ner_model)
return pipeline("ner", model=_ner_model, tokenizer=_ner_tokenizer)
return hf_pipeline("ner", model=_ner_model, tokenizer=_ner_tokenizer)

# If already initialized, import the lightweight symbol and return a new pipeline
from transformers import pipeline as hf_pipeline # type: ignore

return hf_pipeline("ner", model=_ner_model, tokenizer=_ner_tokenizer)


def extract_entities(text: str) -> list[str]:
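The change above is an instance of a general pattern: defer a heavy optional dependency until first use, cache it, and fail with a clear error when it is missing. A stripped-down sketch of that pattern (module names here are illustrative, not from the diff):

```python
# Generic lazy-import-with-cache pattern, mirroring the
# warning-then-raise behavior get_ner_model uses for transformers.
import importlib

_modules: dict = {}

def lazy_import(name: str):
    """Import `name` on first use, cache it, and raise clearly if absent."""
    if name not in _modules:
        try:
            _modules[name] = importlib.import_module(name)
        except ImportError as e:
            raise RuntimeError(
                f"optional dependency {name!r} unavailable: {e}"
            ) from e
    return _modules[name]
```

The startup win is that `import agent_memory_server.extraction` no longer pulls in an ML framework; the cost is paid on the first NER call instead.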
19 changes: 19 additions & 0 deletions agent_memory_server/long_term_memory.py
@@ -893,6 +893,25 @@ async def search_long_term_memories(
Returns:
MemoryRecordResults containing matching memories
"""
# If no query text is provided, perform a filter-only listing (no semantic search).
# This enables patterns like: "return all memories for this user/namespace".
if not (text or "").strip():
adapter = await get_vectorstore_adapter()
return await adapter.list_memories(
session_id=session_id,
user_id=user_id,
namespace=namespace,
created_at=created_at,
last_accessed=last_accessed,
topics=topics,
entities=entities,
memory_type=memory_type,
event_date=event_date,
memory_hash=memory_hash,
limit=limit,
offset=offset,
)

# Optimize query for vector search if requested.
search_query = text
optimized_applied = False
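The guard added above dispatches on whether any query text was supplied: blank or missing text becomes a filter-only listing rather than a semantic search. A self-contained sketch of that dispatch (the adapter and its method names here are stand-ins for illustration):

```python
# Sketch of the empty-query dispatch: no text means "list by filters",
# anything else goes through semantic search.
import asyncio

class FakeAdapter:
    async def list_memories(self, **filters):
        return ("listing", filters)

    async def semantic_search(self, text, **filters):
        return ("search", text)

async def search(text, adapter, **filters):
    # Treat None, "", and whitespace-only strings as "no query text".
    if not (text or "").strip():
        return await adapter.list_memories(**filters)
    return await adapter.semantic_search(text, **filters)

print(asyncio.run(search("", FakeAdapter(), user_id="u1")))
# → ('listing', {'user_id': 'u1'})
```

This enables patterns like "return all memories for this user/namespace" without crafting a dummy query.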
6 changes: 5 additions & 1 deletion agent_memory_server/mcp.py
@@ -521,8 +521,12 @@ async def search_long_term_memory(
limit=limit,
offset=offset,
)
# Create a background tasks instance for the MCP call
from agent_memory_server.dependencies import HybridBackgroundTasks

background_tasks = HybridBackgroundTasks()
results = await core_search_long_term_memory(
payload, optimize_query=optimize_query
payload, background_tasks=background_tasks, optimize_query=optimize_query
)
return MemoryRecordResults(
total=results.total,
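The explicit `HybridBackgroundTasks()` above exists because MCP tool calls run outside the HTTP request cycle, where the web framework would normally inject a background-tasks object. A minimal stand-in showing the shape of such an object (the real class lives in `agent_memory_server.dependencies`; this is an assumption-labeled sketch, not its implementation):

```python
# Minimal background-tasks collector: queue callables during a request,
# run them after the main response is produced.
import asyncio

class BackgroundTasks:
    def __init__(self):
        self._tasks = []

    def add_task(self, func, *args, **kwargs):
        self._tasks.append((func, args, kwargs))

    async def run_all(self):
        for func, args, kwargs in self._tasks:
            result = func(*args, **kwargs)
            if asyncio.iscoroutine(result):  # support sync and async tasks
                await result

async def main():
    done = []
    tasks = BackgroundTasks()
    tasks.add_task(done.append, "indexed")
    await tasks.run_all()
    print(done)  # → ['indexed']

asyncio.run(main())
```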
6 changes: 6 additions & 0 deletions agent_memory_server/vectorstore_adapter.py
@@ -403,6 +403,12 @@ def parse_datetime(dt_val: str | float | None) -> datetime | None:
# Unix timestamp from Redis
return datetime.fromtimestamp(dt_val, tz=UTC)
if isinstance(dt_val, str):
# Try to parse as float first (Unix timestamp as string)
try:
timestamp = float(dt_val)
return datetime.fromtimestamp(timestamp, tz=UTC)
except ValueError:
pass
# ISO string from other backends
return datetime.fromisoformat(dt_val)
return None
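The branch added above lets Unix timestamps that arrive as strings parse before falling back to ISO-8601. A self-contained sketch of the resulting behavior (simplified signature; the try-float-first ordering means digit-only strings are always treated as timestamps):

```python
# Parse None, numeric timestamps, timestamp strings, or ISO-8601 strings.
from datetime import datetime, timezone

def parse_datetime(dt_val):
    if dt_val is None:
        return None
    if isinstance(dt_val, (int, float)):
        # Unix timestamp from Redis
        return datetime.fromtimestamp(dt_val, tz=timezone.utc)
    if isinstance(dt_val, str):
        try:
            # Unix timestamp serialized as a string, e.g. "1700000000"
            return datetime.fromtimestamp(float(dt_val), tz=timezone.utc)
        except ValueError:
            pass
        # ISO string from other backends
        return datetime.fromisoformat(dt_val)
    return None

print(parse_datetime("0"))  # → 1970-01-01 00:00:00+00:00
```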
45 changes: 18 additions & 27 deletions docker-compose.yml
@@ -18,11 +18,12 @@ services:
image: redislabs/agent-memory-server:${REDIS_AGENT_MEMORY_VERSION:-latest}
ports:
- "8000:8000"
env_file:
- path: .env
required: false
environment:
- REDIS_URL=redis://redis:6379
- PORT=8000
- OPENAI_API_KEY=${OPENAI_API_KEY}
- ANTHROPIC_API_KEY=${ANTHROPIC_API_KEY}
- GENERATION_MODEL=gpt-4o-mini
- EMBEDDING_MODEL=text-embedding-3-small
- LONG_TERM_MEMORY=True
@@ -44,11 +45,12 @@
mcp:
profiles: ["standard", ""]
image: redislabs/agent-memory-server:${REDIS_AGENT_MEMORY_VERSION:-latest}
env_file:
- path: .env
required: false
environment:
- REDIS_URL=redis://redis:6379
- PORT=9050
- OPENAI_API_KEY=${OPENAI_API_KEY}
- ANTHROPIC_API_KEY=${ANTHROPIC_API_KEY}
- GENERATION_MODEL=gpt-4o-mini
- EMBEDDING_MODEL=text-embedding-3-small
- LONG_TERM_MEMORY=True
@@ -66,10 +68,11 @@
task-worker:
profiles: ["standard", ""]
image: redislabs/agent-memory-server:${REDIS_AGENT_MEMORY_VERSION:-latest}
env_file:
- path: .env
required: false
environment:
- REDIS_URL=redis://redis:6379
- OPENAI_API_KEY=${OPENAI_API_KEY}
- ANTHROPIC_API_KEY=${ANTHROPIC_API_KEY}
- GENERATION_MODEL=gpt-4o-mini
- EMBEDDING_MODEL=text-embedding-3-small
- LONG_TERM_MEMORY=True
@@ -94,16 +97,12 @@
image: redislabs/agent-memory-server-aws:${REDIS_AGENT_MEMORY_AWS_VERSION:-latest}
ports:
- "8000:8000"
env_file:
- path: .env
required: false
environment:
- REDIS_URL=redis://redis:6379
- PORT=8000
- OPENAI_API_KEY=${OPENAI_API_KEY}
- ANTHROPIC_API_KEY=${ANTHROPIC_API_KEY}
# AWS Bedrock configuration
- REGION_NAME=${REGION_NAME}
- AWS_ACCESS_KEY_ID=${AWS_ACCESS_KEY_ID}
- AWS_SECRET_ACCESS_KEY=${AWS_SECRET_ACCESS_KEY}
- AWS_SESSION_TOKEN=${AWS_SESSION_TOKEN}
- GENERATION_MODEL=anthropic.claude-haiku-4-5-20251001-v1:0
- EMBEDDING_MODEL=amazon.titan-embed-text-v2:0
- LONG_TERM_MEMORY=True
@@ -125,16 +124,12 @@
mcp-aws:
profiles: ["aws"]
image: redislabs/agent-memory-server-aws:${REDIS_AGENT_MEMORY_AWS_VERSION:-latest}
env_file:
- path: .env
required: false
environment:
- REDIS_URL=redis://redis:6379
- PORT=9050
- OPENAI_API_KEY=${OPENAI_API_KEY}
- ANTHROPIC_API_KEY=${ANTHROPIC_API_KEY}
# AWS Bedrock configuration
- REGION_NAME=${REGION_NAME}
- AWS_ACCESS_KEY_ID=${AWS_ACCESS_KEY_ID}
- AWS_SECRET_ACCESS_KEY=${AWS_SECRET_ACCESS_KEY}
- AWS_SESSION_TOKEN=${AWS_SESSION_TOKEN}
- GENERATION_MODEL=anthropic.claude-haiku-4-5-20251001-v1:0
- EMBEDDING_MODEL=amazon.titan-embed-text-v2:0
- LONG_TERM_MEMORY=True
@@ -152,15 +147,11 @@
task-worker-aws:
profiles: ["aws"]
image: redislabs/agent-memory-server-aws:${REDIS_AGENT_MEMORY_AWS_VERSION:-latest}
env_file:
- path: .env
required: false
environment:
- REDIS_URL=redis://redis:6379
- OPENAI_API_KEY=${OPENAI_API_KEY}
- ANTHROPIC_API_KEY=${ANTHROPIC_API_KEY}
# AWS Bedrock configuration
- REGION_NAME=${REGION_NAME}
- AWS_ACCESS_KEY_ID=${AWS_ACCESS_KEY_ID}
- AWS_SECRET_ACCESS_KEY=${AWS_SECRET_ACCESS_KEY}
- AWS_SESSION_TOKEN=${AWS_SESSION_TOKEN}
- GENERATION_MODEL=anthropic.claude-haiku-4-5-20251001-v1:0
- EMBEDDING_MODEL=amazon.titan-embed-text-v2:0
- LONG_TERM_MEMORY=True
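The `env_file` entries above replace per-variable `${...}` passthrough: Compose loads an optional `.env` from the project directory, so secrets live in one place and no service block needs editing when a key is added (note the long `env_file` syntax with `required: false` needs a reasonably recent Docker Compose v2 release). A sketch of such a file, with placeholder values:

```
# .env — picked up by the env_file sections when present
OPENAI_API_KEY=<your-openai-key>
ANTHROPIC_API_KEY=<your-anthropic-key>
```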
26 changes: 26 additions & 0 deletions docs/getting-started.md
@@ -48,6 +48,32 @@ uv run agent-memory mcp --mode sse --no-worker
uv run agent-memory mcp --mode sse
```

### Using uvx in MCP clients

When configuring MCP-enabled apps (e.g., Claude Desktop), prefer `uvx` so the app can run the server without a local checkout:

```json
{
"mcpServers": {
"memory": {
"command": "uvx",
"args": ["--from", "agent-memory-server", "agent-memory", "mcp"],
"env": {
"DISABLE_AUTH": "true",
"REDIS_URL": "redis://localhost:6379",
"OPENAI_API_KEY": "<your-openai-key>"
}
}
}
}
```

Notes:
- API keys: Default models use OpenAI. Set `OPENAI_API_KEY`. To use Anthropic instead, set `ANTHROPIC_API_KEY` and also `GENERATION_MODEL` to an Anthropic model (e.g., `claude-3-5-haiku-20241022`).
- Make sure your MCP host can find `uvx` (on its PATH or by using an absolute command path). macOS: `brew install uv`. If not on PATH, set `"command"` to an absolute path (e.g., `/opt/homebrew/bin/uvx` on Apple Silicon, `/usr/local/bin/uvx` on Intel macOS).
- For production, remove `DISABLE_AUTH` and configure auth.


**For production deployments**, you'll need to run a separate worker process:

```bash
53 changes: 46 additions & 7 deletions docs/mcp.md
@@ -67,29 +67,68 @@ You can use the MCP server that comes with this project in any application or SD

<img src="../claude.png">

For example, with Claude, use the following configuration:
For Claude, the easiest approach is to use `uvx` (recommended):

```json
{
"mcpServers": {
"redis-memory-server": {
"memory": {
"command": "uvx",
"args": ["--from", "agent-memory-server", "agent-memory", "mcp"],
"env": {
"DISABLE_AUTH": "true",
"REDIS_URL": "redis://localhost:6379",
"OPENAI_API_KEY": "<your-openai-key>"
}
}
}
}
```

Notes:
- API keys: Default models use OpenAI. Set `OPENAI_API_KEY`. To use Anthropic instead, set `ANTHROPIC_API_KEY` and also `GENERATION_MODEL` to an Anthropic model (e.g., `claude-3-5-haiku-20241022`).
- Make sure your MCP host can find `uvx` (on its PATH or by using an absolute command path).
- macOS: `brew install uv`
- If not on PATH, set `"command"` to an absolute path (e.g., `/opt/homebrew/bin/uvx` on Apple Silicon, `/usr/local/bin/uvx` on Intel macOS). On Linux, `~/.local/bin/uvx` is common. See https://docs.astral.sh/uv/getting-started/ for distro specifics
- Set `DISABLE_AUTH=false` in production and configure proper auth per the Authentication guide.

If you’re running from a local checkout instead of PyPI, you can use `uv run` with a directory:

```json
{
"mcpServers": {
"memory": {
"command": "uv",
"args": [
"--directory",
"/ABSOLUTE/PATH/TO/REPO/DIRECTORY/agent-memory-server",
"run",
"agent-memory",
"mcp",
"--mode",
"stdio"
"mcp"
]
}
}
}
```

**NOTE:** On a Mac, this configuration requires that you use `brew install uv` to install uv. Probably any method that makes the `uv`
command globally accessible, so Claude can find it, would work.
Alternative (Anthropic):

```json
{
"mcpServers": {
"memory": {
"command": "uvx",
"args": ["--from", "agent-memory-server", "agent-memory", "mcp"],
"env": {
"DISABLE_AUTH": "true",
"REDIS_URL": "redis://localhost:6379",
"ANTHROPIC_API_KEY": "<your-anthropic-key>",
"GENERATION_MODEL": "claude-3-5-haiku-20241022"
}
}
}
}
```

### Cursor
