---
title: "Integrating OpenAI Agent Builder with MemMachine MCP Server"
date: 2025-11-03T17:40:16-06:00
featured_image: "featured_image.png"
tags: ["AI Agent", "MCP", "Generative AI", "Agent Memory", "OpenAI", "OpenAI Builder", "featured"]
author: "Decheng Xu"
description: "Integrating OpenAI Agent Builder with MemMachine MCP Server is easy with our step-by-step guide."
aliases:
---

Integrating OpenAI Agent Builder with the MemMachine MCP server allows your AI agents to store and recall information, effectively giving them memory.

The setup connects your locally hosted MemMachine MCP endpoint with OpenAI’s cloud-based Agent Builder workflow system.

## Prerequisites

Before you begin, ensure the following are ready:

- A MemMachine MCP HTTP server running locally or remotely and listening on port 8080 (the default).
- An OpenAI account with access to Agent Builder.
- ngrok or a similar tool to expose your local MCP server to the internet.

## Step 1: Start the MemMachine MCP Server

Start your MCP HTTP server with:

```bash
export MEMORY_CONFIG=/path/to/configuration.yml
uv run python -m memmachine.server.mcp_http --host 0.0.0.0 --port 8080
```

Ensure the server is running and accessible.
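
For a quick sanity check before exposing anything, you can hit the endpoint with curl. This is only a reachability test, not a full MCP handshake; an HTTP error response still tells you the process is up and listening:

```bash
# Reachability check only: any HTTP response (even a 4xx about the
# method or Accept headers) means the server is listening on port 8080.
curl -i http://localhost:8080/mcp/
```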

## Step 2: Open the OpenAI Agent Builder

In your browser, navigate to the OpenAI Agent Builder at https://platform.openai.com/agent-builder. Create a new workflow or open an existing one.

![OpenAI Agent Builder](./openai-agent-builder-new-workflow.png)

## Step 3: Configure MCP Endpoint in Agent Builder

On the workflow canvas, you’ll see a Start node and a default agent block.
Configure the agent block as follows:

- Name: MemMachineMCP
- Instructions: You are an intelligent memory assistant powered by MemMachine
- Model: gpt-5
- Reasoning Effort: Low
- Include Chat History: Enabled

![Agent Configuration](./openai-agent-builder-configure-agent.png)

## Step 4: Add MCP Memory Tools

Click '+' next to the agent block to add tools. Search for "MCP" and add the following tools:

![MCP Tools](./openai-agent-builder-add-new-mcp-tools.png)

Then click "+ Server" to add your own connection.

![Add MCP Server](./openai-agent-builder-add-new-server.png)

## Step 5: Configure MCP Server Connection

Run this command if you haven’t started ngrok yet:

```bash
ngrok http 8080
```

This command will create a secure tunnel to your local MCP server, providing you with a public URL.

![ngrok MCP configuration](./ngrok-configuration.png)

> Important: Ensure your ngrok URL includes the `/mcp/` suffix, for example: `https://abc123.ngrok-free.dev/mcp/`

You’ll now see a form titled “Connect to MCP Server”. Fill it out as follows:

- MCP Endpoint URL: `https://your-ngrok-url.ngrok-free.dev/mcp/`
- Label: `Memmachine_MCP`
- Authentication Type: Custom Headers
- Custom Headers:
  - Key: `user-id`
  - Value: `user`

Click "Connect" to save the configuration.

> Note: This `user-id` header is used by MemMachine MCP to associate memory operations (add/search) with a specific user.

![MCP Server Connection](./openai-agent-build-connect-to-an-mcp-server.png)
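
If the connection fails, you can reproduce roughly what Agent Builder sends with a hand-rolled MCP initialize request. This is a hedged smoke test that assumes the server speaks MCP's streamable HTTP transport; replace the placeholder URL with your own ngrok address:

```bash
# Send a JSON-RPC "initialize" request with the same user-id header that
# Agent Builder is configured to use. A JSON-RPC result (rather than a
# connection or 404 error) suggests the tunnel, the /mcp/ path, and the
# header are all wired up correctly.
curl -sS -X POST "https://your-ngrok-url.ngrok-free.dev/mcp/" \
  -H "Content-Type: application/json" \
  -H "Accept: application/json, text/event-stream" \
  -H "user-id: user" \
  -d '{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2024-11-05","capabilities":{},"clientInfo":{"name":"curl-check","version":"0.0.1"}}}'
```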

## Step 6: Approve Tools and Add Server

After connecting, approve the MCP tools from the newly added server:

- `add_memory` → `Memmachine_MCP`
- `search_memory` → `Memmachine_MCP`

Select "Always require approval for all tool calls", then click "Add".

![Approve MCP Tools](./openai-agent-builder-approve-tools.png)

## Step 7: Test the Integration

Once your MemMachine MCP server is added, it will appear under “Tools” for your agent.

Try these example interactions in the Preview panel:

### Add Memory

In the chat input, type:

```text
I recently bought AirPods Pro 3.
```

### Search Memory

Then ask:

```text
What did I buy recently?
```

You should see the agent successfully storing and retrieving information via the MemMachine MCP server.

## Conclusion

By integrating OpenAI Agent Builder with MemMachine MCP, you empower your AI agents with persistent memory capabilities. This setup allows agents to remember past interactions, enhancing their usefulness and user experience.

## Recommendations

- User ID Consistency: Use consistent user IDs across different sessions to maintain memory continuity.
  - Why this matters: Keeping a consistent user ID ensures that your memory context stays linked across clients such as Claude Desktop and OpenAI Agent Builder.
- ngrok: For production use, consider deploying your MCP server on a stable hosting solution rather than relying on ngrok for long-term accessibility.
- Authentication: Use "Custom Headers" for authentication to easily manage user identities. In the future, OAuth support will be added for enhanced security.
---
title: "MemMachine MCP: Bridging Multiple AI Platforms with Shared Memory"
date: 2025-11-03T17:40:16-06:00
featured_image: "featured_image.png"
tags: ["AI Agent", "MCP", "Generative AI", "Agent Memory", "featured"]
author: "Decheng Xu"
description: "In a groundbreaking performance result, MemMachine reaches new heights on the challenging LoCoMo benchmark, setting a new standard for long-term conversational memory in AI agents. Discover how our multi-layered memory system is revolutionizing the field."
aliases:
---

AI agents are getting smarter, but there's a critical challenge: how do you enable different agents across different platforms to share the same memory? What if your OpenAI workflow, Claude Desktop assistant, and custom agents could all access the same user context seamlessly?

Working with Cedric Zhuang from the MemMachine engineering team, we've built a solution that does exactly that, using the Model Context Protocol (MCP) as a universal memory backend.

## The Challenge: Memory Silos

Most AI implementations today suffer from memory fragmentation. Your ChatGPT conversation knows one thing, your Claude assistant knows another, and your custom workflow remembers something entirely different. This creates frustration for users who expect AI to remember context across platforms.

The solution? A shared memory layer that multiple agents can read from and write to, regardless of which platform they're running on.

## Enter MemMachine MCP

MemMachine's MCP implementation acts as a universal memory Resource Server that any MCP-compatible client can connect to. By using a consistent user identifier (MM_USER_ID, sent as the `user-id` header), different agents can access the same memory context, creating a truly unified AI experience.

In a [previous blog post](/blog/2025/11/integrating-openai-agent-builder-with-memmachine-mcp-server/), we demonstrated how to integrate OpenAI Agent Builder with MemMachine MCP. Now, let's explore how multiple AI platforms can share memory through MCP.

### How It Works: A Hospital Demo

To demonstrate this capability, we built a simple hospital scenario with three distinct interactions:

1. **Front Desk Workflow (OpenAI Agent Builder)**
The Front Desk Agent registers a new patient named John Doe, age 45, with contact 555-1234 and a chief complaint of headache. This information is stored using the `add_memory` tool with user ID `patient-001`.

```text
Prompt: Patient John Doe, age 45, contact: 555-1234, chief complaint: headache
```

![Front Desk Agent registering patient](front-desk-workflow.png)

2. **Doctor Workflow (OpenAI Agent Builder)**
The Doctor Agent retrieves the patient's information using `search_memory`.

```text
Prompt: What is the patient’s chief complaint and basic info?
```

![Doctor Agent retrieving patient info](doctor-workflow.png)

The doctor diagnoses the patient with a migraine, prescribes Ibuprofen 400mg three times daily, and schedules a follow-up in 2 weeks. All updates are saved back to the shared memory using the same `patient-001` identifier.

```text
Diagnosis: Migraine
Prescription: Ibuprofen 400mg, three times daily
Follow-up: In 2 weeks
```

![Doctor Agent updating patient record](doctor-workflow2.png)

3. **Patient Query (Claude Desktop)**
Later, the patient opens Claude Desktop and asks for a second opinion. Because Claude is configured with the same MM_USER_ID (`patient-001`), it instantly retrieves the complete medical record—including the original complaint, diagnosis, and prescription—all through the same MCP endpoint.

![Claude Desktop retrieving patient record](patient-request.png)

## The Key: Consistent User Identity

The magic happens through the `user-id` header. Every client—whether it's OpenAI Agent Builder, Claude Desktop, or a custom application—uses the same user identifier when connecting to the MemMachine MCP server. This ensures that memory operations (add/search) are consistently associated with the right user context.
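
As a concrete illustration, here is one way to point Claude Desktop at the shared endpoint. This is a hedged sketch rather than MemMachine's documented setup: it assumes you bridge the remote endpoint through the community `mcp-remote` package (which can forward custom headers via `--header`), and it reuses the demo's `patient-001` identifier:

```json
{
  "mcpServers": {
    "memmachine": {
      "command": "npx",
      "args": [
        "mcp-remote",
        "https://your-ngrok-url.ngrok-free.dev/mcp/",
        "--header",
        "user-id: patient-001"
      ]
    }
  }
}
```

Any other client that sends the same header value will read and write the same memory context.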

## Setting Up the Integration

### Prerequisites

- MemMachine MCP HTTP server running locally or remotely on port 8080
- ngrok to expose your local endpoint to the internet
- Access to OpenAI Agent Builder or Claude Desktop

### Quick Start

1. Start your MCP server:

```bash
export MEMORY_CONFIG=/path/to/configuration.yml
uv run python -m memmachine.server.mcp_http --host 0.0.0.0 --port 8080
```

2. Expose it via ngrok:

```bash
ngrok http 8080
```

3. Connect your AI platform to the MCP endpoint at `https://your-ngrok-url.ngrok-free.dev/mcp/`
4. Configure custom headers with `user-id: <your-user-id>`

### Available Tools

- `add_memory`: Store new information in the user's profile
- `search_memory`: Retrieve relevant memories based on queries
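
To make the tool surface concrete, here is a minimal, hedged sketch of a custom client calling these tools through the official MCP Python SDK. It assumes the streamable HTTP transport and forwards the same `user-id` header; the argument names (`content`, `query`) are placeholders, and the authoritative schemas are whatever `list_tools()` reports from your server:

```python
import asyncio

from mcp import ClientSession
from mcp.client.streamable_http import streamablehttp_client

MCP_URL = "https://your-ngrok-url.ngrok-free.dev/mcp/"  # your tunnel or host
HEADERS = {"user-id": "patient-001"}  # the shared identifier every client uses


async def main() -> None:
    # Open the streamable HTTP transport, passing the user-id header.
    async with streamablehttp_client(MCP_URL, headers=HEADERS) as (read, write, _):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # Inspect the tools (and schemas) the server actually exposes.
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])

            # Placeholder arguments: adjust keys to match the schemas above.
            await session.call_tool(
                "add_memory",
                {"content": "Patient John Doe, chief complaint: headache"},
            )
            result = await session.call_tool(
                "search_memory",
                {"query": "What is the patient's chief complaint?"},
            )
            print(result.content)


if __name__ == "__main__":
    asyncio.run(main())
```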

## Why This Matters

This implementation proves that MemMachine MCP can serve as a universal memory backend across different AI platforms. The implications are significant:

- **Consistent User Experience**: Users get personalized interactions regardless of which AI platform they're using
- **Simplified Development**: Developers can build multi-agent systems without worrying about memory fragmentation
- **Scalability**: Easily add more AI platforms to the ecosystem by simply connecting them to the MCP server