Merged
2 changes: 1 addition & 1 deletion openai_agents/README.md
@@ -30,9 +30,9 @@ Each directory contains a complete example with its own README for detailed instructions
- **[Tools](./tools/README.md)** - Demonstrates available tools such as file search, image generation, and others.
- **[Handoffs](./handoffs/README.md)** - Agents collaborating via handoffs.
- **[Hosted MCP](./hosted_mcp/README.md)** - Using the MCP client functionality of the OpenAI Responses API.
- **[MCP](./mcp/README.md)** - Local MCP servers (filesystem/stdio, streamable HTTP, SSE, prompt server) integrated with Temporal workflows.
- **[Model Providers](./model_providers/README.md)** - Using custom LLM providers (e.g., Anthropic via LiteLLM).
- **[Research Bot](./research_bot/README.md)** - Multi-agent research system with specialized roles: a planner agent, search agent, and writer agent working together to conduct comprehensive research.
- **[Customer Service](./customer_service/README.md)** - Interactive customer service agent with escalation capabilities, demonstrating conversational workflows.
- **[Reasoning Content](./reasoning_content/README.md)** - Example of how to retrieve the thought process of reasoning models.
- **[Financial Research Agent](./financial_research_agent/README.md)** - Multi-agent financial research system with planner, search, analyst, writer, and verifier agents collaborating.

91 changes: 91 additions & 0 deletions openai_agents/mcp/README.md
@@ -0,0 +1,91 @@
# MCP Examples

Integration with MCP (Model Context Protocol) servers using OpenAI agents in Temporal workflows.

*Adapted from [OpenAI Agents SDK MCP examples](https://github.com/openai/openai-agents-python/tree/main/examples/mcp)*

Before running these examples, be sure to review the [prerequisites and background on the integration](../README.md).


## Running the Examples

### Stdio MCP

First, start the worker:
```bash
uv run openai_agents/mcp/run_file_system_worker.py
```

Run the workflow:
```bash
uv run openai_agents/mcp/run_file_system_workflow.py
```

This sample assumes that the worker and `run_file_system_workflow.py` are on the same machine.
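Under the hood, the stdio transport launches the MCP server as a local subprocess (here via `npx`) and exchanges newline-delimited JSON-RPC messages over its stdin/stdout. A toy stdlib-only illustration of that message loop, using a trivial echo subprocess in place of the real filesystem server:

```python
import json
import subprocess
import sys

# Minimal stand-in for an MCP stdio server: reads one JSON-RPC request per
# line from stdin and echoes its params back as the result.
SERVER_CODE = (
    "import sys, json\n"
    "for line in sys.stdin:\n"
    "    req = json.loads(line)\n"
    "    resp = {'jsonrpc': '2.0', 'id': req['id'], 'result': req['params']}\n"
    "    print(json.dumps(resp), flush=True)\n"
)

proc = subprocess.Popen(
    [sys.executable, "-c", SERVER_CODE],
    stdin=subprocess.PIPE,
    stdout=subprocess.PIPE,
    text=True,
)
request = {"jsonrpc": "2.0", "id": 1, "method": "tools/list", "params": {"q": "x"}}
proc.stdin.write(json.dumps(request) + "\n")
proc.stdin.flush()
response = json.loads(proc.stdout.readline())
proc.stdin.close()
proc.wait()
print(response)
```

The real server speaks the full MCP protocol rather than a plain echo, but the transport mechanics (one process per worker, framed messages over pipes) are the same.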


### Streamable HTTP MCP

First, start the MCP server:
```bash
uv run openai_agents/mcp/servers/tools_server.py --transport=streamable-http
```

Then start the worker:
```bash
uv run openai_agents/mcp/run_streamable_http_worker.py
```

Finally, run the workflow:
```bash
uv run openai_agents/mcp/run_streamable_http_workflow.py
```

### SSE MCP

First, start the MCP server:
```bash
uv run openai_agents/mcp/servers/tools_server.py --transport=sse
```

Then start the worker:
```bash
uv run openai_agents/mcp/run_sse_worker.py
```

Finally, run the workflow:
```bash
uv run openai_agents/mcp/run_sse_workflow.py
```
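The SSE transport streams MCP messages to the client as server-sent events over HTTP. For orientation only (the `tools_server.py` above handles this internally), a minimal stdlib sketch of the SSE wire format:

```python
def parse_sse(stream: str) -> list:
    """Split a raw SSE stream into events, each a dict of field -> value."""
    events, current = [], {}
    for line in stream.splitlines():
        if line == "":  # a blank line terminates the current event
            if current:
                events.append(current)
                current = {}
        elif not line.startswith(":"):  # lines starting with ':' are comments
            field, _, value = line.partition(":")
            current[field] = value.lstrip()
    return events

raw = (
    "event: message\n"
    'data: {"jsonrpc": "2.0", "id": 1}\n'
    "\n"
    "event: message\n"
    'data: {"jsonrpc": "2.0", "id": 2}\n'
    "\n"
)
events = parse_sse(raw)
print(len(events))  # -> 2
```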

### Prompt Server MCP

First, start the MCP server:
```bash
uv run openai_agents/mcp/servers/prompt_server.py
```

Then start the worker:
```bash
uv run openai_agents/mcp/run_prompt_server_worker.py
```

Finally, run the workflow:
```bash
uv run openai_agents/mcp/run_prompt_server_workflow.py
```
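A prompt server exposes named prompt templates that a workflow can fetch, render with arguments, and hand to an agent as instructions. A toy in-process sketch of that lookup-and-render step (the template name and wording here are illustrative, not the server's actual prompts):

```python
# Hypothetical prompt registry; the real example serves prompts over MCP.
PROMPTS = {
    "code_review": (
        "Review the following {language} code for bugs and style issues:"
        "\n\n{code}"
    ),
}

def render_prompt(name: str, **arguments: str) -> str:
    """Look up a named template and fill in its arguments."""
    return PROMPTS[name].format(**arguments)

instructions = render_prompt("code_review", language="Python", code="print('hi')")
print(instructions.splitlines()[0])
```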


### Memory MCP (Research Scratchpad)

Demonstrates durable note-taking with the Memory MCP server: write seed notes, query by tags, synthesize a brief with citations, then update and delete notes.

Start the worker:
```bash
uv run openai_agents/mcp/run_memory_research_scratchpad_worker.py
```

Run the research scratchpad workflow:
```bash
uv run openai_agents/mcp/run_memory_research_scratchpad_workflow.py
```
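The note lifecycle the workflow drives through the Memory server (write, query by tag, update, delete) can be pictured with a plain in-memory stand-in. This is illustrative only; the example itself delegates storage to `@modelcontextprotocol/server-memory`:

```python
from dataclasses import dataclass, field

@dataclass
class Note:
    id: str
    text: str
    tags: set = field(default_factory=set)

class Scratchpad:
    """In-memory stand-in for the Memory MCP server's note store."""

    def __init__(self) -> None:
        self._notes = {}

    def write(self, note: Note) -> None:
        self._notes[note.id] = note

    def query(self, tag: str) -> list:
        return [n for n in self._notes.values() if tag in n.tags]

    def update(self, note_id: str, text: str) -> None:
        self._notes[note_id].text = text

    def delete(self, note_id: str) -> None:
        del self._notes[note_id]

pad = Scratchpad()
pad.write(Note("n1", "Temporal workflows are durable.", {"temporal"}))
pad.write(Note("n2", "MCP standardizes tool servers.", {"mcp"}))
matches = pad.query("mcp")          # query by tag before any mutation
pad.update("n1", "Temporal workflows survive process restarts.")
pad.delete("n2")
print([n.id for n in matches])      # -> ['n2']
```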
62 changes: 62 additions & 0 deletions openai_agents/mcp/run_file_system_worker.py
@@ -0,0 +1,62 @@
from __future__ import annotations

import asyncio
import logging
import os
from datetime import timedelta

from agents.mcp import MCPServerStdio
from temporalio.client import Client
from temporalio.contrib.openai_agents import (
ModelActivityParameters,
OpenAIAgentsPlugin,
StatelessMCPServerProvider,
)
from temporalio.worker import Worker

from openai_agents.mcp.workflows.file_system_workflow import FileSystemWorkflow


async def main():
logging.basicConfig(level=logging.INFO)
current_dir = os.path.dirname(os.path.abspath(__file__))
samples_dir = os.path.join(current_dir, "sample_files")

file_system_server = StatelessMCPServerProvider(
lambda: MCPServerStdio(
name="FileSystemServer",
params={
"command": "npx",
"args": ["-y", "@modelcontextprotocol/server-filesystem", samples_dir],
},
)
)

# Create client connected to server at the given address
client = await Client.connect(
"localhost:7233",
plugins=[
OpenAIAgentsPlugin(
model_params=ModelActivityParameters(
start_to_close_timeout=timedelta(seconds=60)
),
mcp_server_providers=[file_system_server],
),
],
)

worker = Worker(
client,
        task_queue="openai-agents-mcp-filesystem-task-queue",
workflows=[
FileSystemWorkflow,
],
activities=[
# No custom activities needed for these workflows
],
)
await worker.run()


if __name__ == "__main__":
asyncio.run(main())
29 changes: 29 additions & 0 deletions openai_agents/mcp/run_file_system_workflow.py
@@ -0,0 +1,29 @@
import asyncio

from temporalio.client import Client
from temporalio.contrib.openai_agents import OpenAIAgentsPlugin

from openai_agents.mcp.workflows.file_system_workflow import FileSystemWorkflow


async def main():
# Create client connected to server at the given address
client = await Client.connect(
"localhost:7233",
plugins=[
OpenAIAgentsPlugin(),
],
)

# Execute a workflow
result = await client.execute_workflow(
FileSystemWorkflow.run,
id="file-system-workflow",
task_queue="openai-agents-mcp-filesystem-task-queue",
)

print(f"Result: {result}")


if __name__ == "__main__":
asyncio.run(main())
59 changes: 59 additions & 0 deletions openai_agents/mcp/run_memory_research_scratchpad_worker.py
@@ -0,0 +1,59 @@
from __future__ import annotations
# Review comment (Contributor): Maybe rename to match the other examples.
# Reply (Author): Good suggestion

import asyncio
import logging
from datetime import timedelta

from agents.mcp import MCPServerStdio
from temporalio.client import Client
from temporalio.contrib.openai_agents import (
ModelActivityParameters,
OpenAIAgentsPlugin,
StatefulMCPServerProvider,
)
from temporalio.worker import Worker

from openai_agents.mcp.workflows.memory_research_scratchpad_workflow import (
MemoryResearchScratchpadWorkflow,
)


async def main():
logging.basicConfig(level=logging.INFO)

memory_server_provider = StatefulMCPServerProvider(
lambda: MCPServerStdio(
name="MemoryServer",
params={
"command": "npx",
"args": ["-y", "@modelcontextprotocol/server-memory"],
},
)
)

# Create client connected to server at the given address
client = await Client.connect(
"localhost:7233",
plugins=[
OpenAIAgentsPlugin(
model_params=ModelActivityParameters(
start_to_close_timeout=timedelta(seconds=60)
),
mcp_server_providers=[memory_server_provider],
),
],
)

worker = Worker(
client,
task_queue="openai-agents-mcp-memory-task-queue",
workflows=[
MemoryResearchScratchpadWorkflow,
],
activities=[],
)
await worker.run()


if __name__ == "__main__":
asyncio.run(main())
29 changes: 29 additions & 0 deletions openai_agents/mcp/run_memory_research_scratchpad_workflow.py
@@ -0,0 +1,29 @@
from __future__ import annotations

import asyncio

from temporalio.client import Client
from temporalio.contrib.openai_agents import OpenAIAgentsPlugin

from openai_agents.mcp.workflows.memory_research_scratchpad_workflow import (
MemoryResearchScratchpadWorkflow,
)


async def main():
client = await Client.connect(
"localhost:7233",
plugins=[OpenAIAgentsPlugin()],
)

result = await client.execute_workflow(
MemoryResearchScratchpadWorkflow.run,
id="memory-research-scratchpad-workflow",
task_queue="openai-agents-mcp-memory-task-queue",
)

print(f"Result:\n{result}")


if __name__ == "__main__":
asyncio.run(main())
64 changes: 64 additions & 0 deletions openai_agents/mcp/run_prompt_server_worker.py
@@ -0,0 +1,64 @@
from __future__ import annotations

import asyncio
import logging
from datetime import timedelta

from agents.mcp import MCPServerStreamableHttp
from temporalio.client import Client
from temporalio.contrib.openai_agents import (
ModelActivityParameters,
OpenAIAgentsPlugin,
StatelessMCPServerProvider,
)
from temporalio.worker import Worker

from openai_agents.mcp.workflows.prompt_server_workflow import PromptServerWorkflow


async def main():
logging.basicConfig(level=logging.INFO)

print("Setting up worker...\n")

try:
prompt_server_provider = StatelessMCPServerProvider(
lambda: MCPServerStreamableHttp(
name="PromptServer",
params={
"url": "http://localhost:8000/mcp",
},
)
)

# Create client connected to server at the given address
client = await Client.connect(
"localhost:7233",
plugins=[
OpenAIAgentsPlugin(
model_params=ModelActivityParameters(
start_to_close_timeout=timedelta(seconds=120)
),
mcp_server_providers=[prompt_server_provider],
),
],
)

worker = Worker(
client,
task_queue="openai-agents-mcp-prompt-task-queue",
workflows=[
PromptServerWorkflow,
],
activities=[
# No custom activities needed for these workflows
],
)
await worker.run()
except Exception as e:
print(f"Worker failed: {e}")
raise


if __name__ == "__main__":
asyncio.run(main())
29 changes: 29 additions & 0 deletions openai_agents/mcp/run_prompt_server_workflow.py
Original file line number Diff line number Diff line change
@@ -0,0 +1,29 @@
from __future__ import annotations

import asyncio

from temporalio.client import Client
from temporalio.contrib.openai_agents import OpenAIAgentsPlugin

from openai_agents.mcp.workflows.prompt_server_workflow import PromptServerWorkflow


async def main():
# Create client connected to server at the given address
client = await Client.connect(
"localhost:7233",
plugins=[OpenAIAgentsPlugin()],
)

# Execute a workflow
result = await client.execute_workflow(
PromptServerWorkflow.run,
id="prompt-server-workflow",
task_queue="openai-agents-mcp-prompt-task-queue",
)

print(f"Result: {result}")


if __name__ == "__main__":
asyncio.run(main())