Merged
4 changes: 2 additions & 2 deletions .vscode/mcp.json
@@ -6,7 +6,7 @@
"cwd": "${workspaceFolder}",
"args": [
"run",
"basic_mcp_stdio.py"
"servers/basic_mcp_stdio.py"
]
},
"expenses-mcp-http": {
@@ -25,7 +25,7 @@
"debugpy",
"--listen",
"0.0.0.0:5678",
"basic_mcp_stdio.py"
"servers/basic_mcp_stdio.py"
]
}
},
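Pieced together from the hunks above, the debug entry in `.vscode/mcp.json` looks roughly like this sketch (the `"type"` field and surrounding structure are assumptions; the command, paths, and args come from the diff):

```json
{
  "servers": {
    "expenses-mcp-debug": {
      "type": "stdio",
      "command": "uv",
      "cwd": "${workspaceFolder}",
      "args": ["run", "debugpy", "--listen", "0.0.0.0:5678", "servers/basic_mcp_stdio.py"]
    }
  }
}
```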
110 changes: 42 additions & 68 deletions README.md
@@ -9,7 +9,6 @@ A demonstration project showcasing Model Context Protocol (MCP) implementations
- [Python Scripts](#python-scripts)
- [MCP Server Configuration](#mcp-server-configuration)
- [Debugging](#debugging)
- [License](#license)

## Prerequisites

@@ -25,15 +24,15 @@ A demonstration project showcasing Model Context Protocol (MCP) implementations

1. Install dependencies using `uv`:

```bash
uv sync
```
```bash
uv sync
```

2. Copy `.env-sample` to `.env` and configure your environment variables:

```bash
cp .env-sample .env
```
```bash
cp .env-sample .env
```

3. Edit `.env` with your API credentials. Choose one of the following providers by setting `API_HOST`:
- `github` - GitHub Models (requires `GITHUB_TOKEN`)
@@ -43,14 +42,14 @@

## Python Scripts

Run any script with: `uv run <script_name>`
Run any script with: `uv run <script_path>`

- **basic_mcp_http.py** - MCP server with HTTP transport on port 8000
- **basic_mcp_stdio.py** - MCP server with stdio transport for VS Code integration
- **langchainv1_mcp_http.py** - LangChain agent with MCP integration
- **langchainv1_mcp_github.py** - LangChain tool filtering demo with GitHub MCP (requires `GITHUB_TOKEN`)
- **openai_agents_tool_filtering.py** - OpenAI Agents SDK tool filtering demo with Microsoft Learn MCP
- **agentframework_mcp_learn.py** - Microsoft Agent Framework integration with MCP
- **servers/basic_mcp_http.py** - MCP server with HTTP transport on port 8000
- **servers/basic_mcp_stdio.py** - MCP server with stdio transport for VS Code integration
- **agents/langchainv1_http.py** - LangChain agent with MCP integration
- **agents/langchainv1_github.py** - LangChain tool filtering demo with GitHub MCP (requires `GITHUB_TOKEN`)
- **agents/agentframework_learn.py** - Microsoft Agent Framework integration with MCP
- **agents/agentframework_http.py** - Microsoft Agent Framework integration with local Expenses MCP server

## MCP Server Configuration

@@ -63,22 +62,25 @@ The [MCP Inspector](https://github.com/modelcontextprotocol/inspector) is a deve
**For stdio servers:**

```bash
npx @modelcontextprotocol/inspector uv run basic_mcp_stdio.py
npx @modelcontextprotocol/inspector uv run servers/basic_mcp_stdio.py
```

**For HTTP servers:**

1. Start the HTTP server:
```bash
uv run basic_mcp_http.py
```

```bash
uv run servers/basic_mcp_http.py
```

2. In another terminal, run the inspector:
```bash
npx @modelcontextprotocol/inspector http://localhost:8000/mcp
```

```bash
npx @modelcontextprotocol/inspector http://localhost:8000/mcp
```

The inspector provides a web interface to:

- View available tools, resources, and prompts
- Test tool invocations with custom parameters
- Inspect server responses and errors
@@ -92,65 +94,37 @@ The `.vscode/mcp.json` file configures MCP servers for GitHub Copilot integratio

- **expenses-mcp**: stdio transport server for production use
- **expenses-mcp-debug**: stdio server with debugpy on port 5678
- **expenses-mcp-http**: HTTP transport server at `http://localhost:8000/mcp`
- **expenses-mcp-http-debug**: stdio server with debugpy on port 5679
- **expenses-mcp-http**: HTTP transport server at `http://localhost:8000/mcp`. You must start this server manually with `uv run servers/basic_mcp_http.py` before using it.

**Switching Servers:**

Configure which server GitHub Copilot uses by selecting it in the Chat panel selecting the tools icon.
Configure which server GitHub Copilot uses by opening the Chat panel, selecting the tools icon, and choosing the desired MCP server from the list.

## Debugging

### Debug Configurations

The `.vscode/launch.json` provides four debug configurations:

#### Launch Configurations (Start server with debugging)

1. **Launch MCP HTTP Server (Debug)**
- Directly starts `basic_mcp_http.py` with debugger attached
- Best for: Standalone testing and LangChain script debugging

2. **Launch MCP stdio Server (Debug)**
- Directly starts `basic_mcp_stdio.py` with debugger attached
- Best for: Testing stdio communication
![Servers selection dialog](readme_serverselect.png)

#### Attach Configurations (Attach to running server)
**Example input**

3. **Attach to MCP Server (stdio)** - Port 5678
- Attaches to server started via `expenses-mcp-debug` in `mcp.json`
- Best for: Debugging during GitHub Copilot Chat usage
Use a query like this to test the expenses MCP server:

4. **Attach to MCP Server (HTTP)** - Port 5679
- Attaches to server started via `expenses-mcp-http-debug` in `mcp.json`
- Best for: Debugging HTTP server during Copilot usage
```
Log expense for 50 bucks of pizza on my amex today
```

### Debugging Workflow
![Example GitHub Copilot Chat Input](readme_samplequery.png)

#### Option 1: Launch and Debug (Standalone)
## Debugging

Use this approach for debugging with MCP Inspector or LangChain scripts:
The `.vscode/launch.json` provides one debug configuration:

1. Set breakpoints in `basic_mcp_http.py` or `basic_mcp_stdio.py`
2. Press `Cmd+Shift+D` to open Run and Debug
3. Select "Launch MCP HTTP Server (Debug)" or "Launch MCP stdio Server (Debug)"
4. Press `F5` or click the green play button
5. Connect MCP Inspector or run your LangChain script to trigger breakpoints
- For HTTP: `npx @modelcontextprotocol/inspector http://localhost:8000/mcp`
- For stdio: `npx @modelcontextprotocol/inspector uv run basic_mcp_stdio.py` (start without debugger first)
**Attach to MCP Server (stdio)**: Attaches to server started via `expenses-mcp-debug` in `mcp.json`

#### Option 2: Attach to Running Server (Copilot Integration)
To debug an MCP server with GitHub Copilot Chat:

1. Set breakpoints in your MCP server file
1. Start the debug server via `mcp.json` configuration:
- Select `expenses-mcp-debug` or `expenses-mcp-http-debug`
1. Set breakpoints in the MCP server code in `servers/basic_mcp_stdio.py`
1. Start the debug server via `mcp.json` configuration by selecting `expenses-mcp-debug`
1. Press `Cmd+Shift+D` to open Run and Debug
1. Select appropriate "Attach to MCP Server" configuration
1. Press `F5` to attach
1. Select correct expense mcp server in GitHub Copilot Chat tools
1. Select "Attach to MCP Server (stdio)" configuration
1. Press `F5` or the play button to start the debugger
1. Select the expenses-mcp-debug server in GitHub Copilot Chat tools
1. Use GitHub Copilot Chat to trigger the MCP tools
1. Debugger pauses at breakpoints

## License

MIT
1. Debugger pauses at breakpoints
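The attach configuration in `.vscode/launch.json` typically looks like the following sketch (field values are assumed from the port mentioned above, not copied from the repo):

```json
{
  "name": "Attach to MCP Server (stdio)",
  "type": "debugpy",
  "request": "attach",
  "connect": { "host": "localhost", "port": 5678 }
}
```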
83 changes: 83 additions & 0 deletions agents/agentframework_http.py
@@ -0,0 +1,83 @@
from __future__ import annotations

import asyncio
import logging
import os

from azure.identity import DefaultAzureCredential
from dotenv import load_dotenv
from rich import print
from rich.logging import RichHandler

from agent_framework import ChatAgent, MCPStreamableHTTPTool
from agent_framework.azure import AzureOpenAIChatClient
from agent_framework.openai import OpenAIChatClient

# Configure logging
logging.basicConfig(
level=logging.WARNING,
format="%(message)s",
datefmt="[%X]",
handlers=[RichHandler()]
)
logger = logging.getLogger("agentframework_mcp_http")

# Load environment variables
load_dotenv(override=True)

# Constants
MCP_SERVER_URL = "http://localhost:8000/mcp/"

# Configure chat client based on API_HOST
API_HOST = os.getenv("API_HOST", "github")

if API_HOST == "azure":
client = AzureOpenAIChatClient(
credential=DefaultAzureCredential(),
deployment_name=os.environ.get("AZURE_OPENAI_CHAT_DEPLOYMENT"),
endpoint=os.environ.get("AZURE_OPENAI_ENDPOINT"),
api_version=os.environ.get("AZURE_OPENAI_VERSION"),
)
elif API_HOST == "github":
client = OpenAIChatClient(
base_url="https://models.github.ai/inference",
api_key=os.environ["GITHUB_TOKEN"],
model_id=os.getenv("GITHUB_MODEL", "openai/gpt-4o"),
)
elif API_HOST == "ollama":
client = OpenAIChatClient(
base_url=os.environ.get("OLLAMA_ENDPOINT", "http://localhost:11434/v1"),
api_key="none",
model_id=os.environ.get("OLLAMA_MODEL", "llama3.1:latest"),
)
else:
client = OpenAIChatClient(
api_key=os.environ.get("OPENAI_API_KEY"), model_id=os.environ.get("OPENAI_MODEL", "gpt-4o")
)


async def http_mcp_example() -> None:
"""
Demonstrate MCP integration with the local Expenses MCP server.

Creates an agent that can help users log expenses
using the Expenses MCP server at http://localhost:8000/mcp/.
"""
async with (
MCPStreamableHTTPTool(
name="Expenses MCP Server",
url=MCP_SERVER_URL
) as mcp_server,
ChatAgent(
chat_client=client,
name="Expenses Agent",
instructions="You help users to log expenses.",
) as agent,
):
user_query = "yesterday I bought a laptop for $1200 using my visa."
result = await agent.run(user_query, tools=mcp_server)
print(result)


if __name__ == "__main__":
asyncio.run(http_mcp_example())
File renamed without changes.
File renamed without changes.
File renamed without changes.
Binary file added readme_samplequery.png
Binary file added readme_serverselect.png
File renamed without changes.
File renamed without changes.
1 change: 0 additions & 1 deletion expenses.csv → servers/expenses.csv
@@ -19,4 +19,3 @@ date,amount,category,description,payment_method
2025-08-27,50.0,gadget,phone case,AMEX
2025-10-25,50.0,shopping,stuff,AMEX
2025-10-21,1200.0,gadget,Laptop purchase,VISA
2025-10-21,1200.0,gadget,Laptop purchase,VISA