---
title: Deploy an MCP server on a Raspberry Pi 5 and interact with it using an AI agent

minutes_to_complete: 30

who_is_this_for: This Learning Path is for LLM and IoT developers who are already familiar with Large Language Model (LLM) concepts and networking. It walks you through deploying a lightweight Model Context Protocol (MCP) server on a Raspberry Pi 5 and shows you how to interact with it using the OpenAI Agents SDK.

learning_objectives:
- Deploy a lightweight Model Context Protocol (MCP) server on a Raspberry Pi 5
- Design and register custom tools for an AI agent
- Create custom endpoints
- Use uv, a fast and efficient Python package manager

prerequisites:
- A Raspberry Pi 5
- Basic understanding of Python and prompt engineering
- An understanding of LLM and AI agent fundamentals

author: Andrew Choi

skilllevels: Introductory
subjects: ML
armips:
- Cortex-A76
tools_software_languages:
- Python
- IoT
- AI
- MCP

operatingsystems:
- Linux

further_reading:
- resource:
title: fastmcp
link: https://github.com/jlowin/fastmcp
type: documentation
- resource:
title: OpenAI Agents SDK
link: https://openai.github.io/openai-agents-python/
type: documentation


### FIXED, DO NOT MODIFY
# ================================================================================
weight: 1 # _index.md always has weight of 1 to order correctly
layout: "learningpathall" # All files under learning paths have this same wrapper
learning_path_main_page: "yes" # This should be surfaced when looking for related content. Only set for _index.md of learning path content.
---
---
# ================================================================================
# FIXED, DO NOT MODIFY THIS FILE
# ================================================================================
weight: 21 # Set to always be larger than the content in this path to be at the end of the navigation.
title: "Next Steps" # Always the same, html page title.
layout: "learningpathall" # All files under learning paths have this same wrapper for Hugo processing.
---
---
title: Introduction to Model Context Protocol and uv
weight: 2

### FIXED, DO NOT MODIFY
layout: learningpathall
---

## Model Context Protocol (MCP)

The **Model Context Protocol (MCP)** is an open specification for wiring Large-Language-Model (LLM) agents to the *context* they need — whether that context is a database, a local sensor, or a SaaS API.
Think of it as USB-C for AI: once a tool or data source speaks MCP, any compliant LLM client can “plug in” and start using it immediately.

### Why use MCP?
- **Plug-and-play integrations:** A growing catalog of pre-built MCP servers (filesystem, shell, vector stores, web-scraping, etc.) gives your agent instant superpowers with zero custom glue code.

- **Model/vendor agnostic:** Because the protocol lives outside the model, you can swap GPT-4, Claude, or your own fine-tuned model without touching the integration layer.

- **Security by design:** MCP encourages running servers inside your own infrastructure, so sensitive data never leaves the perimeter unless you choose.

- **Cross-ecosystem momentum:** Recent rollouts, from an official C# SDK to Wix’s production MCP server and Microsoft’s Azure support, show the spec is gathering real-world traction.

### High-level architecture
![mcp server](./mcp.png)
- **MCP Host:** the LLM-powered application (Claude Desktop, an IDE plugin, OpenAI Agents SDK, etc.).
- **MCP Client:** the runtime shim that keeps a 1-to-1 connection with each server.
- **MCP Server:** a lightweight process that advertises tools (functions) over MCP.
- **Local data sources:** files, databases, or sensors your server can read directly.
- **Remote services:** external APIs the server can call on the host’s behalf.
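
To make the architecture concrete, here is a minimal sketch of the server side of that diagram. It uses the `FastMCP` class that this Learning Path installs in a later section; the tool name and return value are illustrative placeholders:

```python
# Minimal sketch of an MCP server advertising a single tool.
# Any compliant MCP client (Claude Desktop, OpenAI Agents SDK, ...) that
# connects can discover and call this tool without custom glue code.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("Sketch Server")  # the "MCP Server" box in the diagram

@mcp.tool()
def read_sensor() -> float:
    """Illustrative tool: return a reading from a local data source."""
    return 21.5  # placeholder standing in for a real sensor read

if __name__ == "__main__":
    mcp.run(transport="sse")  # expose tools over Server-Sent Events
```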

{{% notice Note %}}
Learn more about AI Agents in the [AI Agent on CPU learning path](https://learn.arm.com/learning-paths/servers-and-cloud-computing/ai-agent-on-cpu/).
{{% /notice %}}

## uv: The Fast, All-in-One Python Package Manager

**uv** is a next-generation, Rust-based package manager that unifies pip, virtualenv, Poetry, and more—offering 10×–100× faster installs, built-in virtual environment handling, robust lockfiles, and full compatibility with the Python ecosystem.

### Install uv
- macOS / Linux
```bash
curl -LsSf https://astral.sh/uv/install.sh | sh
```
- Windows
```powershell
powershell -ExecutionPolicy ByPass -c "irm https://astral.sh/uv/install.ps1 | iex"
```

### Initialize a Project
1. Create & enter your project folder:
```bash
mkdir my-project && cd my-project
```
2. Run
```bash
uv init
```

This scaffolds:
- `.venv/` (auto-created virtual environment)
- `pyproject.toml` (project metadata & dependencies)
- `.python-version` (pinned interpreter)
- `README.md`, `.gitignore`, and a sample `main.py`
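
The generated `main.py` is a simple placeholder you can execute right away with `uv run main.py`. Its exact contents vary between uv versions, but it looks roughly like this:

```python
# Sample entry point generated by `uv init` (contents may vary by version)
def main():
    print("Hello from my-project!")


if __name__ == "__main__":
    main()
```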

### Install Dependencies
- Add one or more packages to your project:
```bash
uv add requests numpy pandas
```
> Updates both `pyproject.toml` and the lockfile (`uv.lock`)

- Remove a package (and its unused sub-deps):
```bash
uv remove numpy
```

- Install from an existing `requirements.txt` (e.g. when migrating):
```bash
uv pip install -r requirements.txt
```

All installs happen inside your project’s `.venv`, and uv’s lockfile guarantees repeatable environments.
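
To confirm that the dependencies resolve inside `.venv`, you can add a small check script; the file name `check.py` is just an example. Run it with `uv run check.py`:

```python
# check.py - sanity-check that project dependencies import from .venv
import requests
import pandas

print("requests:", requests.__version__)
print("pandas:", pandas.__version__)
```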
---
title: Build & Run an AI Agent on Your Workstation
weight: 4

### FIXED, DO NOT MODIFY
layout: learningpathall
---

### Create an AI Agent and point it at your Pi's MCP Server
1. Bootstrap the Agent Project
```bash
# create & enter folder
mkdir mcp-agent && cd mcp-agent
```
2. Scaffold with **uv**:
```bash
uv init
```
3. Install the **OpenAI Agents SDK** and **python-dotenv**:
```bash
uv add openai-agents python-dotenv
```
4. Create a `.env` file with your OpenAI key:
```bash
echo -n "OPENAI_API_KEY=<YOUR_OPENAI_API_KEY>" > .env
```
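
Before writing the agent itself, you can verify the key loads from `.env`. This snippet (a hypothetical `check_env.py`) uses the same `python-dotenv` call the client will use:

```python
# check_env.py - verify the OpenAI key is readable from .env
import os
from dotenv import load_dotenv

load_dotenv()
key = os.getenv("OPENAI_API_KEY")
assert key, "OPENAI_API_KEY not found - check your .env file"
print("Key loaded, starts with:", key[:7] + "...")
```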

### Write the Agent Client (`main.py`)
```python
import asyncio, os
from dotenv import load_dotenv

# disable Agents SDK tracing for cleaner output
os.environ["OPENAI_AGENTS_DISABLE_TRACING"] = "1"
load_dotenv()

from agents import Agent, Runner, set_default_openai_key
from agents.mcp import MCPServer, MCPServerSse
from agents.model_settings import ModelSettings

async def run(mcp_servers: list[MCPServer]):
set_default_openai_key(os.getenv("OPENAI_API_KEY"))

agent = Agent(
model="gpt-4.1-2025-04-14",
name="Assistant",
instructions="Use the tools to answer the user's query",
        mcp_servers=mcp_servers,
model_settings=ModelSettings(tool_choice="required"),
)

for message in ["What is the CPU temperature?", "How is the weather in Cambridge?"]:
print(f"Running: {message}")
result = await Runner.run(starting_agent=agent, input=message)
print(f"Response: {result.final_output}")

async def main():
# replace URL with your ngrok-generated endpoint
url = "<YOUR_NGROK_URL>/sse"

async with MCPServerSse(
name="RPI5 MCP Server",
params={"url": url},
client_session_timeout_seconds=60,
) as server1:
await run([server1])

if __name__ == "__main__":
asyncio.run(main())
```

### Execute the Agent
```bash
uv run main.py
```
You should see output like:
```output
Running: What is the CPU temperature?
Response: The current CPU temperature is 48.8°C.
Running: How is the weather in Cambridge?
Response: The weather in Cambridge is currently partly cloudy with a temperature of around 10°C. The wind is blowing at approximately 17 km/h. Visibility is good at 10 km, and there is no precipitation expected at the moment. The weather should be pleasant throughout the day with temperatures rising to about 15°C by midday.
```

Congratulations! Your local AI Agent just called the MCP server on your Raspberry Pi and fetched the CPU temperature and the weather information.

This lightweight protocol isn’t just a game-changer for LLM developers; it also empowers IoT engineers to expose real-world data streams and give AI direct, reliable control over any connected device.

### Next Steps
- **Expand Your Toolset**
  - Write additional `@mcp.tool()` functions for Pi peripherals (GPIO pins, camera, I²C sensors, etc.); see the sketch after this list
  - Combine multiple MCP servers (e.g. filesystem, web-scraper, vector-store memory) for richer context

- **Integrate with IoT Platforms**
  - Hook into Home Assistant or Node-RED via MCP
  - Trigger real-world actions (turn on LEDs, read environmental sensors, control relays)
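
As a starting point for the GPIO idea above, here is a hedged sketch of one extra tool you could append to the `server.py` from the previous section (it reuses that file’s `mcp` instance). It assumes an LED wired to BCM pin 17 and the gpiozero library that ships with Raspberry Pi OS; adjust both to match your hardware:

```python
# Sketch only: an additional tool for server.py, assuming gpiozero is
# available and an LED is wired to BCM pin 17.
from gpiozero import LED

led = LED(17)  # adjust the pin number to match your wiring

@mcp.tool()
def set_led(on: bool) -> str:
    """Turn the LED on or off and report the new state."""
    if on:
        led.on()
    else:
        led.off()
    return f"LED is now {'on' if on else 'off'}"
```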
---
title: Set Up an MCP Server on Your Raspberry Pi
weight: 3

### FIXED, DO NOT MODIFY
layout: learningpathall
---

## Expose Raspberry Pi MCP Server via ngrok

This guide shows you how to:

1. Install **uv** (the Rust-powered Python manager)
2. Bootstrap a simple **MCP** server on your Raspberry Pi that reads the CPU temperature and fetches current weather data
3. Expose it to the internet with **ngrok**

### Prerequisites

- A **Raspberry Pi 5** (or another Armv8-based Raspberry Pi) running Raspberry Pi OS (64-bit)
- Basic familiarity with Python and the terminal


#### 1. Install uv
In a terminal on the Raspberry Pi:
```bash
curl -LsSf https://astral.sh/uv/install.sh | sh
```

{{% notice Note %}}
After the script finishes, restart your terminal so that the uv command is on your PATH.
{{% /notice %}}

#### 2. Bootstrap the MCP Project
1. Create a project directory and enter it:
```bash
mkdir mcp
cd mcp
```
2. Initialize with uv (this creates pyproject.toml, .venv/, etc.):
```bash
uv init
```
3. Install the dependencies:
```bash
uv pip install fastmcp==2.2.10
uv add requests
```

#### 3. Write Your MCP Server (`server.py`)
1. Create the server file:
```bash
touch server.py
```
2. Edit `server.py` with the following contents:
```python
import subprocess, re
import requests
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("RaspberryPi MCP Server")

@mcp.tool()
def read_temp():
"""
Description: Raspberry Pi's CPU core temperature
Return: Temperature in °C (float)
"""
print(f"[debug-server] read_temp()")

out = subprocess.check_output(["vcgencmd", "measure_temp"]).decode()
temp_c = float(re.search(r"[-\d.]+", out).group())
return temp_c

@mcp.tool()
def get_current_weather(city: str) -> str:
"""
Description: Get Current Weather data in the {city}
Args:
city: Name of the city
Return: Current weather data of the city
"""
print(f"[debug-server] get_current_weather({city})")

endpoint = "https://wttr.in"
response = requests.get(f"{endpoint}/{city}")
return response.text

if __name__ == "__main__":
mcp.run(transport="sse")
```

#### 4. Run the MCP Server
```bash
uv run server.py
```
By default, FastMCP will listen on port **8000** and serve your tools via **Server-Sent Events (SSE)**.
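
Before exposing the server to the internet, you can smoke-test it from a second terminal on the Pi. This sketch assumes the client API shipped with fastmcp 2.x; run it with `uv run test_client.py` while `server.py` is still running:

```python
# test_client.py - connect to the local MCP server over SSE and call a tool
import asyncio
from fastmcp import Client

async def main():
    async with Client("http://localhost:8000/sse") as client:
        tools = await client.list_tools()
        print("Advertised tools:", [t.name for t in tools])
        result = await client.call_tool("read_temp", {})
        print("read_temp ->", result)

asyncio.run(main())
```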

#### 5. Install & Configure ngrok
1. Add ngrok’s APT repo and install:
```bash
curl -sSL https://ngrok-agent.s3.amazonaws.com/ngrok.asc \
| sudo tee /etc/apt/trusted.gpg.d/ngrok.asc >/dev/null \
&& echo "deb https://ngrok-agent.s3.amazonaws.com buster main" \
| sudo tee /etc/apt/sources.list.d/ngrok.list \
&& sudo apt update \
&& sudo apt install ngrok
```
2. Authenticate your account:
```bash
ngrok config add-authtoken <YOUR_NGROK_AUTHTOKEN>
```
3. Expose port 8000:
```bash
ngrok http 8000
```
4. Copy the generated HTTPS URL (e.g. `https://abcd1234.ngrok-free.app`); you’ll use this as your MCP endpoint.
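
To check the tunnel end to end, a short probe script (a hypothetical `probe.py`) can confirm the SSE endpoint answers through ngrok; it should print an HTTP 200 status followed by the first event line:

```python
# probe.py - confirm the ngrok tunnel reaches the Pi's SSE endpoint
import requests

url = "<YOUR_NGROK_URL>/sse"  # paste your ngrok HTTPS URL here
with requests.get(url, stream=True, timeout=10) as resp:
    print("HTTP status:", resp.status_code)  # expect 200
    for line in resp.iter_lines():
        if line:
            print("First event line:", line.decode())
            break
```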