27 changes: 12 additions & 15 deletions content/learning-paths/cross-platform/mcp-ai-agent/_index.md
@@ -1,35 +1,32 @@
---
title: Deploy an MCP server on a Raspberry Pi 5 and interact with it using an AI agent
title: Deploy an MCP server on Raspberry Pi 5 for AI agent interaction using the OpenAI SDK

draft: true
cascade:
draft: true

minutes_to_complete: 30

who_is_this_for: This Learning Path targets LLM and IoT developers who are familiar with Large Language Model (LLM) concepts and networking. You will learn how to deploy a lightweight Model Context Protocol (MCP) server on a Raspberry Pi 5 and interact with it via the OpenAI-Agent SDK.
who_is_this_for: This Learning Path is for LLM and IoT developers who want to run and interact with AI agents on edge devices like the Raspberry Pi 5. You'll learn how to deploy a lightweight Model Context Protocol (MCP) server and use the OpenAI Agent SDK to create and register tools for intelligent local inference.

learning_objectives:
- Deploy a lightweight Model Context Protocol (MCP) server on Raspberry Pi 5
- Design and register custom tools for the AI Agent
- Create custom endpoints
- Learn about uv — a fast, efficient Python package manager
- Deploy a lightweight Model Context Protocol (MCP) server on Raspberry Pi 5 for local AI agent execution.
- Use the OpenAI Agent SDK to interact with a local AI agent.
- Design and register custom tools for agent tasks.
- Learn about uv — a fast, efficient Python package manager for streamlined local deployment.

prerequisites:
- A [Raspberry Pi 5](https://www.raspberrypi.com/products/raspberry-pi-5/)
- Basic understanding of Python and prompt engineering.
- Understanding of LLM and AI Agent fundamentals
- A [Raspberry Pi 5](https://www.raspberrypi.com/products/raspberry-pi-5/) with a Linux-based OS installed.
- Familiarity with Python programming and prompt engineering techniques.
- Basic understanding of Large Language Models (LLMs) and how they are used in local inference.
- Understanding of AI agents and the OpenAI Agent SDK (or similar frameworks).

author: Andrew Choi

skilllevels: Introductory
subjects: ML
armips:
- Cortex-A76
- Cortex-A
tools_software_languages:
- Python
- IoT
- AI
- Raspberry Pi
- MCP

operatingsystems:
@@ -1,5 +1,5 @@
---
title: Introduction to Model Context Protocol and uv
title: Introduction to Model Context Protocol (MCP) and the uv Python package manager for local AI agents
weight: 2

### FIXED, DO NOT MODIFY
@@ -8,27 +8,50 @@ layout: learningpathall

## Model Context Protocol (MCP)

The **Model Context Protocol (MCP)** is an open specification for wiring Large-Language-Model (LLM) agents to the *context* they need — whether that context is a database, a local sensor, or a SaaS API.
Think of it as USB-C for AI: once a tool or data source speaks MCP, any compliant LLM client can “plug in” and start using it immediately.
The Model Context Protocol (MCP) is an open specification designed to connect Large Language Model (LLM) agents to the context they need — including local sensors, databases, and SaaS APIs. It enables on-device AI agents to interact with real-world data through a plug-and-play protocol that works with any LLM framework, including the OpenAI Agent SDK.

### Why use MCP?
- **Plug-and-play integrations:** A growing catalog of pre-built MCP servers (filesystem, shell, vector stores, web-scraping, etc.) gives your agent instant super-powers with zero custom glue code.
- **Plug-and-play integrations:** a growing catalog of pre-built MCP servers (such as filesystem, shell, vector stores, and web-scraping) gives your agent instant superpowers, with no custom integration or glue code required.

- **Model/vendor agnostic:** Because the protocol lives outside the model, you can swap models like GPT-4, Claude, or your own fine-tuned model without touching the integration layer.
- **Model/vendor agnostic:** as the protocol lives outside the model, you can swap models like GPT-4, Claude, or your own fine-tuned model without touching the integration layer.

- **Security by design:** MCP encourages running servers inside your own infrastructure, so sensitive data never leaves the perimeter unless you choose.
- **Security by design:** MCP encourages running servers inside your own infrastructure, so sensitive data stays within your perimeter unless explicitly shared.

- **Cross-ecosystem momentum:** Recent roll-outs—from an official C# SDK to Wix’s production MCP server and Microsoft’s Azure support—show the MCP spec is gathering real-world traction.
- **Cross-ecosystem momentum:** recent roll-outs from an official C# SDK to Wix’s production MCP server and Microsoft’s Azure support show the MCP spec is gathering real-world traction.

### High-level architecture
![mcp server](./mcp.png)
- **MCP Host:** the LLM-powered application (Claude Desktop, an IDE plugin, OpenAI Agents SDK, etc.).
- **MCP Client:** the runtime shim that keeps a 1-to-1 connection with each server.
- **MCP Server:** a lightweight process that advertises tools (functions) over MCP.
- **Local data sources:** files, databases, or sensors your server can read directly.
- **Remote services:** external APIs the server can call on the host’s behalf.
## What is uv?

{{% notice Note %}}
Learn more about AI Agents in the [AI Agent on CPU learning path](https://learn.arm.com/learning-paths/servers-and-cloud-computing/ai-agent-on-cpu/).
`uv` is a fast, Rust-built Python package manager that simplifies dependency management. It's designed for speed and reliability, making it ideal for setting up local AI agent environments on constrained or embedded devices like the Raspberry Pi 5.

Some key features:
- Built in Rust for performance.
- Resolves dependencies and installs packages in one step.
- Well suited to local LLM workloads, embedded AI systems, and containerized Python environments.

For further information on `uv`, see: [https://github.com/astral-sh/uv](https://github.com/astral-sh/uv).


## A high-level view of the architecture

![Diagram of Model Context Protocol (MCP) architecture showing the interaction between MCP Host (LLM-powered app), MCP Client (runtime shim), and MCP Server, which connects to local data sources (files, sensors, databases) and remote APIs for AI agent context retrieval.](./mcp.png)

*Figure: High-level architecture of the Model Context Protocol (MCP) for local AI agent integration with real-world data sources.*

Each component in the diagram plays a distinct role in enabling AI agents to interact with real-world context:

- The **MCP Host** is the LLM-powered application (such as Claude Desktop, an IDE plugin, or an application built with the OpenAI Agents SDK).
- The **MCP Client** is the runtime shim that keeps a 1-to-1 connection with each server.
- The **MCP Server** is a lightweight process that advertises tools (functions) over MCP; see the sketch after this list.
- The **Local data sources** are files, databases, or sensors your server can read directly.
- The **Remote services** are external APIs the server can call on the host’s behalf.
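
To make these roles concrete, here is a minimal sketch of an MCP server advertising a single tool, using the FastMCP library that appears later in this Learning Path. The tool body and its sensor reading are illustrative assumptions, not part of this Learning Path's code:

```python
from mcp.server.fastmcp import FastMCP

# The MCP Server: a lightweight process that advertises tools over MCP
mcp = FastMCP("demo-server")

@mcp.tool()
def room_temperature() -> float:
    """A local data source: return a (hypothetical) sensor reading in °C."""
    return 21.5

if __name__ == "__main__":
    # Serve over SSE so any MCP client (and its host application) can connect
    mcp.run(transport="sse")
```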

{{% notice Learning Tip %}}
Learn more about AI Agents in the Learning Path [Deploy an AI Agent on Arm with llama.cpp and llama-cpp-agent using KleidiAI](https://learn.arm.com/learning-paths/servers-and-cloud-computing/ai-agent-on-cpu/).
{{% /notice %}}

## Section summary

This page introduces MCP and `uv` as foundational tools for building fast, secure, and modular AI agents that run efficiently on edge devices like the Raspberry Pi 5.



53 changes: 32 additions & 21 deletions content/learning-paths/cross-platform/mcp-ai-agent/mcp-client.md
@@ -1,16 +1,17 @@
---
title: Build & Run an AI Agent on your development machine
title: Build and run an AI agent on your development machine
weight: 4

### FIXED, DO NOT MODIFY
layout: learningpathall
---

In this section you will learn how to setup an AI Agent on your development machine. You will then connect your MCP server running on the Raspberry Pi 5 to it.
In this section, you'll learn how to set up an AI Agent on your development machine. You will then connect your MCP server running on the Raspberry Pi 5 to it.

These commands were tested on an Linux Arm development machine.
These commands were tested on a Linux Arm development machine.

## Create an AI Agent and point it at your Pi's MCP Server

### Create an AI Agent and point it at your Pi's MCP Server
1. Install `uv` on your development machine:

```bash
@@ -20,27 +21,28 @@ curl -LsSf https://astral.sh/uv/install.sh | sh
```bash
mkdir mcp-agent && cd mcp-agent
```
3. Setup the directory to use `uv`:
3. Set up the directory to use `uv`:
```bash
uv init
```

This command adds:
- .venv/ (auto-created virtual environment)
- pyproject.toml (project metadata & dependencies)
- .python-version (pinned interpreter)
- README.md, .gitignore, and a sample main.py
- .venv/ (auto-created virtual environment).
- pyproject.toml (project metadata and dependencies).
- .python-version (pinned interpreter).
- README.md, .gitignore, and a sample main.py.

4. Install **OpenAI Agents SDK** + **dotenv**
4. Install **OpenAI Agents SDK** + **dotenv**:
```bash
uv add openai-agents python-dotenv
```
5. Create a `.env` file with your OpenAI key:
5. Create a `.env` file to securely store your OpenAI API key:

```bash
echo -n "OPENAI_API_KEY=<YOUR_OPENAI_API_KEY>" > .env
```

### Write the Python script for the Agent Client
## Write the Python script for the Agent Client

Use a file editor of your choice and replace the content of the sample `main.py` with the content shown below:

@@ -87,15 +89,18 @@ if __name__ == "__main__":
asyncio.run(main())
```
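
The diff above shows only the tail of the script. For orientation, a minimal client along these lines, assuming the openai-agents SDK's `MCPServerSse` helper and a placeholder ngrok URL (substitute the endpoint from your own server setup), might look like this:

```python
import asyncio

from dotenv import load_dotenv
from agents import Agent, Runner
from agents.mcp import MCPServerSse

load_dotenv()  # makes OPENAI_API_KEY from .env visible to the SDK

async def main():
    # Placeholder URL: replace with the HTTPS endpoint ngrok printed for your Pi
    async with MCPServerSse(params={"url": "https://abcd1234.ngrok-free.app/sse"}) as server:
        agent = Agent(
            name="pi-agent",
            instructions="Answer questions using the Raspberry Pi's MCP tools.",
            mcp_servers=[server],
        )
        query = "What is the CPU temperature?"
        print(f"Running: {query}")
        result = await Runner.run(agent, query)
        print(f"Response: {result.final_output}")

if __name__ == "__main__":
    asyncio.run(main())
```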

### Execute the Agent
## Execute the Agent

You’re now ready to run the AI Agent and test its connection to your running MCP server on the Raspberry Pi 5.

You are now ready to the run the agent and test it with your running MCP server:
Run the `main.py` Python script:

Run the `main.py` python script:
```bash
uv run main.py
```
The output should look like:

The output should look something like this:

```output
Running: What is the CPU temperature?
Response: The current CPU temperature is 48.8°C.
@@ -107,11 +112,17 @@ Congratulations! Your local AI Agent just called the MCP server on your Raspberr

This lightweight protocol isn’t just a game-changer for LLM developers—it also empowers IoT engineers to transform real-world data streams and give AI direct, reliable control over any connected device.

### Next Steps
## Next Steps

- **Expand Your Toolset**
- Write additional `@mcp.tool()` functions for Pi peripherals (GPIO pins, camera, I²C sensors, etc.)
- Combine multiple MCP servers (e.g. filesystem, web-scraper, vector-store memory) for richer context
- Write additional `@mcp.tool()` functions for Pi peripherals (such as GPIO pins, camera, and I²C sensors); a sketch follows this list.
- Combine multiple MCP servers (for example, filesystem, web-scraper, and vector-store memory) for richer context.

- **Integrate with IoT Platforms**
- Hook into Home Assistant or Node-RED via MCP
- Trigger real-world actions (turn on LEDs, read environmental sensors, control relays)
- Hook into Home Assistant or Node-RED through MCP.
- Trigger real-world actions (for example, turn on LEDs, read environmental sensors, and control relays).
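
As a starting point for the peripheral tools mentioned above, a hypothetical GPIO tool might look like the following sketch. The `gpiozero` dependency and the LED on pin 17 are assumptions for illustration, not part of this Learning Path's code:

```python
from gpiozero import LED  # assumption: gpiozero is installed on the Pi
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("pi-peripherals")
status_led = LED(17)  # hypothetical LED wired to GPIO pin 17

@mcp.tool()
def set_status_led(on: bool) -> str:
    """Turn the status LED on or off and report the resulting state."""
    if on:
        status_led.on()
    else:
        status_led.off()
    return f"LED is now {'on' if on else 'off'}"
```

Registered this way, the tool is advertised automatically when the agent lists the server's capabilities, so no client-side changes are needed.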

## Section summary

You’ve built and run an AI agent on your development machine that connects to an MCP server on your Raspberry Pi 5. The agent can now interact with real-world data sources in real time — a complete edge-to-cloud loop powered by OpenAI’s Agent SDK and MCP.

47 changes: 26 additions & 21 deletions content/learning-paths/cross-platform/mcp-ai-agent/mcp-server.md
@@ -1,62 +1,62 @@
---
title: Set Up an MCP Server on Your Raspberry Pi
title: Set up an MCP server on Raspberry Pi 5
weight: 3

### FIXED, DO NOT MODIFY
layout: learningpathall
---

## Setup an MCP Server on Raspberry Pi 5
## Set up a FastMCP server on Raspberry Pi 5 with uv and ngrok

In this section you will learn how to:

1. Install uv (the Rust-powered Python package manager)
2. Bootstrap a simple MCP server on your Raspberry Pi 5 that reads the CPU temperature and searches the weather data
3. Expose the MCP server to the internet with **ngrok**
1. Install uv (the Rust-powered Python package manager).
2. Bootstrap a simple MCP server on your Raspberry Pi 5 that reads the CPU temperature and searches the weather data.
3. Expose the local MCP server to the internet using ngrok (an HTTPS tunneling service).

You will run all the commands shown below on your Raspberry Pi 5 running Raspberry Pi OS (64-bit)
You will run all the commands shown below on your Raspberry Pi 5 running Raspberry Pi OS (64-bit).

#### 1. Install uv
On Raspberry Pi Terminal, install `uv`:
In your Raspberry Pi Terminal, install `uv`:
```bash
curl -LsSf https://astral.sh/uv/install.sh | sh
```

**uv** is a next-generation, Rust-based package manager that unifies pip, virtualenv, Poetry, and more—offering 10×–100× faster installs, built-in virtual environment handling, robust lockfiles, and full compatibility with the Python ecosystem.
`uv` is a Rust-based, next-generation Python package manager that replaces tools like `pip`, `virtualenv`, and Poetry. It delivers 10×–100× faster installs along with built-in virtual environments, lockfile support, and full Python ecosystem compatibility.

{{% notice Note %}}
After the script finishes, restart your terminal so that the uv command is on your PATH.
{{% /notice %}}

#### 2. Bootstrap the MCP Project
1. Create a project directory and enter it:
1. Create a project directory and navigate to it:
```bash
mkdir mcp
cd mcp
```
2. Initialize with `uv`:
2. Initialize the project with `uv`:
```bash
uv init
```
This command adds:
- .venv/ (auto-created virtual environment)
- pyproject.toml (project metadata & dependencies)
- .python-version (pinned interpreter)
- .venv/ (auto-created virtual environment).
- pyproject.toml (project metadata and dependencies).
- .python-version (pinned interpreter).
- README.md, .gitignore, and a sample main.py.

3. Install the dependencies:
3. Install the dependencies (learn more about [FastMCP](https://github.com/jlowin/fastmcp)):

```bash
uv pip install fastmcp==2.2.10
uv add requests
```

#### 3. Build your MCP Server
1. Create a python file for your MCP server named `server.py`:
1. Create a Python file for your MCP server named `server.py`:
```bash
touch server.py
```
2. Use a file editor of your choice and copy the following content into `server.py`:
2. Open `server.py` in your preferred text editor and paste in the following code:
```python
import subprocess, re
from mcp.server.fastmcp import FastMCP
@@ -95,12 +95,12 @@
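
The diff elides the body of `server.py`. As a reference point, tools of the kind this Learning Path describes (CPU temperature and a weather lookup) could be sketched as follows; the `vcgencmd` parsing and the use of the public Open-Meteo API are assumptions rather than the Learning Path's exact code:

```python
import re
import subprocess

import requests
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("raspberry-pi")

@mcp.tool()
def cpu_temperature() -> float:
    """Read the SoC temperature by parsing vcgencmd output such as "temp=48.8'C"."""
    out = subprocess.check_output(["vcgencmd", "measure_temp"], text=True)
    match = re.search(r"temp=([\d.]+)", out)
    if match is None:
        raise RuntimeError(f"Unexpected vcgencmd output: {out!r}")
    return float(match.group(1))

@mcp.tool()
def current_weather(city: str) -> str:
    """Look up current weather for a city (hypothetical Open-Meteo-based helper)."""
    geo = requests.get(
        "https://geocoding-api.open-meteo.com/v1/search",
        params={"name": city, "count": 1},
        timeout=10,
    ).json()["results"][0]
    now = requests.get(
        "https://api.open-meteo.com/v1/forecast",
        params={
            "latitude": geo["latitude"],
            "longitude": geo["longitude"],
            "current_weather": "true",
        },
        timeout=10,
    ).json()["current_weather"]
    return f"{city}: {now['temperature']}°C, wind {now['windspeed']} km/h"

if __name__ == "__main__":
    mcp.run(transport="sse")
```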

#### 4. Run the MCP Server

Run the python script to deploy the MCP server:
Run the Python script to deploy the MCP server:

```bash
uv run server.py
```
By default, FastMCP will listen on port 8000 and serve your tools via Server-Sent Events (SSE).
By default, FastMCP listens on port 8000 and exposes your registered tools over HTTP using Server-Sent Events (SSE).

The output should look like:

@@ -111,7 +111,7 @@ INFO: Application startup complete.
INFO: Uvicorn running on http://0.0.0.0:8000 (Press CTRL+C to quit)
```

#### 5. Install & Configure ngrok
#### 5. Install and configure ngrok

You will now use ngrok to expose your locally running MCP server to the public internet over HTTPS.

@@ -136,4 +136,9 @@ Replace `YOUR_NGROK_AUTHTOKEN` with your token from the ngrok dashboard.
```bash
ngrok http 8000
```
4. Copy the generated HTTPS URL (e.g. `https://abcd1234.ngrok-free.app`)—you’ll use this as your MCP endpoint.
4. Copy the generated HTTPS URL (for example, `https://abcd1234.ngrok-free.app`). You’ll use this endpoint to connect external tools or agents to your MCP server. Keep this URL available for the next steps in your workflow.

## Section summary

You now have a working FastMCP server on your Raspberry Pi 5. It includes tools for reading CPU temperature and retrieving weather data, and it's accessible over the internet via a public HTTPS endpoint using ngrok. This sets the stage for integration with LLM agents or other external tools.

@@ -61,7 +61,6 @@ tools_software_languages_filter:
- GitHub: 3
- GitLab: 1
- Himax SDK: 1
- IoT: 1
- IP Explorer: 4
- Jupyter Notebook: 1
- K3s: 1
@@ -80,7 +79,7 @@ tools_software_languages_filter:
- Python: 6
- PyTorch: 2
- QEMU: 1
- Raspberry Pi: 5
- Raspberry Pi: 6
- Remote.It: 1
- RTX: 2
- Runbook: 4
3 changes: 1 addition & 2 deletions content/learning-paths/iot/_index.md
@@ -32,11 +32,10 @@ tools_software_languages_filter:
- Docker: 2
- Fixed Virtual Platform: 1
- GitHub: 3
- IoT: 1
- Matter: 1
- MCP: 1
- Python: 2
- Raspberry Pi: 2
- Raspberry Pi: 3
- Remote.It: 1
- VS Code: 1
---
5 changes: 2 additions & 3 deletions data/stats_current_test_info.yml
@@ -1,5 +1,5 @@
summary:
content_total: 366
content_total: 369
content_with_all_tests_passing: 0
content_with_tests_enabled: 61
sw_categories:
@@ -63,8 +63,7 @@ sw_categories:
tests_and_status: []
aws-q-cli:
readable_title: Amazon Q Developer CLI
tests_and_status:
- ubuntu:latest: passed
tests_and_status: []
azure-cli:
readable_title: Azure CLI
tests_and_status: []