diff --git a/content/learning-paths/cross-platform/mcp-ai-agent/_index.md b/content/learning-paths/cross-platform/mcp-ai-agent/_index.md index 4999c1eebd..e3bd4beec0 100644 --- a/content/learning-paths/cross-platform/mcp-ai-agent/_index.md +++ b/content/learning-paths/cross-platform/mcp-ai-agent/_index.md @@ -1,35 +1,32 @@ --- -title: Deploy an MCP server on a Raspberry Pi 5 and interact with it using an AI agent +title: Deploy an MCP Server on Raspberry Pi 5 for AI Agent Interaction using the OpenAI SDK -draft: true -cascade: - draft: true - minutes_to_complete: 30 -who_is_this_for: This Learning Path targets LLM and IoT developers who are familiar with Large Language Model (LLM) concepts and networking. You will learn how to deploy a lightweight Model Context Protocol (MCP) server on a Raspberry Pi 5 and interact with it via the OpenAI-Agent SDK. +who_is_this_for: This Learning Path is for LLM and IoT developers who want to run and interact with AI agents on edge devices like the Raspberry Pi 5. You'll learn how to deploy a lightweight Model Context Protocol (MCP) server and use the OpenAI Agent SDK to create and register tools for local inference. learning_objectives: - - Deploy a lightweight Model Context Protocol (MCP) server on Raspberry Pi 5 - - Design and register custom tools for the AI Agent - - Create custom endpoints - - Learn about uv — a fast, efficient Python package manager + - Deploy a lightweight Model Context Protocol (MCP) server on Raspberry Pi 5 for local AI agent execution. + - Use the OpenAI Agent SDK to interact with a local AI agent. + - Design and register custom tools for agent tasks. + - Learn about uv — a fast, efficient Python package manager for streamlined local deployment. prerequisites: - - A [Raspberry Pi 5](https://www.raspberrypi.com/products/raspberry-pi-5/) - - Basic understanding of Python and prompt engineering.
- - Understanding of LLM and AI Agent fundamentals + - A [Raspberry Pi 5](https://www.raspberrypi.com/products/raspberry-pi-5/) with a Linux-based OS installed. + - Familiarity with Python programming and prompt engineering techniques. + - Basic understanding of Large Language Models (LLMs) and how they are used in local inference. + - Understanding of AI agents and the OpenAI Agent SDK (or similar frameworks). author: Andrew Choi skilllevels: Introductory subjects: ML armips: - - Cortex-A76 + - Cortex-A tools_software_languages: - Python - - IoT - AI + - Raspberry Pi - MCP operatingsystems: diff --git a/content/learning-paths/cross-platform/mcp-ai-agent/intro-to-mcp-uv.md b/content/learning-paths/cross-platform/mcp-ai-agent/intro-to-mcp-uv.md index e2cf7da879..e1bd041ca8 100644 --- a/content/learning-paths/cross-platform/mcp-ai-agent/intro-to-mcp-uv.md +++ b/content/learning-paths/cross-platform/mcp-ai-agent/intro-to-mcp-uv.md @@ -1,5 +1,5 @@ --- -title: Introduction to Model Context Protocol and uv +title: Introduction to Model Context Protocol (MCP) and the uv package manager for local AI agents weight: 2 ### FIXED, DO NOT MODIFY @@ -8,27 +8,50 @@ layout: learningpathall ## Model Context Protocol (MCP) -The **Model Context Protocol (MCP)** is an open specification for wiring Large-Language-Model (LLM) agents to the *context* they need — whether that context is a database, a local sensor, or a SaaS API. -Think of it as USB-C for AI: once a tool or data source speaks MCP, any compliant LLM client can “plug in” and start using it immediately. +The Model Context Protocol (MCP) is an open specification designed to connect Large Language Model (LLM) agents to the context they need — including local sensors, databases, and SaaS APIs. It enables on-device AI agents to interact with real-world data through a plug-and-play protocol that works with any LLM framework, including the OpenAI Agent SDK. ### Why use MCP? 
-- **Plug-and-play integrations:** A growing catalog of pre-built MCP servers (filesystem, shell, vector stores, web-scraping, etc.) gives your agent instant super-powers with zero custom glue code. +- **Plug-and-play integrations:** a growing catalog of pre-built MCP servers (such as filesystem, shell, vector stores, and web-scraping) gives your agent instant superpowers, with no custom integration or glue code required. -- **Model/vendor agnostic:** Because the protocol lives outside the model, you can swap models like GPT-4, Claude, or your own fine-tuned model without touching the integration layer. +- **Model/vendor agnostic:** as the protocol lives outside the model, you can swap models like GPT-4, Claude, or your own fine-tuned model without touching the integration layer. -- **Security by design:** MCP encourages running servers inside your own infrastructure, so sensitive data never leaves the perimeter unless you choose. +- **Security by design:** MCP encourages running servers inside your own infrastructure, so sensitive data stays within your perimeter unless explicitly shared. -- **Cross-ecosystem momentum:** Recent roll-outs—from an official C# SDK to Wix’s production MCP server and Microsoft’s Azure support—show the MCP spec is gathering real-world traction. +- **Cross-ecosystem momentum:** recent roll-outs, from an official C# SDK to Wix’s production MCP server and Microsoft’s Azure support, show the MCP spec is gathering real-world traction. -### High-level architecture -![mcp server](./mcp.png) -- **MCP Host:** the LLM-powered application (Claude Desktop, an IDE plugin, OpenAI Agents SDK, etc.). -- **MCP Client:** the runtime shim that keeps a 1-to-1 connection with each server. -- **MCP Server:** a lightweight process that advertises tools (functions) over MCP. -- **Local data sources:** files, databases, or sensors your server can read directly. -- **Remote services:** external APIs the server can call on the host’s behalf. +## What is uv? 
-{{% notice Note %}} -Learn more about AI Agents in the [AI Agent on CPU learning path](https://learn.arm.com/learning-paths/servers-and-cloud-computing/ai-agent-on-cpu/). +`uv` is a Rust-built Python package manager that simplifies dependency management. Its speed and reliability make it ideal for setting up local AI agent environments on constrained or embedded devices like the Raspberry Pi 5. + +Some key features: +- Built in Rust for performance, with 10×–100× faster installs than pip. +- Resolves dependencies and installs packages in one step. +- Well suited to local LLM workloads, embedded AI systems, and containerized Python environments. + +For further information on `uv`, see: [https://github.com/astral-sh/uv](https://github.com/astral-sh/uv). + + +## A high-level view of the architecture + + ![Diagram of Model Context Protocol (MCP) architecture showing the interaction between MCP Host (LLM-powered app), MCP Client (runtime shim), and MCP Server, which connects to local data sources (files, sensors, databases) and remote APIs for AI agent context retrieval.](./mcp.png) + +*Figure: High-level view of the architecture of the Model Context Protocol (MCP) for local AI agent integration with real-world data sources.* + +Each component in the diagram plays a distinct role in enabling AI agents to interact with real-world context: + +- The **MCP Host** is the LLM-powered application (such as Claude Desktop, an IDE plugin, or an application built with the OpenAI Agents SDK). +- The **MCP Client** is the runtime shim that keeps a 1-to-1 connection with each server. +- The **MCP Server** is a lightweight process that advertises tools (functions) over MCP. +- The **Local data sources** are files, databases, or sensors your server can read directly. +- The **Remote services** are external APIs the server can call on the host’s behalf. 
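The "advertises tools" role of the MCP Server is the key idea in the architecture above. As a rough, conceptual sketch of what that means, the plain-Python example below implements a toy tool registry: the `tool` decorator, `TOOLS` dictionary, and function names here are invented for illustration and are not the real MCP or FastMCP API.

```python
# Toy sketch of how an MCP-style server advertises tools (illustration only;
# these names are invented stand-ins, not the real MCP or FastMCP API).
import inspect

TOOLS = {}

def tool(func):
    """Register a function so a client can discover and call it by name."""
    TOOLS[func.__name__] = {
        "description": (func.__doc__ or "").strip(),
        "parameters": list(inspect.signature(func).parameters),
        "handler": func,
    }
    return func

@tool
def cpu_temperature() -> float:
    """Return the CPU temperature in Celsius (hard-coded stub)."""
    return 48.8

def list_tools():
    """The advertisement a connecting client would see."""
    return {name: meta["description"] for name, meta in TOOLS.items()}

def call_tool(name, **kwargs):
    """Dispatch a client's invocation to the registered handler."""
    return TOOLS[name]["handler"](**kwargs)

print(list_tools())                   # discovery step
print(call_tool("cpu_temperature"))   # invocation step
```

A real MCP server performs the same two operations, discovery and invocation, over a transport such as Server-Sent Events, which is why any compliant client can plug in without custom glue code.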
+ +{{% notice Learning Tip %}} +Learn more about AI Agents in the Learning Path [Deploy an AI Agent on Arm with llama.cpp and llama-cpp-agent using KleidiAI](https://learn.arm.com/learning-paths/servers-and-cloud-computing/ai-agent-on-cpu/). {{% /notice %}} +## Section summary + +This page introduces MCP and `uv` as foundational tools for building fast, secure, and modular AI agents that run efficiently on edge devices like the Raspberry Pi 5. + + + diff --git a/content/learning-paths/cross-platform/mcp-ai-agent/mcp-client.md b/content/learning-paths/cross-platform/mcp-ai-agent/mcp-client.md index d99f3d61ca..4b0ccfc82f 100644 --- a/content/learning-paths/cross-platform/mcp-ai-agent/mcp-client.md +++ b/content/learning-paths/cross-platform/mcp-ai-agent/mcp-client.md @@ -1,16 +1,17 @@ --- -title: Build & Run an AI Agent on your development machine +title: Build and run an AI agent on your development machine weight: 4 ### FIXED, DO NOT MODIFY layout: learningpathall --- -In this section you will learn how to setup an AI Agent on your development machine. You will then connect your MCP server running on the Raspberry Pi 5 to it. +In this section, you'll learn how to set up an AI Agent on your development machine. You will then connect your MCP server running on the Raspberry Pi 5 to it. -These commands were tested on an Linux Arm development machine. +These commands were tested on a Linux Arm development machine. + +## Create an AI Agent and point it at your Pi's MCP Server -### Create an AI Agent and point it at your Pi's MCP Server 1. Install `uv` on your development machine: ```bash @@ -20,27 +21,28 @@ curl -LsSf https://astral.sh/uv/install.sh | sh ```bash mkdir mcp-agent && cd mcp-agent ``` -3. Setup the directory to use `uv`: +3. 
Set up the directory to use `uv`: ```bash uv init ``` This command adds: -- .venv/ (auto-created virtual environment) -- pyproject.toml (project metadata & dependencies) -- .python-version (pinned interpreter) -- README.md, .gitignore, and a sample main.py +- .venv/ (auto-created virtual environment). +- pyproject.toml (project metadata and dependencies). +- .python-version (pinned interpreter). +- README.md, .gitignore, and a sample main.py. -4. Install **OpenAI Agents SDK** + **dotenv** +4. Install **OpenAI Agents SDK** + **dotenv**: ```bash uv add openai-agents python-dotenv ``` -5. Create a `.env` file with your OpenAI key: +5. Create a `.env` file to securely store your OpenAI API key, then append your key after the `=` sign: + ```bash echo -n "OPENAI_API_KEY=" > .env ``` -### Write the Python script for the Agent Client +## Write the Python script for the Agent Client Use a file editor of your choice and replace the content of the sample `main.py` with the content shown below: @@ -87,15 +89,18 @@ if __name__ == "__main__": asyncio.run(main()) ``` -### Execute the Agent +## Execute the Agent + +You’re now ready to run the AI Agent and test its connection to your running MCP server on the Raspberry Pi 5. -You are now ready to the run the agent and test it with your running MCP server: +Run the `main.py` Python script: -Run the `main.py` python script: ```bash uv run main.py ``` + +The output should look something like this: + ```output Running: What is the CPU temperature? Response: The current CPU temperature is 48.8°C. @@ -107,11 +112,17 @@ Congratulations! Your local AI Agent just called the MCP server on your Raspberr This lightweight protocol isn’t just a game-changer for LLM developers—it also empowers IoT engineers to transform real-world data streams and give AI direct, reliable control over any connected device. 
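For intuition about what just happened, the toy sketch below mimics the agent's tool-use loop in plain Python: inspect the question, call a matching tool, and fold the result into the reply. The keyword matching and the stub temperature value are invented stand-ins; in the real run above, the LLM chooses the tool itself through the OpenAI Agents SDK.

```python
# Toy illustration of the agent's tool-use loop (not the OpenAI Agents SDK).
# A real agent lets the LLM pick the tool; keyword matching stands in here.

TOOLS = {
    "cpu_temperature": lambda: 48.8,  # stub mirroring the sample output above
}

def run_agent(question: str) -> str:
    """Pick a tool based on the question, call it, and format a reply."""
    if "temperature" in question.lower():
        temp = TOOLS["cpu_temperature"]()
        return f"The current CPU temperature is {temp}°C."
    return "No matching tool for that question."

print(run_agent("What is the CPU temperature?"))
# prints: The current CPU temperature is 48.8°C.
```

The MCP server's job is to make the entries of `TOOLS` discoverable over the network; the agent's job is to decide when to call them.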
-### Next Steps +## Next Steps + - **Expand Your Toolset** - - Write additional `@mcp.tool()` functions for Pi peripherals (GPIO pins, camera, I²C sensors, etc.) - - Combine multiple MCP servers (e.g. filesystem, web-scraper, vector-store memory) for richer context + - Write additional `@mcp.tool()` functions for Pi peripherals (such as GPIO pins, camera, and I²C sensors). + - Combine multiple MCP servers (for example, filesystem, web-scraper, and vector-store memory) for richer context. - **Integrate with IoT Platforms** - - Hook into Home Assistant or Node-RED via MCP - - Trigger real-world actions (turn on LEDs, read environmental sensors, control relays) + - Hook into Home Assistant or Node-RED through MCP. + - Trigger real-world actions (for example, turn on LEDs, read environmental sensors, and control relays). + +## Section summary + +You’ve now built and run an AI agent on your development machine that connects to an MCP server on your Raspberry Pi 5. Your agent can interact with real-world data sources in real time: a complete edge-to-cloud loop powered by OpenAI’s Agent SDK and MCP. + diff --git a/content/learning-paths/cross-platform/mcp-ai-agent/mcp-server.md b/content/learning-paths/cross-platform/mcp-ai-agent/mcp-server.md index 93e163b9d6..caada03d31 100644 --- a/content/learning-paths/cross-platform/mcp-ai-agent/mcp-server.md +++ b/content/learning-paths/cross-platform/mcp-ai-agent/mcp-server.md @@ -1,62 +1,62 @@ --- -title: Set Up an MCP Server on Your Raspberry Pi +title: Set up an MCP server on Raspberry Pi 5 weight: 3 ### FIXED, DO NOT MODIFY layout: learningpathall --- -## Setup an MCP Server on Raspberry Pi 5 +## Set up a FastMCP server on Raspberry Pi 5 with uv and ngrok In this section you will learn how to: -1. Install uv (the Rust-powered Python package manager) -2. Bootstrap a simple MCP server on your Raspberry Pi 5 that reads the CPU temperature and searches the weather data -3. 
-Expose the MCP server to the internet with **ngrok** +1. Install uv (the Rust-powered Python package manager). +2. Bootstrap a simple MCP server on your Raspberry Pi 5 that reads the CPU temperature and retrieves weather data. +3. Expose the local MCP server to the internet using ngrok (HTTPS tunneling service). -You will run all the commands shown below on your Raspberry Pi 5 running Raspberry Pi OS (64-bit) +You will run all the commands shown below on your Raspberry Pi 5 running Raspberry Pi OS (64-bit). #### 1. Install uv -On Raspberry Pi Terminal, install `uv`: +In your Raspberry Pi Terminal, install `uv`: ```bash curl -LsSf https://astral.sh/uv/install.sh | sh ``` -**uv** is a next-generation, Rust-based package manager that unifies pip, virtualenv, Poetry, and more—offering 10×–100× faste -r installs, built-in virtual environment handling, robust lockfiles, and full compatibility with the Python ecosystem. +`uv` is a Rust-based, next-generation Python package manager that replaces tools like `pip`, `virtualenv`, and Poetry. It delivers 10×–100× faster installs along with built-in virtual environments, lockfile support, and full Python ecosystem compatibility. {{% notice Note %}} After the script finishes, restart your terminal so that the uv command is on your PATH. {{% /notice %}} #### 2. Bootstrap the MCP Project -1. Create a project directory and enter it: +1. Create a project directory and navigate to it: ```bash mkdir mcp cd mcp ``` -2. Initialize with `uv`: +2. Initialize the project with `uv`: ```bash uv init ``` This command adds: -- .venv/ (auto-created virtual environment) -- pyproject.toml (project metadata & dependencies) -- .python-version (pinned interpreter) +- .venv/ (auto-created virtual environment). +- pyproject.toml (project metadata and dependencies). +- .python-version (pinned interpreter). - README.md, .gitignore, and a sample main.py -3. Install the dependencies: +3. 
Install the dependencies (learn more about [FastMCP](https://github.com/jlowin/fastmcp)): + ```bash uv pip install fastmcp==2.2.10 uv add requests ``` #### 3. Build your MCP Server -1. Create a python file for your MCP server named `server.py`: +1. Create a Python file for your MCP server named `server.py`: ```bash touch server.py ``` -2. Use a file editor of your choice and copy the following content into `server.py`: +2. Open `server.py` in your preferred text editor and paste in the following code: ```python import subprocess, re from mcp.server.fastmcp import FastMCP @@ -95,12 +95,12 @@ if __name__ == "__main__": #### 4. Run the MCP Server -Run the python script to deploy the MCP server: +Run the Python script to deploy the MCP server: ```bash uv run server.py ``` -By default, FastMCP will listen on port 8000 and serve your tools via Server-Sent Events (SSE). +By default, FastMCP listens on port 8000 and exposes your registered tools over HTTP using Server-Sent Events (SSE). The output should look like: @@ -111,7 +111,7 @@ INFO: Application startup complete. INFO: Uvicorn running on http://0.0.0.0:8000 (Press CTRL+C to quit) -#### 5. Install & Configure ngrok +#### 5. Install and configure ngrok You will now use ngrok to expose your locally running MCP server to the public internet over HTTPS. @@ -136,4 +136,9 @@ Replace `YOUR_NGROK_AUTHTOKEN` with your token from the ngrok dashboard. ```bash ngrok http 8000 ``` -4. Copy the generated HTTPS URL (e.g. `https://abcd1234.ngrok-free.app`)—you’ll use this as your MCP endpoint. +4. Copy the generated HTTPS URL (for example, `https://abcd1234.ngrok-free.app`). You’ll use this endpoint to connect external tools or agents to your MCP server. Keep this URL available for the next steps in your workflow. + +## Section summary + +You now have a working FastMCP server on your Raspberry Pi 5. 
It includes tools for reading CPU temperature and retrieving weather data, and it's accessible over the internet via a public HTTPS endpoint using ngrok. This sets the stage for integration with LLM agents or other external tools. + diff --git a/content/learning-paths/embedded-and-microcontrollers/_index.md b/content/learning-paths/embedded-and-microcontrollers/_index.md index bf67579ea7..6cf07146bb 100644 --- a/content/learning-paths/embedded-and-microcontrollers/_index.md +++ b/content/learning-paths/embedded-and-microcontrollers/_index.md @@ -61,7 +61,6 @@ tools_software_languages_filter: - GitHub: 3 - GitLab: 1 - Himax SDK: 1 -- IoT: 1 - IP Explorer: 4 - Jupyter Notebook: 1 - K3s: 1 @@ -80,7 +79,7 @@ tools_software_languages_filter: - Python: 6 - PyTorch: 2 - QEMU: 1 -- Raspberry Pi: 5 +- Raspberry Pi: 6 - Remote.It: 1 - RTX: 2 - Runbook: 4 diff --git a/content/learning-paths/iot/_index.md b/content/learning-paths/iot/_index.md index bf5ad029cd..221cec7ff4 100644 --- a/content/learning-paths/iot/_index.md +++ b/content/learning-paths/iot/_index.md @@ -32,11 +32,10 @@ tools_software_languages_filter: - Docker: 2 - Fixed Virtual Platform: 1 - GitHub: 3 -- IoT: 1 - Matter: 1 - MCP: 1 - Python: 2 -- Raspberry Pi: 2 +- Raspberry Pi: 3 - Remote.It: 1 - VS Code: 1 --- diff --git a/data/stats_current_test_info.yml b/data/stats_current_test_info.yml index acc994b7d8..cad3fceeee 100644 --- a/data/stats_current_test_info.yml +++ b/data/stats_current_test_info.yml @@ -1,5 +1,5 @@ summary: - content_total: 366 + content_total: 369 content_with_all_tests_passing: 0 content_with_tests_enabled: 61 sw_categories: @@ -63,8 +63,7 @@ sw_categories: tests_and_status: [] aws-q-cli: readable_title: Amazon Q Developer CLI - tests_and_status: - - ubuntu:latest: passed + tests_and_status: [] azure-cli: readable_title: Azure CLI tests_and_status: [] diff --git a/data/stats_weekly_data.yml b/data/stats_weekly_data.yml index d55fd633ed..178781f649 100644 --- a/data/stats_weekly_data.yml 
+++ b/data/stats_weekly_data.yml @@ -5903,3 +5903,109 @@ avg_close_time_hrs: 0 num_issues: 9 percent_closed_vs_total: 0.0 +- a_date: '2025-05-26' + content: + automotive: 2 + cross-platform: 32 + embedded-and-microcontrollers: 41 + install-guides: 101 + iot: 6 + laptops-and-desktops: 37 + mobile-graphics-and-gaming: 33 + servers-and-cloud-computing: 117 + total: 369 + contributions: + external: 94 + internal: 482 + github_engagement: + num_forks: 30 + num_prs: 6 + individual_authors: + adnan-alsinan: 1 + alaaeddine-chakroun: 2 + albin-bernhardsson: 1 + alex-su: 1 + alexandros-lamprineas: 1 + andrew-choi: 1 + annie-tallund: 4 + arm: 3 + arnaud-de-grandmaison: 4 + arnaud-de-grandmaison.: 1 + avin-zarlez: 1 + barbara-corriero: 1 + basma-el-gaabouri: 1 + ben-clark: 1 + bolt-liu: 2 + brenda-strech: 1 + chaodong-gong: 1 + chen-zhang: 1 + christophe-favergeon: 1 + christopher-seidl: 7 + cyril-rohr: 1 + daniel-gubay: 1 + daniel-nguyen: 2 + david-spickett: 2 + dawid-borycki: 33 + diego-russo: 2 + dominica-abena-o.-amanfo: 1 + elham-harirpoush: 2 + florent-lebeau: 5 + "fr\xE9d\xE9ric--lefred--descamps": 2 + gabriel-peterson: 5 + gayathri-narayana-yegna-narayanan: 1 + georgios-mermigkis: 1 + geremy-cohen: 1 + graham-woodward: 1 + han-yin: 1 + iago-calvo-lista: 1 + james-whitaker: 1 + jason-andrews: 102 + joe-stech: 4 + johanna-skinnider: 2 + jonathan-davies: 2 + jose-emilio-munoz-lopez: 1 + julie-gaskin: 5 + julio-suarez: 6 + jun-he: 1 + kasper-mecklenburg: 1 + kieran-hejmadi: 9 + koki-mitsunami: 2 + konstantinos-margaritis: 8 + kristof-beyls: 1 + leandro-nunes: 1 + liliya-wu: 1 + mark-thurman: 1 + masoud-koleini: 1 + mathias-brossard: 1 + michael-hall: 5 + na-li: 1 + nader-zouaoui: 2 + nikhil-gupta: 1 + nina-drozd: 1 + nobel-chowdary-mandepudi: 6 + odin-shen: 7 + owen-wu: 2 + pareena-verma: 44 + paul-howard: 3 + pranay-bakre: 5 + preema-merlin-dsouza: 1 + przemyslaw-wirkus: 2 + rin-dobrescu: 1 + roberto-lopez-mendez: 2 + ronan-synnott: 45 + shuheng-deng: 1 + thirdai: 1 + 
tianyu-li: 2 + tom-pilar: 1 + uma-ramalingam: 1 + varun-chari: 2 + visualsilicon: 1 + willen-yang: 1 + ying-yu: 2 + yiyang-fan: 1 + zach-lasiuk: 2 + zhengjun-xing: 2 + issues: + avg_close_time_hrs: 0 + num_issues: 9 + percent_closed_vs_total: 0.0 diff --git a/themes/arm-design-system-hugo-theme/layouts/_default/index.coveo.xml b/themes/arm-design-system-hugo-theme/layouts/_default/index.coveo.xml index 213300f51d..be2a6be300 100644 --- a/themes/arm-design-system-hugo-theme/layouts/_default/index.coveo.xml +++ b/themes/arm-design-system-hugo-theme/layouts/_default/index.coveo.xml @@ -173,7 +173,7 @@ {{- end -}} {{- if and (.File) (in .File.Path "learning-paths") -}} - + {{- if .IsSection -}} {{- with .Title -}} {{ . }} @@ -187,9 +187,9 @@ Unknown Parent Title {{- end -}} {{- end -}} - - 1 - {{.Params.weight -}} + + 1 + {{.Params.weight -}} {{- end -}}