From 2139a5c7afecf2d9d92784e375999be0fdb10222 Mon Sep 17 00:00:00 2001 From: Maddy Underwood <167196745+madeline-underwood@users.noreply.github.com> Date: Tue, 19 Aug 2025 21:08:25 +0000 Subject: [PATCH 1/3] First-pass editorial --- .../raspberry-pi-smart-home/1-overview.md | 57 +++++++------- .../2-software-dependencies.md | 52 +++++++------ .../raspberry-pi-smart-home/3-test-gpio.md | 31 ++++---- .../4-smart-home-assistant.md | 11 ++- .../raspberry-pi-smart-home/_index.md | 76 +++++++++---------- 5 files changed, 114 insertions(+), 113 deletions(-) diff --git a/content/learning-paths/embedded-and-microcontrollers/raspberry-pi-smart-home/1-overview.md b/content/learning-paths/embedded-and-microcontrollers/raspberry-pi-smart-home/1-overview.md index 3fa8ad30dd..2b5bede89c 100644 --- a/content/learning-paths/embedded-and-microcontrollers/raspberry-pi-smart-home/1-overview.md +++ b/content/learning-paths/embedded-and-microcontrollers/raspberry-pi-smart-home/1-overview.md @@ -8,36 +8,36 @@ layout: learningpathall ## Overview -This Learning Path walks you through deploying an efficient large language model (LLM) locally on the Raspberry Pi 5, powered by an Arm Cortex-A76 CPU. This will allow you to control your smart home using natural language, without relying on cloud services. With rapid advances in Generative AI and the power of Arm Cortex-A processors, you can now run advanced language models directly in your home on the Raspberry Pi 5. +This Learning Path walks you through deploying an efficient large language model (LLM) locally on the Raspberry Pi 5, powered by an Arm Cortex-A76 CPU. This setup enables you to control your smart home using natural language without relying on cloud services. With rapid advances in generative AI and the power of Arm Cortex-A processors, you can now run advanced language models directly in your home on the Raspberry Pi 5. 
-You will create a fully local, privacy-first smart home system that leverages the strengths of Arm Cortex-A architecture. The system can achieve 15+ tokens per second inference speeds using optimized models like TinyLlama and Qwen, while maintaining the energy efficiency that makes Arm processors a good fit for always-on applications. +You will create a fully local, privacy-first smart home system that leverages the strengths of Arm Cortex-A architecture. The system can achieve 15+ tokens per second inference speeds using optimized models like TinyLlama and Qwen, while maintaining the energy efficiency that makes Arm processors well suited for always-on applications. -## Why Arm Cortex-A for Edge AI? +## Why Arm Cortex-A for edge AI? The Raspberry Pi 5's Arm Cortex-A76 processor can manage high-performance computing tasks like AI inference. Key architectural features include: -- The **superscalar architecture** allows the processor to execute multiple instructions in parallel, improving throughput for compute-heavy tasks. -- **128-bit NEON SIMD support** accelerates matrix and vector operations, which are common in the inner loops of language model inference. -- The **multi-level cache hierarchy** helps reduce memory latency and improves data access efficiency during runtime. -- The **thermal efficiency** enables sustained performance without active cooling, making it ideal for compact or always-on smart home setups. 
+- **Superscalar architecture**: Executes multiple instructions in parallel, improving throughput for compute-heavy tasks +- **128-bit NEON SIMD support**: Accelerates matrix and vector operations, common in the inner loops of language model inference +- **Multi-level cache hierarchy**: Reduces memory latency and improves data access efficiency during runtime +- **Thermal efficiency**: Enables sustained performance without active cooling, making it ideal for compact or always-on smart home setups -These characteristics make the Raspberry Pi 5 well-suited for workloads like smart home assistants, where responsiveness, efficiency, and local processing are important. Running LLMs locally on Arm-based devices brings several practical benefits. Privacy is preserved, since conversations and routines never leave the device. With optimized inference, the system can offer responsiveness under 100 ms, even on resource-constrained hardware. It remains fully functional in offline scenarios, continuing to operate when internet access is unavailable. Developers also gain flexibility to customize models and automations. Additionally, software updates and an active ecosystem continue to improve performance over time. +These characteristics make the Raspberry Pi 5 well suited for workloads like smart home assistants, where responsiveness, efficiency, and local processing are important. Running LLMs locally on Arm-based devices brings several practical benefits. Privacy is preserved, since conversations and routines never leave the device. With optimized inference, the system can offer responsiveness under 100 ms, even on resource-constrained hardware. It remains fully functional in offline scenarios, continuing to operate when internet access is unavailable. Developers also gain flexibility to customize models and automations. Additionally, software updates and an active ecosystem continue to improve performance over time. 
-## Arm Ecosystem Advantages +## Arm ecosystem advantages For the stack in this setup, Raspberry Pi 5 benefits from the extensive developer ecosystem: - Optimized compilers including GCC and Clang with Arm-specific enhancements - Native libraries such as gpiozero and lgpio are optimized for Raspberry Pi -- Community support from open-source projects where developers are contributing Arm-optimized code -- Arm maintains a strong focus on backward compatibility, which reduces friction when updating kernels or deploying across multiple Arm platforms +- Community support from open-source projects where developers contribute Arm-optimized code +- Backward compatibility in Arm architecture reduces friction when updating kernels or deploying across platforms - The same architecture powers smartphones, embedded controllers, edge devices, and cloud infrastructure—enabling consistent development practices across domains -## Performance Benchmarks on Raspberry Pi 5 +## Performance benchmarks on Raspberry Pi 5 The table below shows inference performance for several quantized models running on a Raspberry Pi 5. Measurements reflect single-threaded CPU inference with typical prompt lengths and temperature settings suitable for command-based interaction. -| Model | Tokens/Sec | Avg Latency (ms) | +| Model | Tokens/sec | Avg latency (ms) | | ------------------- | ---------- | ---------------- | | qwen:0.5b | 17.0 | 8,217 | | tinyllama:1.1b | 12.3 | 9,429 | @@ -45,29 +45,28 @@ The table below shows inference performance for several quantized models running | gemma2:2b | 4.1 | 23,758 | | deepseek-r1:7b | 1.6 | 64,797 | +### Insights -What does this table tell us? 
Here are some performance insights: +- Qwen 0.5B and TinyLlama 1.1B deliver fast token generation and low average latency, making them suitable for real-time interactions such as voice-controlled smart home commands +- DeepSeek-Coder 1.3B and Gemma 2B trade some speed for improved language understanding, which can be useful for complex tasks or context-aware prompts +- DeepSeek-R1 7B offers advanced reasoning capabilities with acceptable latency, which may be viable for offline summarization, planning, or low-frequency tasks -- Qwen 0.5B and TinyLlama 1.1B deliver fast token generation and low average latency, making them suitable for real-time interactions like voice-controlled smart home commands. -- DeepSeek-Coder 1.3B and Gemma 2B trade off some speed for improved language understanding, which can be useful for more complex task execution or context-aware prompts. -- DeepSeek-R1 7B offers advanced reasoning capabilities with acceptable latency, which may be viable for offline summarization, planning, or low-frequency tasks. +## Supported Arm-powered devices -## Supported Arm-Powered Devices +This Learning Path focuses on the Raspberry Pi 5, but you can adapt the concepts and code to other Arm-powered devices. 
-This Learning Path focuses on the Raspberry Pi 5, but you can adapt the concepts and code to other Arm-powered devices: +### Recommended platforms -### Recommended Platforms +| Platform | CPU | RAM | GPIO support | Model size suitability | +| ------------------- | -------------------------------- | -------------- | ------------------------------ | --------------------------- | +| **Raspberry Pi 5** | Arm Cortex-A76 quad-core @ 2.4GHz | Up to 16GB | Native `lgpio` (high-performance) | Large models (8–16GB) | +| **Raspberry Pi 4** | Arm Cortex-A72 quad-core @ 1.8GHz | Up to 8GB | Compatible with `gpiozero` | Small to mid-size models | +| **Other Arm devices** | Arm Cortex-A | 4GB min (8GB+ recommended) | Requires physical GPIO pins | Varies by RAM | -| Platform | CPU | RAM | GPIO Support | Model Size Suitability | -|------------------|----------------------------------|----------------|-------------------------------|-----------------------------| -| **Raspberry Pi 5** | Arm Cortex-A76 quad-core @ 2.4GHz | Up to 16GB | Native `lgpio` (high-performance) | Large models (8–16GB) | -| **Raspberry Pi 4** | Arm Cortex-A72 quad-core @ 1.8GHz | Up to 8GB | Compatible with `gpiozero` | Small to mid-size models | -| **Other Arm Devices** | Arm Cortex-A | 4GB min (8GB+ recommended) | Requires physical GPIO pins | Varies by RAM | - -Additionally, the platform must: +Additionally, the platform must meet the following requirements: - GPIO pins available for hardware control -- Use Python 3.8 or newer +- Python 3.8 or newer - Ability to run [Ollama](https://ollama.com/) -Continue to the next section to start building a smart home system that highlights how Arm-based processors can enable efficient, responsive, and private AI applications at the edge. \ No newline at end of file +Continue to the next section to start building a smart home system that highlights how Arm-based processors enable efficient, responsive, and private AI applications at the edge. 
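The tokens-per-second figures in the table above can be reproduced from Ollama's own counters. The sketch below assumes a local Ollama server on the default port (`11434`) and uses the `eval_count` (tokens generated) and `eval_duration` (nanoseconds) fields returned by Ollama's `/api/generate` REST endpoint; the prompt and model name are placeholders:

```python
import json
import urllib.request

def tokens_per_sec(eval_count: int, eval_duration_ns: int) -> float:
    """Convert Ollama's eval counters into a tokens-per-second rate."""
    return eval_count / (eval_duration_ns / 1e9)

def benchmark(model: str, prompt: str = "Turn on the living room light.") -> float:
    """POST one non-streaming generation request and report throughput."""
    payload = json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()
    req = urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return tokens_per_sec(body["eval_count"], body["eval_duration"])

if __name__ == "__main__":
    try:
        print(f"qwen:0.5b: {benchmark('qwen:0.5b'):.1f} tokens/sec")
    except OSError:
        print("Ollama server not reachable; start it with `ollama serve`.")
```

Because `tokens_per_sec()` is a pure unit conversion, you can reuse it for any model in the table; expect run-to-run variation with prompt length and thermal conditions.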
diff --git a/content/learning-paths/embedded-and-microcontrollers/raspberry-pi-smart-home/2-software-dependencies.md b/content/learning-paths/embedded-and-microcontrollers/raspberry-pi-smart-home/2-software-dependencies.md
index ab97b85829..a9799cfe48 100644
--- a/content/learning-paths/embedded-and-microcontrollers/raspberry-pi-smart-home/2-software-dependencies.md
+++ b/content/learning-paths/embedded-and-microcontrollers/raspberry-pi-smart-home/2-software-dependencies.md
@@ -10,15 +10,15 @@ layout: learningpathall
This guide assumes you have set up your Raspberry Pi with Raspberry Pi OS and network connectivity. For Raspberry Pi 5 setup help, see: [Raspberry Pi Getting Started](https://www.raspberrypi.com/documentation/)
{{% /notice %}}

-## Connect to Your Raspberry Pi 5
+## Connect to your Raspberry Pi 5

-### Option 1: Using a display
+### Option 1: Use a display

-The easiest way to work on your Raspberry Pi is connecting it to an external display through one of the micro HDMI ports. This setup also requires a keyboard and mouse to navigate.
+The easiest way to work on your Raspberry Pi is by connecting it to an external display through one of the micro-HDMI ports. This setup also requires a keyboard and mouse.

-### Option 2: Using SSH
+### Option 2: Use SSH

-You can also use SSH to access the terminal. To use this approach you need to know the IP address of your device. Ensure your Raspberry Pi 5 connects to the same network as your host computer. Access your device remotely via SSH using the terminal or any SSH client.
+You can also use SSH to access the terminal. To use this approach, you need to know the IP address of your device. Ensure your Raspberry Pi 5 is on the same network as your host computer. Access your device remotely via SSH using the terminal or any SSH client.

Replace `<username>` with your Pi's username (typically `pi`), and `<ip-address>` with your Raspberry Pi 5's IP address.
@@ -26,38 +26,38 @@ ssh <username>@<ip-address>
```

-## Set up the dependencies
+## Set up dependencies

Create a directory called `smart-home` in your home directory and navigate into it:

```bash
-mkdir $HOME/smart-home
-cd $HOME/smart-home
+mkdir -p "$HOME/smart-home"
+cd "$HOME/smart-home"
```

-The Raspberry Pi 5 includes Python 3 pre-installed, but you need additional packages:
+The Raspberry Pi 5 includes Python 3 preinstalled, but you need additional packages:

```bash
-sudo apt update && sudo apt upgrade
-sudo apt install python3 python3-pip python3-venv git curl build-essential gcc python3-lgpio
+sudo apt update && sudo apt upgrade -y
+sudo apt install -y python3 python3-pip python3-venv git curl build-essential gcc python3-lgpio
```

-### Configure the virtual environment
+### Configure a virtual environment

-The next step is to create and activate a Python virtual environment. This approach keeps project dependencies isolated and prevents conflicts with system-wide packages:
+Create and activate a Python virtual environment to isolate project dependencies:

```bash
python3 -m venv venv
source venv/bin/activate
```

-Install all required libraries and dependencies:
+Install the required libraries:

```bash
-pip install ollama gpiozero lgpio psutil httpx orjson numpy fastapi uvicorn uvloop numpy
+pip install ollama gpiozero lgpio psutil httpx orjson numpy fastapi uvicorn uvloop
```

-### Install Ollama
+## Install Ollama

Install Ollama using the official installation script for Linux:

@@ -70,27 +70,29 @@ Verify the installation:

```bash
ollama --version
```
-If installation was successful, the output from the command should match that below.
+
+If installation was successful, the output should be similar to:
+
```output
ollama version is 0.11.4
```

-## Download and Test a Language Model
+## Download and test a language model

-Ollama supports various models.
This guide uses deepseek-r1:7b as an example, but you can also use `tinyllama:1.1b`, `qwen:0.5b`, `gemma2:2b`, or `deepseek-coder:1.3b`. +Ollama supports various models. This guide uses `deepseek-r1:7b` as an example, but you can also use `tinyllama:1.1b`, `qwen:0.5b`, `gemma2:2b`, or `deepseek-coder:1.3b`. -The `run` command will set up the model automatically. You will see download progress in the terminal, followed by the interactive prompt when ready. +The `run` command sets up the model automatically. You will see download progress in the terminal, followed by an interactive prompt when ready. ```bash ollama run deepseek-r1:7b ``` {{% notice Troubleshooting %}} -If you run into issues with the model download, here are some things to check: +If you run into issues with the model download, try the following: -- Confirm internet access and sufficient storage space on your microSD card -- Try downloading smaller models like `qwen:0.5b` or `tinyllama:1.1b` if you encounter memory issues. 16 GB of RAM is sufficient for running smaller to medium-sized language models. Very large models may require more memory or run slower. -- Clear storage or connect to a more stable network if errors occur +- Confirm internet access and sufficient storage space on your microSD card. +- Try smaller models like `qwen:0.5b` or `tinyllama:1.1b` if you encounter memory issues. 16 GB of RAM is sufficient for small to medium models; very large models may require more memory or run slower. +- Clear storage or connect to a more stable network if errors occur. {{% /notice %}} -With the model set up through `ollama`, move on to the next section to start configuring the hardware. \ No newline at end of file +With the model set up through Ollama, move on to the next section to start configuring the hardware. 
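Beyond the interactive `ollama run` prompt, the `ollama` Python package installed earlier lets you query the model from scripts. The following is an illustrative sketch, not the Learning Path's assistant code: the system prompt and helper names are made up here, and the `ollama.chat()` call requires the server to be running with the model already pulled:

```python
SYSTEM_PROMPT = "You are a smart home assistant. Answer in one short sentence."

def build_messages(user_text: str) -> list:
    """Assemble the chat payload in the shape ollama.chat() expects."""
    return [
        {"role": "system", "content": SYSTEM_PROMPT},
        {"role": "user", "content": user_text},
    ]

def ask(model: str, user_text: str) -> str:
    """Send one chat turn to a locally running Ollama server."""
    import ollama  # imported lazily so the helpers above work without it
    reply = ollama.chat(model=model, messages=build_messages(user_text))
    return reply["message"]["content"]

if __name__ == "__main__":
    try:
        print(ask("deepseek-r1:7b", "What can you control in my home?"))
    except Exception as exc:  # package missing, server down, or model not pulled
        print(f"Could not reach Ollama: {exc}")
```

Swapping the model string for `qwen:0.5b` or `tinyllama:1.1b` trades answer quality for latency, as shown in the benchmark table.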
diff --git a/content/learning-paths/embedded-and-microcontrollers/raspberry-pi-smart-home/3-test-gpio.md b/content/learning-paths/embedded-and-microcontrollers/raspberry-pi-smart-home/3-test-gpio.md index 4d7efe7a2e..e6248ccac3 100644 --- a/content/learning-paths/embedded-and-microcontrollers/raspberry-pi-smart-home/3-test-gpio.md +++ b/content/learning-paths/embedded-and-microcontrollers/raspberry-pi-smart-home/3-test-gpio.md @@ -6,11 +6,11 @@ weight: 4 layout: learningpathall --- -The next step is to test the GPIO functionality. In this section, you will configure a LED light to simulate a smart-home device. +The next step is to test the GPIO functionality. In this section, you configure an LED light to simulate a smart home device. -## Verify GPIO Functionality +## Verify GPIO functionality -Bring out your electronics components. Connect the anode (long leg) of an LED in series with a 220Ω resistor to GPIO 17 (physical pin 11). Connect the cathode (short leg) to a ground (GND) pin. See image below for the full setup: +Gather your electronic components. Connect the anode (long leg) of an LED in series with a 220Ω resistor to GPIO 17 (physical pin 11). Connect the cathode (short leg) to a ground (GND) pin. See the image below for the full setup: ![Raspberry Pi connected to a breadboard with a green LED and jumper wires](pin_layout.jpg "Raspberry Pi connected to a breadboard with a green LED and jumper wires") @@ -21,7 +21,7 @@ cd $HOME/smart-home vim testgpio.py ``` -Copy this code into the file: +Add the following code to the file: ```python #!/usr/bin/env python3 @@ -32,7 +32,7 @@ from gpiozero.pins.lgpio import LGPIOFactory # Set lgpio backend for Raspberry Pi 5 Device.pin_factory = LGPIOFactory() -# Setup GPIO pin 17 +# Set up GPIO pin 17 pin1 = LED(17) try: @@ -52,19 +52,20 @@ python testgpio.py The LED should blink every two seconds. If you observe this behavior, your GPIO setup works correctly. 
{{% notice Troubleshooting %}} -If you run into issues with the hardware setup, here are some things to check: -- Try fixing missing dependencies by running the following command: -```bash -sudo apt-get install -f -``` -- If you're running into GPIO permission issues, run Python scripts with `sudo` or add your user to the `gpio` group. Don't forget to log out for the changes to take effect. -```bash -sudo usermod -a -G gpio $USER -``` +If you run into issues with the hardware setup, check the following: + +- Fix missing dependencies with: + ```bash + sudo apt-get install -f + ``` +- If you encounter GPIO permission issues, run Python scripts with `sudo` or add your user to the `gpio` group. Don’t forget to log out for the changes to take effect: + ```bash + sudo usermod -a -G gpio $USER + ``` - Double-check wiring and pin numbers using the Raspberry Pi 5 pinout diagram - Ensure proper LED and resistor connections - Verify GPIO enablement in `raspi-config` if needed - Use a high-quality power supply {{% /notice %}} -With a way to control devices using GPIO pins, you can move on to the next section to interact with them using language models and the user interface. \ No newline at end of file +With GPIO pins working, you can now move on to the next section to interact with devices using language models and the user interface. 
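To preview how the next section ties the two pieces together, here is an illustrative sketch that maps a recognized command onto the GPIO pin you just wired. Only the `living_room_light` device on GPIO 17 comes from this setup; the rule-based parser is a hypothetical stand-in for the language model's output, and the `gpiozero` call only works on the Pi itself:

```python
DEVICE_PINS = {"living_room_light": 17}  # GPIO 17 is the LED wired above

def parse_command(text: str):
    """Toy parser: return (device, action) for a recognized command, else None."""
    words = text.lower().split()
    action = "on" if "on" in words else "off" if "off" in words else None
    for device in DEVICE_PINS:
        if device.replace("_", " ") in text.lower():
            return (device, action) if action else None
    return None

def apply_command(text: str) -> bool:
    """Drive the matching GPIO pin; requires gpiozero + lgpio on a Raspberry Pi."""
    parsed = parse_command(text)
    if parsed is None:
        return False
    device, action = parsed
    from gpiozero import LED  # imported lazily so the parser runs off-device
    led = LED(DEVICE_PINS[device])
    led.on() if action == "on" else led.off()
    return True
```

Adding a device is a one-line change to `DEVICE_PINS`, mirroring how the assistant in the next section maps device names to pins.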
diff --git a/content/learning-paths/embedded-and-microcontrollers/raspberry-pi-smart-home/4-smart-home-assistant.md b/content/learning-paths/embedded-and-microcontrollers/raspberry-pi-smart-home/4-smart-home-assistant.md index 89fac4bc34..bcc5a40864 100644 --- a/content/learning-paths/embedded-and-microcontrollers/raspberry-pi-smart-home/4-smart-home-assistant.md +++ b/content/learning-paths/embedded-and-microcontrollers/raspberry-pi-smart-home/4-smart-home-assistant.md @@ -57,7 +57,7 @@ python3 smart_home_assistant.py --no-api {{< /tab >}} {{< /tabpane >}} -### Command Options +### Command options | Option | Description | Example | |------------------|---------------------------------------------------------------------------------------------------|--------------------------------------------| @@ -69,15 +69,15 @@ If everything is set up correctly, you should see the following output on runnin ![Running in Default Mode](cmd.png "Running the code in default mode") -## Interact With Your Assistant +## Interact with your assistant Try asking the assistant to `turn on living room light`. If you've connected additional devices, come up with prompts to test the setup. ### Web interface - Open your browser and navigate to `http://0.0.0.0:8000`, or as printed in the terminal output. +Open your browser and navigate to `http://0.0.0.0:8000`, or as printed in the terminal output. - ![Web Interface Interaction](UI3.png "Interacting with the LLM through the web interface") +![Web Interface Interaction](UI3.png "Interacting with the LLM through the web interface") ### Command line interface @@ -105,9 +105,8 @@ If you're running into issues with the assistant, here are some things to check: - If port 8000 is unavailable, run the assistant with a different port using the `--port` flag. 
{{% /notice %}}

-## Wrapping up
+## Wrap up

From here, you can modify the `smart_home_assistant.py` script and extend the system by adding more devices, experimenting with conversational commands, or integrating sensors and automation logic into your smart home setup.

You should now know more about setting up a Raspberry Pi 5 to control real-world devices using GPIO pins, and running a smart home assistant powered by local language models through Ollama. You’ve learned how to wire basic circuits with LEDs and resistors to simulate smart devices, and how to launch and interact with the assistant through both the command-line interface and a web dashboard. Along the way, you also explored common troubleshooting steps for GPIO access, missing dependencies, and model loading issues.
-
diff --git a/content/learning-paths/embedded-and-microcontrollers/raspberry-pi-smart-home/_index.md b/content/learning-paths/embedded-and-microcontrollers/raspberry-pi-smart-home/_index.md
index 2634080a6e..4da40caa9a 100644
--- a/content/learning-paths/embedded-and-microcontrollers/raspberry-pi-smart-home/_index.md
+++ b/content/learning-paths/embedded-and-microcontrollers/raspberry-pi-smart-home/_index.md
@@ -6,58 +6,58 @@ minutes_to_complete: 45

who_is_this_for: This is an introductory topic for developers interested in building smart home systems using on-device LLMs and Arm-based edge platforms like the Raspberry Pi 5.
learning_objectives: - - "Understand how the Arm architecture enables efficient, private, and responsive LLM inference" - - "Run a smart home assistant on Raspberry Pi 5 with local LLM integration" - - "Wire and control physical devices (e.g., LEDs) using Raspberry Pi GPIO pins" - - "Deploy and interact with a local language model using Ollama" - - "Launch and access a web-based dashboard for device control" + - Understand how the Arm architecture enables efficient, private, and responsive LLM inference + - Run a smart home assistant on Raspberry Pi 5 with local LLM integration + - Wire and control physical devices (for example, LEDs) using Raspberry Pi GPIO pins + - Deploy and interact with a local language model using Ollama + - Launch and access a web-based dashboard for device control + prerequisites: - - "An Arm-based single board computer (e.g., Raspberry Pi 5 running Raspberry Pi OS)" - - "Basic electronic components: breadboard, LEDs, resistors and jumper wires" - - "Basic understanding of Python, GPIO pins and electronics" + - An Arm-based single board computer (for example, Raspberry Pi 5 running Raspberry Pi OS) + - Basic electronic components - breadboard, LEDs, resistors, and jumper wires + - Basic understanding of Python, GPIO pins, and electronics -author: "Fidel Makatia Omusilibwa" +author: Fidel Makatia Omusilibwa -### Tags -skilllevels: "Introductory" -subjects: "ML" +skilllevels: Introductory +subjects: ML armips: - - "Cortex-A" + - Cortex-A tools_software_languages: - - "Python" - - "Ollama" - - "gpiozero" - - "lgpio" - - "FastAPI" - - "Raspberry Pi" + - Python + - Ollama + - gpiozero + - lgpio + - FastAPI + - Raspberry Pi operatingsystems: - - "Linux" + - Linux further_reading: - resource: - title: "Raspberry Pi 5 Smart Home Assistant with EdgeAI" - link: "https://github.com/fidel-makatia/EdgeAI_Raspi5" - type: "source" + title: Raspberry Pi 5 Smart Home Assistant with EdgeAI + link: https://github.com/fidel-makatia/EdgeAI_Raspi5 + type: source 
- resource: - title: "Ollama Python/JavaScript Libraries" - link: "https://ollama.com/blog/python-javascript-libraries" - type: "documentation" + title: Ollama Python/JavaScript Libraries + link: https://ollama.com/blog/python-javascript-libraries + type: documentation - resource: - title: "gpiozero Documentation for Raspberry Pi" - link: "https://gpiozero.readthedocs.io/en/stable/" - type: "documentation" + title: gpiozero Documentation for Raspberry Pi + link: https://gpiozero.readthedocs.io/en/stable/ + type: documentation - resource: - title: "lgpio Library for Raspberry Pi 5" - link: "https://abyz.me.uk/lg/lgpio.html" - type: "documentation" + title: lgpio Library for Raspberry Pi 5 + link: https://abyz.me.uk/lg/lgpio.html + type: documentation - resource: - title: "Raspberry Pi 5 Official Documentation" - link: "https://www.raspberrypi.org/documentation/computers/raspberry-pi.html" - type: "documentation" + title: Raspberry Pi 5 Official Documentation + link: https://www.raspberrypi.org/documentation/computers/raspberry-pi.html + type: documentation - resource: - title: "Ollama Model Library" - link: "https://ollama.com/library" - type: "documentation" + title: Ollama Model Library + link: https://ollama.com/library + type: documentation ### FIXED, DO NOT MODIFY # ================================================================================ From b22b74c4b74a7fd89a290be4e0ae642c5bed7f45 Mon Sep 17 00:00:00 2001 From: Maddy Underwood <167196745+madeline-underwood@users.noreply.github.com> Date: Tue, 19 Aug 2025 21:29:58 +0000 Subject: [PATCH 2/3] SEO optimization --- .../raspberry-pi-smart-home/1-overview.md | 11 ++++++----- .../2-software-dependencies.md | 10 +++++++--- .../raspberry-pi-smart-home/3-test-gpio.md | 4 ++-- .../raspberry-pi-smart-home/4-smart-home-assistant.md | 6 +++--- .../raspberry-pi-smart-home/_index.md | 6 +++--- 5 files changed, 21 insertions(+), 16 deletions(-) diff --git 
a/content/learning-paths/embedded-and-microcontrollers/raspberry-pi-smart-home/1-overview.md b/content/learning-paths/embedded-and-microcontrollers/raspberry-pi-smart-home/1-overview.md index 2b5bede89c..0d105ae321 100644 --- a/content/learning-paths/embedded-and-microcontrollers/raspberry-pi-smart-home/1-overview.md +++ b/content/learning-paths/embedded-and-microcontrollers/raspberry-pi-smart-home/1-overview.md @@ -1,5 +1,6 @@ --- -title: Overview +title: Run LLMs locally on Raspberry Pi 5 for Edge AI + weight: 2 ### FIXED, DO NOT MODIFY @@ -12,7 +13,7 @@ This Learning Path walks you through deploying an efficient large language model You will create a fully local, privacy-first smart home system that leverages the strengths of Arm Cortex-A architecture. The system can achieve 15+ tokens per second inference speeds using optimized models like TinyLlama and Qwen, while maintaining the energy efficiency that makes Arm processors well suited for always-on applications. -## Why Arm Cortex-A for edge AI? +## Why Arm Cortex-A76 makes Raspberry Pi 5 ideal for Edge AI The Raspberry Pi 5's Arm Cortex-A76 processor can manage high-performance computing tasks like AI inference. Key architectural features include: @@ -23,7 +24,7 @@ The Raspberry Pi 5's Arm Cortex-A76 processor can manage high-performance comput These characteristics make the Raspberry Pi 5 well suited for workloads like smart home assistants, where responsiveness, efficiency, and local processing are important. Running LLMs locally on Arm-based devices brings several practical benefits. Privacy is preserved, since conversations and routines never leave the device. With optimized inference, the system can offer responsiveness under 100 ms, even on resource-constrained hardware. It remains fully functional in offline scenarios, continuing to operate when internet access is unavailable. Developers also gain flexibility to customize models and automations. 
Additionally, software updates and an active ecosystem continue to improve performance over time. -## Arm ecosystem advantages +## Leverage the Arm ecosystem for Raspberry Pi edge AI For the stack in this setup, Raspberry Pi 5 benefits from the extensive developer ecosystem: @@ -45,7 +46,7 @@ The table below shows inference performance for several quantized models running | gemma2:2b | 4.1 | 23,758 | | deepseek-r1:7b | 1.6 | 64,797 | -### Insights +### LLM benchmark insights on Raspberry Pi 5 - Qwen 0.5B and TinyLlama 1.1B deliver fast token generation and low average latency, making them suitable for real-time interactions such as voice-controlled smart home commands - DeepSeek-Coder 1.3B and Gemma 2B trade some speed for improved language understanding, which can be useful for complex tasks or context-aware prompts @@ -69,4 +70,4 @@ Additionally, the platform must meet the following requirements: - Python 3.8 or newer - Ability to run [Ollama](https://ollama.com/) -Continue to the next section to start building a smart home system that highlights how Arm-based processors enable efficient, responsive, and private AI applications at the edge. +In the next section, you’ll set up the software dependencies needed to start building your privacy-first smart home system on Raspberry Pi 5. 
diff --git a/content/learning-paths/embedded-and-microcontrollers/raspberry-pi-smart-home/2-software-dependencies.md b/content/learning-paths/embedded-and-microcontrollers/raspberry-pi-smart-home/2-software-dependencies.md index a9799cfe48..19254a07af 100644 --- a/content/learning-paths/embedded-and-microcontrollers/raspberry-pi-smart-home/2-software-dependencies.md +++ b/content/learning-paths/embedded-and-microcontrollers/raspberry-pi-smart-home/2-software-dependencies.md @@ -1,11 +1,15 @@ --- -title: Set up software dependencies +title: Set up software dependencies on Raspberry Pi 5 for Ollama and LLMs weight: 3 ### FIXED, DO NOT MODIFY layout: learningpathall --- +## Overview + +In this section, you’ll prepare your Raspberry Pi 5 by installing Python, required libraries, and Ollama, so you can run large language models (LLMs) locally. + {{% notice Note %}} This guide assumes you have set up your Raspberry Pi with Raspberry Pi OS and network connectivity. For Raspberry Pi 5 setup help, see: [Raspberry Pi Getting Started](https://www.raspberrypi.com/documentation/) {{% /notice %}} @@ -26,7 +30,7 @@ Replace `` with your Pi's username (typically `pi`), and `` with yo ssh @ ``` -## Set up dependencies +## Install Python and system dependencies Create a directory called `smart-home` in your home directory and navigate into it: @@ -77,7 +81,7 @@ If installation was successful, the output should be similar to: ollama version is 0.11.4 ``` -## Download and test a language model +## Run a test LLM with Ollama on Raspberry Pi 5 Ollama supports various models. This guide uses `deepseek-r1:7b` as an example, but you can also use `tinyllama:1.1b`, `qwen:0.5b`, `gemma2:2b`, or `deepseek-coder:1.3b`. 
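The troubleshooting advice about falling back to smaller models when memory is tight can be captured in a small helper. The RAM floors below are rough assumptions for the quantized models used in this Learning Path, not measured requirements:

```python
# Rough, assumed RAM floors (GB) for the quantized models in this Learning Path.
MODEL_MIN_RAM_GB = {
    "qwen:0.5b": 1,
    "tinyllama:1.1b": 2,
    "deepseek-coder:1.3b": 2,
    "gemma2:2b": 4,
    "deepseek-r1:7b": 8,
}

def pick_model(available_gb: float) -> str:
    """Return the largest listed model whose assumed RAM floor fits."""
    fitting = [m for m, need in MODEL_MIN_RAM_GB.items() if need <= available_gb]
    return max(fitting, key=MODEL_MIN_RAM_GB.get) if fitting else "qwen:0.5b"
```

On the Pi you could feed this from `psutil.virtual_memory().available` (psutil is in the pip install list above); treat the result as a starting point, not a guarantee.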
diff --git a/content/learning-paths/embedded-and-microcontrollers/raspberry-pi-smart-home/3-test-gpio.md b/content/learning-paths/embedded-and-microcontrollers/raspberry-pi-smart-home/3-test-gpio.md index e6248ccac3..f4f136c963 100644 --- a/content/learning-paths/embedded-and-microcontrollers/raspberry-pi-smart-home/3-test-gpio.md +++ b/content/learning-paths/embedded-and-microcontrollers/raspberry-pi-smart-home/3-test-gpio.md @@ -1,5 +1,5 @@ --- -title: Test GPIO pins +title: Test Raspberry Pi 5 GPIO pins for smart home devices weight: 4 ### FIXED, DO NOT MODIFY @@ -8,7 +8,7 @@ layout: learningpathall The next step is to test the GPIO functionality. In this section, you configure an LED light to simulate a smart home device. -## Verify GPIO functionality +## Verify GPIO setup on Raspberry Pi 5 Gather your electronic components. Connect the anode (long leg) of an LED in series with a 220Ω resistor to GPIO 17 (physical pin 11). Connect the cathode (short leg) to a ground (GND) pin. See the image below for the full setup: diff --git a/content/learning-paths/embedded-and-microcontrollers/raspberry-pi-smart-home/4-smart-home-assistant.md b/content/learning-paths/embedded-and-microcontrollers/raspberry-pi-smart-home/4-smart-home-assistant.md index bcc5a40864..be9aecfdea 100644 --- a/content/learning-paths/embedded-and-microcontrollers/raspberry-pi-smart-home/4-smart-home-assistant.md +++ b/content/learning-paths/embedded-and-microcontrollers/raspberry-pi-smart-home/4-smart-home-assistant.md @@ -1,11 +1,11 @@ --- -title: Smart Home Assistant +title: Build and Run a Smart Home Assistant on Raspberry Pi 5 with LLMs weight: 5 ### FIXED, DO NOT MODIFY layout: learningpathall --- -## About the assistant +## Understand the Smart Home Assistant In this section, you will run the assistant through the `smart_home_assistant.py` script. It initializes all configured smart devices on specific GPIO pins and starts a local web server for interacting with the assistant. 
The script processes user commands using a local language model (via Ollama), parses the model’s JSON output, and executes actions such as toggling lights or locking doors. It supports both terminal and web-based control.
@@ -16,7 +16,7 @@
 git clone https://github.com/fidel-makatia/EdgeAI_Raspi5.git
 cd EdgeAI_Raspi5
 ```
 
-## Connect further hardware
+## Connect additional smart home hardware on Raspberry Pi GPIO pins
 
 In the previous section, you configured an LED on GPIO pin 17. By default, the smart home assistant associates this with a `living_room_light` device. The single LED setup is enough to run through this Learning Path. If you'd like to connect actual devices, or experiment with more mock sensors, the default configuration looks like the table below. You can repeat the steps on the previous page to verify the hardware setup on the different GPIO pins. See the image below for an example.
diff --git a/content/learning-paths/embedded-and-microcontrollers/raspberry-pi-smart-home/_index.md b/content/learning-paths/embedded-and-microcontrollers/raspberry-pi-smart-home/_index.md
index 4da40caa9a..f69a711668 100644
--- a/content/learning-paths/embedded-and-microcontrollers/raspberry-pi-smart-home/_index.md
+++ b/content/learning-paths/embedded-and-microcontrollers/raspberry-pi-smart-home/_index.md
@@ -3,7 +3,7 @@
 title: Build a Privacy-First LLM Smart Home on Raspberry Pi 5
 
 minutes_to_complete: 45
 
-who_is_this_for: This is an introductory topic for developers interested in building smart home systems using on-device LLMs and Arm-based edge platforms like the Raspberry Pi 5.
+who_is_this_for: This is an introductory topic for edge AI developers, Raspberry Pi hobbyists, and software engineers who want to build privacy-first smart home assistants. You’ll learn how to run large language models (LLMs) locally on the Raspberry Pi 5 using Ollama, control GPIO-connected devices, and deploy a web-based assistant without relying on cloud services.
learning_objectives: - Understand how the Arm architecture enables efficient, private, and responsive LLM inference @@ -14,8 +14,8 @@ learning_objectives: prerequisites: - An Arm-based single board computer (for example, Raspberry Pi 5 running Raspberry Pi OS) - - Basic electronic components - breadboard, LEDs, resistors, and jumper wires - - Basic understanding of Python, GPIO pins, and electronics + - Electronic components (breadboard, LEDs, resistors, jumper wires) for GPIO testing + - Familiarity with Python programming, Raspberry Pi GPIO pinout, and basic electronics author: Fidel Makatia Omusilibwa From 4ee431f3dd359e0757e74e490530b234f7790a3a Mon Sep 17 00:00:00 2001 From: Maddy Underwood <167196745+madeline-underwood@users.noreply.github.com> Date: Tue, 19 Aug 2025 21:51:31 +0000 Subject: [PATCH 3/3] Optimized alt text for accessibility & SEO --- .../raspberry-pi-smart-home/1-overview.md | 6 +++--- .../2-software-dependencies.md | 4 ++-- .../raspberry-pi-smart-home/3-test-gpio.md | 8 ++++++-- .../4-smart-home-assistant.md | 15 ++++++++------- 4 files changed, 19 insertions(+), 14 deletions(-) diff --git a/content/learning-paths/embedded-and-microcontrollers/raspberry-pi-smart-home/1-overview.md b/content/learning-paths/embedded-and-microcontrollers/raspberry-pi-smart-home/1-overview.md index 0d105ae321..66c0786a8f 100644 --- a/content/learning-paths/embedded-and-microcontrollers/raspberry-pi-smart-home/1-overview.md +++ b/content/learning-paths/embedded-and-microcontrollers/raspberry-pi-smart-home/1-overview.md @@ -24,7 +24,7 @@ The Raspberry Pi 5's Arm Cortex-A76 processor can manage high-performance comput These characteristics make the Raspberry Pi 5 well suited for workloads like smart home assistants, where responsiveness, efficiency, and local processing are important. Running LLMs locally on Arm-based devices brings several practical benefits. Privacy is preserved, since conversations and routines never leave the device. 
With optimized inference, the system can offer responsiveness under 100 ms, even on resource-constrained hardware. It remains fully functional in offline scenarios, continuing to operate when internet access is unavailable. Developers also gain flexibility to customize models and automations. Additionally, software updates and an active ecosystem continue to improve performance over time. -## Leverage the Arm ecosystem for Raspberry Pi edge AI +## Leverage the Arm ecosystem for Raspberry Pi Edge AI For the stack in this setup, Raspberry Pi 5 benefits from the extensive developer ecosystem: @@ -46,7 +46,7 @@ The table below shows inference performance for several quantized models running | gemma2:2b | 4.1 | 23,758 | | deepseek-r1:7b | 1.6 | 64,797 | -### LLM benchmark insights on Raspberry Pi 5 +## LLM benchmark insights on Raspberry Pi 5 - Qwen 0.5B and TinyLlama 1.1B deliver fast token generation and low average latency, making them suitable for real-time interactions such as voice-controlled smart home commands - DeepSeek-Coder 1.3B and Gemma 2B trade some speed for improved language understanding, which can be useful for complex tasks or context-aware prompts @@ -56,7 +56,7 @@ The table below shows inference performance for several quantized models running This Learning Path focuses on the Raspberry Pi 5, but you can adapt the concepts and code to other Arm-powered devices. 
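The throughput figures in the benchmark table boil down to tokens generated divided by generation time. Ollama's non-streaming responses report `eval_count` (tokens generated) and `eval_duration` (in nanoseconds), so you can compute tokens per second for your own device; the helper below is a small sketch assuming those two response fields.

```python
def tokens_per_second(eval_count: int, eval_duration_ns: int) -> float:
    """Throughput = tokens generated / seconds spent generating."""
    return eval_count / (eval_duration_ns / 1e9)

# Example: 80 tokens generated in 5 seconds of eval time
print(tokens_per_second(80, 5_000_000_000))  # 16.0
```

Running this against several models is a quick way to reproduce a table like the one above for your own hardware.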
-### Recommended platforms +## Recommended platforms | Platform | CPU | RAM | GPIO support | Model size suitability | | ------------------- | -------------------------------- | -------------- | ------------------------------ | --------------------------- | diff --git a/content/learning-paths/embedded-and-microcontrollers/raspberry-pi-smart-home/2-software-dependencies.md b/content/learning-paths/embedded-and-microcontrollers/raspberry-pi-smart-home/2-software-dependencies.md index 19254a07af..a054f233f3 100644 --- a/content/learning-paths/embedded-and-microcontrollers/raspberry-pi-smart-home/2-software-dependencies.md +++ b/content/learning-paths/embedded-and-microcontrollers/raspberry-pi-smart-home/2-software-dependencies.md @@ -11,7 +11,7 @@ layout: learningpathall In this section, you’ll prepare your Raspberry Pi 5 by installing Python, required libraries, and Ollama, so you can run large language models (LLMs) locally. {{% notice Note %}} -This guide assumes you have set up your Raspberry Pi with Raspberry Pi OS and network connectivity. For Raspberry Pi 5 setup help, see: [Raspberry Pi Getting Started](https://www.raspberrypi.com/documentation/) +This Learning Path assumes you have set up your Raspberry Pi with Raspberry Pi OS and network connectivity. For Raspberry Pi 5 setup support, see [Raspberry Pi Getting Started](https://www.raspberrypi.com/documentation/). 
{{% /notice %}} ## Connect to your Raspberry Pi 5 @@ -46,7 +46,7 @@ sudo apt update && sudo apt upgrade -y sudo apt install -y python3 python3-pip python3-venv git curl build-essential gcc python3-lgpio ``` -### Configure a virtual environment +## Configure a virtual environment Create and activate a Python virtual environment to isolate project dependencies: diff --git a/content/learning-paths/embedded-and-microcontrollers/raspberry-pi-smart-home/3-test-gpio.md b/content/learning-paths/embedded-and-microcontrollers/raspberry-pi-smart-home/3-test-gpio.md index f4f136c963..e94b73229a 100644 --- a/content/learning-paths/embedded-and-microcontrollers/raspberry-pi-smart-home/3-test-gpio.md +++ b/content/learning-paths/embedded-and-microcontrollers/raspberry-pi-smart-home/3-test-gpio.md @@ -6,13 +6,17 @@ weight: 4 layout: learningpathall --- +## Overview + The next step is to test the GPIO functionality. In this section, you configure an LED light to simulate a smart home device. ## Verify GPIO setup on Raspberry Pi 5 -Gather your electronic components. Connect the anode (long leg) of an LED in series with a 220Ω resistor to GPIO 17 (physical pin 11). Connect the cathode (short leg) to a ground (GND) pin. See the image below for the full setup: +Gather your electronic components. Connect the anode (long leg) of an LED in series with a 220Ω resistor to GPIO 17 (physical pin 11). Connect the cathode (short leg) to a ground (GND) pin. 
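As a quick sanity check on the 220Ω value: with the Pi's 3.3V GPIO high level and a typical green-LED forward drop of roughly 2.0V (an assumed figure; check your LED's datasheet), Ohm's law gives a current of about 6mA, comfortably low for a GPIO pin.

```python
def led_current_ma(v_supply: float = 3.3, v_forward: float = 2.0, r_ohms: float = 220.0) -> float:
    """Ohm's law for the LED + series-resistor loop, in milliamps.

    v_forward is an assumed typical green-LED drop, not a measured value.
    """
    return (v_supply - v_forward) / r_ohms * 1000.0

print(round(led_current_ma(), 1))  # 5.9 (mA)
```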
+
+See the image below for the full setup:
 
-![Raspberry Pi connected to a breadboard with a green LED and jumper wires](pin_layout.jpg "Raspberry Pi connected to a breadboard with a green LED and jumper wires")
+![Raspberry Pi connected to a breadboard with a green LED and jumper wires alt-text#center](pin_layout.jpg "Raspberry Pi connected to a breadboard with a green LED and jumper wires")
 
 Create a Python script named `testgpio.py`:
diff --git a/content/learning-paths/embedded-and-microcontrollers/raspberry-pi-smart-home/4-smart-home-assistant.md b/content/learning-paths/embedded-and-microcontrollers/raspberry-pi-smart-home/4-smart-home-assistant.md
index be9aecfdea..6b5b888bcc 100644
--- a/content/learning-paths/embedded-and-microcontrollers/raspberry-pi-smart-home/4-smart-home-assistant.md
+++ b/content/learning-paths/embedded-and-microcontrollers/raspberry-pi-smart-home/4-smart-home-assistant.md
@@ -35,8 +35,9 @@ In the previous section, you configured an LED on GPIO pin 17. The smart home ass
 The code uses gpiozero with lgpio backend for Raspberry Pi 5 compatibility. You can use compatible output devices such as LEDs, relays, or small loads connected to these GPIO pins to represent actual smart home devices. All pin assignments are optimized for the Raspberry Pi 5's GPIO layout.
 {{% /notice %}}
 
-![Raspberry Pi connected to breadboard with LEDs, buttons, and a sensor module](hardware.jpeg "Setup that includes a blue LED (mapped to Living Room Light on GPIO 17), a red LED, push button, and a sensor module. This setup illustrates a simulated smart home with controllable devices.")
+![Raspberry Pi 5 connected to a breadboard with LEDs, push button, and sensor module alt-text#center](hardware.jpeg "Setup that includes a blue LED (mapped to Living Room Light on GPIO 17), a red LED, push button, and a sensor module.")
+This setup illustrates a simulated smart home with controllable devices.
## Run the Smart Home Assistant @@ -57,7 +58,7 @@ python3 smart_home_assistant.py --no-api {{< /tab >}} {{< /tabpane >}} -### Command options +## Command options | Option | Description | Example | |------------------|---------------------------------------------------------------------------------------------------|--------------------------------------------| @@ -67,20 +68,20 @@ python3 smart_home_assistant.py --no-api If everything is set up correctly, you should see the following output on running the default command: -![Running in Default Mode](cmd.png "Running the code in default mode") +![Terminal running smart_home_assistant.py showing default web API and CLI output alt-text#center](cmd.png "Running the code in default mode") ## Interact with your assistant Try asking the assistant to `turn on living room light`. If you've connected additional devices, come up with prompts to test the setup. -### Web interface +## Web interface Open your browser and navigate to `http://0.0.0.0:8000`, or as printed in the terminal output. -![Web Interface Interaction](UI3.png "Interacting with the LLM through the web interface") +![Web interface of the smart home assistant showing device control through LLM commands alt-text#center](UI3.png "Interacting with the LLM through the web interface") -### Command line interface +## Command line interface Type commands directly in the terminal. @@ -92,7 +93,7 @@ I want to watch my favorite show its getting late, secure the house ``` -![DeepSeek-Coder Interaction](gemma2.png "Interacting with deepseek-coder:1.3b") +![Terminal interaction with the smart home assistant showing LLM responses to user commands alt-text#center](gemma2.png "Interacting with deepseek-coder:1.3b") {{% notice Troubleshooting %}} If you're running into issues with the assistant, here are some things to check:
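The `--no-api` flag from the options table above is standard argparse wiring. As an illustration only (this is not the repository's actual code, and any flag beyond `--no-api` would be an assumption), a minimal parser might look like:

```python
import argparse

def make_parser() -> argparse.ArgumentParser:
    """Illustrative CLI mirroring the documented --no-api option."""
    parser = argparse.ArgumentParser(description="Local smart home assistant")
    parser.add_argument(
        "--no-api",
        action="store_true",
        help="skip the web server and run in terminal-only mode",
    )
    return parser

# args = make_parser().parse_args(["--no-api"])
# args.no_api is then True, so the script would skip starting the web server.
```

Note that argparse exposes `--no-api` as the attribute `args.no_api` (dashes become underscores).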