diff --git a/content/learning-paths/embedded-and-microcontrollers/raspberry-pi-smart-home/1-overview.md b/content/learning-paths/embedded-and-microcontrollers/raspberry-pi-smart-home/1-overview.md new file mode 100644 index 0000000000..3fa8ad30dd --- /dev/null +++ b/content/learning-paths/embedded-and-microcontrollers/raspberry-pi-smart-home/1-overview.md @@ -0,0 +1,73 @@ +--- +title: Overview +weight: 2 + +### FIXED, DO NOT MODIFY +layout: learningpathall +--- + +## Overview + +This Learning Path walks you through deploying an efficient large language model (LLM) locally on the Raspberry Pi 5, powered by an Arm Cortex-A76 CPU. This will allow you to control your smart home using natural language, without relying on cloud services. With rapid advances in Generative AI and the power of Arm Cortex-A processors, you can now run advanced language models directly in your home on the Raspberry Pi 5. + +You will create a fully local, privacy-first smart home system that leverages the strengths of Arm Cortex-A architecture. The system can achieve 15+ tokens per second inference speeds using optimized models like TinyLlama and Qwen, while maintaining the energy efficiency that makes Arm processors a good fit for always-on applications. + +## Why Arm Cortex-A for Edge AI? + +The Raspberry Pi 5's Arm Cortex-A76 processor can manage high-performance computing tasks like AI inference. Key architectural features include: + +- The **superscalar architecture** allows the processor to execute multiple instructions in parallel, improving throughput for compute-heavy tasks. +- **128-bit NEON SIMD support** accelerates matrix and vector operations, which are common in the inner loops of language model inference. +- The **multi-level cache hierarchy** helps reduce memory latency and improves data access efficiency during runtime. +- The **thermal efficiency** enables sustained performance without active cooling, making it ideal for compact or always-on smart home setups. 
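
Before moving on, you can quickly confirm these capabilities on your own board. The short Python check below is a minimal sketch: it reports the CPU architecture and looks for the `asimd` feature (the AArch64 name for NEON) in `/proc/cpuinfo`:

```python
#!/usr/bin/env python3
# Sanity check: report the CPU architecture and look for NEON support.
# On AArch64 (64-bit Arm), NEON is listed as the "asimd" feature flag.
import platform

print("Architecture:", platform.machine())  # "aarch64" on a 64-bit Arm OS

try:
    with open("/proc/cpuinfo") as f:
        features = f.read().lower()
    print("NEON/ASIMD available:", "asimd" in features or "neon" in features)
except OSError:
    print("Could not read /proc/cpuinfo (are you on Linux?)")
```

On a Raspberry Pi 5 running the 64-bit Raspberry Pi OS, this should report `aarch64` and `True`.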
+ +These characteristics make the Raspberry Pi 5 well-suited for workloads like smart home assistants, where responsiveness, efficiency, and local processing are important. Running LLMs locally on Arm-based devices brings several practical benefits. Privacy is preserved, since conversations and routines never leave the device. With optimized inference, the system can offer responsiveness under 100 ms, even on resource-constrained hardware. It remains fully functional in offline scenarios, continuing to operate when internet access is unavailable. Developers also gain flexibility to customize models and automations. Additionally, software updates and an active ecosystem continue to improve performance over time. + +## Arm Ecosystem Advantages + +For the stack in this setup, Raspberry Pi 5 benefits from the extensive developer ecosystem: + +- Optimized compilers including GCC and Clang with Arm-specific enhancements +- Native libraries such as gpiozero and lgpio are optimized for Raspberry Pi +- Community support from open-source projects where developers are contributing Arm-optimized code +- Arm maintains a strong focus on backward compatibility, which reduces friction when updating kernels or deploying across multiple Arm platforms +- The same architecture powers smartphones, embedded controllers, edge devices, and cloud infrastructure—enabling consistent development practices across domains + +## Performance Benchmarks on Raspberry Pi 5 + +The table below shows inference performance for several quantized models running on a Raspberry Pi 5. Measurements reflect single-threaded CPU inference with typical prompt lengths and temperature settings suitable for command-based interaction. 
+
+| Model | Tokens/Sec | Avg Latency (ms) |
+| ------------------- | ---------- | ---------------- |
+| qwen:0.5b | 17.0 | 8,217 |
+| tinyllama:1.1b | 12.3 | 9,429 |
+| deepseek-coder:1.3b | 7.3 | 22,503 |
+| gemma2:2b | 4.1 | 23,758 |
+| deepseek-r1:7b | 1.6 | 64,797 |
+
+
+What does this table tell us? Here are some performance insights:
+
+- Qwen 0.5B and TinyLlama 1.1B deliver fast token generation and low average latency, making them suitable for real-time interactions like voice-controlled smart home commands.
+- DeepSeek-Coder 1.3B and Gemma 2B trade off some speed for improved language understanding, which can be useful for more complex task execution or context-aware prompts.
+- DeepSeek-R1 7B offers advanced reasoning capabilities at significantly higher latency, which may still be viable for offline summarization, planning, or other low-frequency tasks.
+
+## Supported Arm-Powered Devices
+
+This Learning Path focuses on the Raspberry Pi 5, but you can adapt the concepts and code to other Arm-powered devices:
+
+### Recommended Platforms
+
+| Platform | CPU | RAM | GPIO Support | Model Size Suitability |
+|------------------|----------------------------------|----------------|-------------------------------|-----------------------------|
+| **Raspberry Pi 5** | Arm Cortex-A76 quad-core @ 2.4GHz | Up to 16GB | Native `lgpio` (high-performance) | Large models (8–16GB) |
+| **Raspberry Pi 4** | Arm Cortex-A72 quad-core @ 1.8GHz | Up to 8GB | Compatible with `gpiozero` | Small to mid-size models |
+| **Other Arm Devices** | Arm Cortex-A | 4GB min (8GB+ recommended) | Requires physical GPIO pins | Varies by RAM |
+
+Additionally, the platform must:
+
+- Have GPIO pins available for hardware control
+- Support Python 3.8 or newer
+- Be able to run [Ollama](https://ollama.com/)
+
+Continue to the next section to start building a smart home system that highlights how Arm-based processors can enable efficient, responsive, and private AI applications at the edge. 
\ No newline at end of file
diff --git a/content/learning-paths/embedded-and-microcontrollers/raspberry-pi-smart-home/2-software-dependencies.md b/content/learning-paths/embedded-and-microcontrollers/raspberry-pi-smart-home/2-software-dependencies.md
new file mode 100644
index 0000000000..ab97b85829
--- /dev/null
+++ b/content/learning-paths/embedded-and-microcontrollers/raspberry-pi-smart-home/2-software-dependencies.md
@@ -0,0 +1,96 @@
+---
+title: Set up software dependencies
+weight: 3
+
+### FIXED, DO NOT MODIFY
+layout: learningpathall
+---
+
+{{% notice Note %}}
+This guide assumes you have set up your Raspberry Pi with Raspberry Pi OS and network connectivity. For Raspberry Pi 5 setup help, see: [Raspberry Pi Getting Started](https://www.raspberrypi.com/documentation/)
+{{% /notice %}}
+
+## Connect to Your Raspberry Pi 5
+
+### Option 1: Using a display
+
+The easiest way to work on your Raspberry Pi is to connect it to an external display through one of the micro HDMI ports. This setup also requires a keyboard and mouse to navigate.
+
+### Option 2: Using SSH
+
+You can also use SSH to access the terminal. To use this approach, you need to know the IP address of your device. Ensure your Raspberry Pi 5 is connected to the same network as your host computer, then access it remotely via SSH using the terminal or any SSH client.
+
+Replace `<username>` with your Pi's username (typically `pi`), and `<ip-address>` with your Raspberry Pi 5's IP address. 
+
+```bash
+ssh <username>@<ip-address>
+```
+
+## Set up the dependencies
+
+Create a directory called `smart-home` in your home directory and navigate into it:
+
+```bash
+mkdir $HOME/smart-home
+cd $HOME/smart-home
+```
+
+The Raspberry Pi 5 includes Python 3 pre-installed, but you need additional packages:
+
+```bash
+sudo apt update && sudo apt upgrade
+sudo apt install python3 python3-pip python3-venv git curl build-essential gcc python3-lgpio
+```
+
+### Configure the virtual environment
+
+The next step is to create and activate a Python virtual environment. This approach keeps project dependencies isolated and prevents conflicts with system-wide packages:
+
+```bash
+python3 -m venv venv
+source venv/bin/activate
+```
+
+Install all required libraries and dependencies:
+
+```bash
+pip install ollama gpiozero lgpio psutil httpx orjson numpy fastapi uvicorn uvloop
+```
+
+### Install Ollama
+
+Install Ollama using the official installation script for Linux:
+
+```bash
+curl -fsSL https://ollama.com/install.sh | sh
+```
+
+Verify the installation:
+
+```bash
+ollama --version
+```
+If the installation was successful, the command prints the installed version, similar to the output below (the exact version number will vary).
+```output
+ollama version is 0.11.4
+```
+
+## Download and Test a Language Model
+
+Ollama supports various models. This guide uses `deepseek-r1:7b` as an example, but you can also use `tinyllama:1.1b`, `qwen:0.5b`, `gemma2:2b`, or `deepseek-coder:1.3b`.
+
+The `run` command will set up the model automatically. You will see download progress in the terminal, followed by the interactive prompt when ready.
+
+```bash
+ollama run deepseek-r1:7b
+```
+
+{{% notice Troubleshooting %}}
+If you run into issues with the model download, here are some things to check:
+
+- Confirm internet access and sufficient storage space on your microSD card
+- Try downloading smaller models like `qwen:0.5b` or `tinyllama:1.1b` if you encounter memory issues. 
16 GB of RAM is sufficient for running smaller to medium-sized language models. Very large models may require more memory or run slower.
+- Clear storage or connect to a more stable network if errors occur
+{{% /notice %}}
+
+With the model set up through `ollama`, move on to the next section to start configuring the hardware.
\ No newline at end of file
diff --git a/content/learning-paths/embedded-and-microcontrollers/raspberry-pi-smart-home/3-test-gpio.md b/content/learning-paths/embedded-and-microcontrollers/raspberry-pi-smart-home/3-test-gpio.md
new file mode 100644
index 0000000000..4d7efe7a2e
--- /dev/null
+++ b/content/learning-paths/embedded-and-microcontrollers/raspberry-pi-smart-home/3-test-gpio.md
@@ -0,0 +1,70 @@
+---
+title: Test GPIO pins
+weight: 4
+
+### FIXED, DO NOT MODIFY
+layout: learningpathall
+---
+
+The next step is to test the GPIO functionality. In this section, you will configure an LED to simulate a smart-home device.
+
+## Verify GPIO Functionality
+
+Gather your electronic components. Connect the anode (long leg) of an LED in series with a 220Ω resistor to GPIO 17 (physical pin 11). Connect the cathode (short leg) to a ground (GND) pin. 
See image below for the full setup: + +![Raspberry Pi connected to a breadboard with a green LED and jumper wires](pin_layout.jpg "Raspberry Pi connected to a breadboard with a green LED and jumper wires") + +Create a Python script named `testgpio.py`: + +```bash +cd $HOME/smart-home +vim testgpio.py +``` + +Copy this code into the file: + +```python +#!/usr/bin/env python3 +import time +from gpiozero import Device, LED +from gpiozero.pins.lgpio import LGPIOFactory + +# Set lgpio backend for Raspberry Pi 5 +Device.pin_factory = LGPIOFactory() + +# Setup GPIO pin 17 +pin1 = LED(17) + +try: + while True: + pin1.toggle() # Switch pin 17 state + time.sleep(2) # Wait 2 seconds +except KeyboardInterrupt: # Ctrl+C pressed + pin1.close() # Clean up pin 17 +``` + +Run the script: + +```bash +python testgpio.py +``` + +The LED should blink every two seconds. If you observe this behavior, your GPIO setup works correctly. + +{{% notice Troubleshooting %}} +If you run into issues with the hardware setup, here are some things to check: +- Try fixing missing dependencies by running the following command: +```bash +sudo apt-get install -f +``` +- If you're running into GPIO permission issues, run Python scripts with `sudo` or add your user to the `gpio` group. Don't forget to log out for the changes to take effect. +```bash +sudo usermod -a -G gpio $USER +``` +- Double-check wiring and pin numbers using the Raspberry Pi 5 pinout diagram +- Ensure proper LED and resistor connections +- Verify GPIO enablement in `raspi-config` if needed +- Use a high-quality power supply +{{% /notice %}} + +With a way to control devices using GPIO pins, you can move on to the next section to interact with them using language models and the user interface. 
\ No newline at end of file
diff --git a/content/learning-paths/embedded-and-microcontrollers/raspberry-pi-smart-home/4-smart-home-assistant.md b/content/learning-paths/embedded-and-microcontrollers/raspberry-pi-smart-home/4-smart-home-assistant.md
new file mode 100644
index 0000000000..89fac4bc34
--- /dev/null
+++ b/content/learning-paths/embedded-and-microcontrollers/raspberry-pi-smart-home/4-smart-home-assistant.md
@@ -0,0 +1,113 @@
+---
+title: Smart Home Assistant
+weight: 5
+
+### FIXED, DO NOT MODIFY
+layout: learningpathall
+---
+## About the assistant
+
+In this section, you will run the assistant through the `smart_home_assistant.py` script. It initializes all configured smart devices on specific GPIO pins and starts a local web server for interacting with the assistant. The script processes user commands using a local language model (via Ollama), parses the model’s JSON output, and executes actions such as toggling lights or locking doors. It supports both terminal and web-based control.
+
+The assistant is available on GitHub. Clone the code and navigate to the project directory:
+
+```bash
+git clone https://github.com/fidel-makatia/EdgeAI_Raspi5.git
+cd EdgeAI_Raspi5
+```
+
+## Connect further hardware
+
+In the previous section, you configured an LED on GPIO pin 17. By default, the smart home assistant associates this with the `living_room_light` device. The single-LED setup is enough to run through this Learning Path. If you'd like to connect actual devices, or play with more mock sensors, the default configuration looks like the table below. You can repeat the steps on the previous page to verify the hardware setup on the different GPIO pins. See the image below for an example. 
+ +| Device Name | GPIO Pin | Type | Room | +| ----------------- | -------- | --------- | ----------- | +| living_room_light | 17 | LIGHT | living_room | +| living_room_fan | 27 | FAN | living_room | +| smart_tv | 22 | SMART_TV | living_room | +| bedroom_light | 23 | LIGHT | bedroom | +| bedroom_ac | 24 | AC | bedroom | +| kitchen_light | 5 | LIGHT | kitchen | +| front_door_lock | 26 | DOOR_LOCK | entrance | +| garden_light | 16 | LIGHT | outdoor | + +{{% notice Note %}} +The code uses gpiozero with lgpio backend for Raspberry Pi 5 compatibility. You can use compatible output devices such as LEDs, relays, or small loads connected to these GPIO pins to represent actual smart home devices. All pin assignments are optimized for the Raspberry Pi 5's GPIO layout. +{{% /notice %}} + +![Raspberry Pi connected to breadboard with LEDs, buttons, and a sensor module](hardware.jpeg "Setup that includes a blue LED (mapped to Living Room Light on GPIO 17), a red LED, push button, and a sensor module. This setup illustrates a simulated smart home with controllable devices.") + + +## Run the Smart Home Assistant + +Run the assistant in different modes depending on your use case. 
The default model is `deepseek-coder:1.3b`:
+
+{{< tabpane code=true >}}
+{{< tab header="Default (Web API + CLI)" language="bash">}}
+python3 smart_home_assistant.py
+{{< /tab >}}
+{{< tab header="Specify model" language="bash">}}
+python3 smart_home_assistant.py --model qwen:0.5b
+{{< /tab >}}
+{{< tab header="Custom web port" language="bash">}}
+python3 smart_home_assistant.py --port 8080
+{{< /tab >}}
+{{< tab header="CLI only" language="bash">}}
+python3 smart_home_assistant.py --no-api
+{{< /tab >}}
+{{< /tabpane >}}
+
+### Command Options
+
+| Option | Description | Example |
+|------------------|---------------------------------------------------------------------------------------------------|--------------------------------------------|
+| `--model` | Specify the model to use with Ollama | `--model tinyllama:1.1b` |
+| `--port` | Run the web server on a custom port (default: `8000`) | `--port 8080` |
+| `--no-api` | Disable the web API and run in CLI-only mode | `--no-api` |
+
+If everything is set up correctly, you should see the following output when running the default command:
+
+![Running in Default Mode](cmd.png "Running the code in default mode")
+
+## Interact With Your Assistant
+
+Try asking the assistant to `turn on living room light`. If you've connected additional devices, come up with prompts to test the setup.
+
+### Web interface
+
+Open your browser and navigate to `http://localhost:8000` on the Pi itself, or to `http://<pi-ip-address>:8000` from another machine on the same network (the exact address is also printed in the terminal output).
+
+![Web Interface Interaction](UI3.png "Interacting with the LLM through the web interface")
+
+
+### Command line interface
+
+Type commands directly in the terminal. 
+
+Sample commands:
+
+```bash
+turn on living room light
+I want to watch my favorite show
+it's getting late, secure the house
+```
+
+![DeepSeek-Coder Interaction](gemma2.png "Interacting with deepseek-coder:1.3b")
+
+{{% notice Troubleshooting %}}
+If you're running into issues with the assistant, here are some things to check:
+- Make sure your virtual environment is activated and that you installed all the packages from previous sections
+- For model loading problems, check if Ollama is running and list available models:
+  ```bash
+  ollama list
+  ollama serve
+  ```
+- If port 8000 is unavailable, run the assistant with a different port using the `--port` flag.
+{{% /notice %}}
+
+## Wrapping up
+
+From here, you can modify `smart_home_assistant.py` and extend the system by adding more devices, experimenting with conversational commands, or integrating sensors and automation logic into your smart home setup.
+
+You should now know more about setting up a Raspberry Pi 5 to control real-world devices using GPIO pins, and running a smart home assistant powered by local language models through Ollama. You’ve learned how to wire basic circuits with LEDs and resistors to simulate smart devices, and how to launch and interact with the assistant through both the command-line interface and a web dashboard. Along the way, you also explored common troubleshooting steps for GPIO access, missing dependencies, and model loading issues.
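
As a starting point for extending the system, the sketch below shows the core pattern described above: parse the model's JSON reply and map the named device onto a GPIO pin. The device table and JSON shape here are illustrative assumptions, not the script's exact schema (check `smart_home_assistant.py` for that), and a real extension would drive `gpiozero` devices instead of printing:

```python
#!/usr/bin/env python3
# Illustrative sketch only: the device names and the JSON shape are
# assumptions for this example, not the assistant's exact schema.
import json

DEVICE_PINS = {"living_room_light": 17, "kitchen_light": 5}  # subset of the table above

def execute(reply: str) -> str:
    """Parse a model reply like '{"device": ..., "action": ...}' and act on it."""
    try:
        cmd = json.loads(reply)
        pin = DEVICE_PINS[cmd["device"]]
    except (json.JSONDecodeError, KeyError) as exc:
        return f"could not execute: {exc}"
    # With gpiozero, this is where you would toggle an LED or relay on `pin`.
    return f"GPIO {pin} -> {cmd['action']}"

print(execute('{"device": "living_room_light", "action": "on"}'))  # GPIO 17 -> on
```

Keeping the dispatcher small and validating the JSON before touching hardware is what stops a malformed model reply from toggling the wrong pin.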
+ diff --git a/content/learning-paths/embedded-and-microcontrollers/raspberry-pi-smart-home/_index.md b/content/learning-paths/embedded-and-microcontrollers/raspberry-pi-smart-home/_index.md index 0c725a2d0a..2634080a6e 100644 --- a/content/learning-paths/embedded-and-microcontrollers/raspberry-pi-smart-home/_index.md +++ b/content/learning-paths/embedded-and-microcontrollers/raspberry-pi-smart-home/_index.md @@ -3,20 +3,18 @@ title: Build a Privacy-First LLM Smart Home on Raspberry Pi 5 minutes_to_complete: 45 -who_is_this_for: Anyone who wants a private, cloud-free smart home powered by GenAI on Arm +who_is_this_for: This is an introductory topic for developers interested in building smart home systems using on-device LLMs and Arm-based edge platforms like the Raspberry Pi 5. learning_objectives: - - "Deploy a local Large Language Model (LLM) for smart home control" - - "Integrate natural language processing with GPIO control" - - "Build and run everything on Arm-based single-board computers (no cloud required)" - - "Optimize for speed, privacy, and offline operation" - - "Create an interactive web dashboard for smart home control" + - "Understand how the Arm architecture enables efficient, private, and responsive LLM inference" + - "Run a smart home assistant on Raspberry Pi 5 with local LLM integration" + - "Wire and control physical devices (e.g., LEDs) using Raspberry Pi GPIO pins" + - "Deploy and interact with a local language model using Ollama" + - "Launch and access a web-based dashboard for device control" prerequisites: - - "Basic Python knowledge" - - "A text editor (e.g., VS Code, Sublime, Notepad++)" - - "An Arm-based single board computer (e.g., Raspberry Pi 5 with at least 8GB RAM)" - - "Basic electronic components such as LEDs, sensors, and relays" - - "Basic understanding of GPIO pins and electronics" + - "An Arm-based single board computer (e.g., Raspberry Pi 5 running Raspberry Pi OS)" + - "Basic electronic components: breadboard, LEDs, 
resistors and jumper wires" + - "Basic understanding of Python, GPIO pins and electronics" author: "Fidel Makatia Omusilibwa" @@ -24,17 +22,16 @@ author: "Fidel Makatia Omusilibwa" skilllevels: "Introductory" subjects: "ML" armips: - - "Arm Cortex A" + - "Cortex-A" tools_software_languages: - "Python" - "Ollama" - "gpiozero" - "lgpio" - "FastAPI" - - "VS Code or your preferred code editor" - - "Raspberry Pi OS (64-bit)" + - "Raspberry Pi" operatingsystems: - - "Windows , Linux, MacOS" + - "Linux" further_reading: - resource: diff --git a/content/learning-paths/embedded-and-microcontrollers/raspberry-pi-smart-home/benchmarks.png b/content/learning-paths/embedded-and-microcontrollers/raspberry-pi-smart-home/benchmarks.png deleted file mode 100644 index 8d185db4f8..0000000000 Binary files a/content/learning-paths/embedded-and-microcontrollers/raspberry-pi-smart-home/benchmarks.png and /dev/null differ diff --git a/content/learning-paths/embedded-and-microcontrollers/raspberry-pi-smart-home/cmd_stats.png b/content/learning-paths/embedded-and-microcontrollers/raspberry-pi-smart-home/cmd_stats.png deleted file mode 100644 index e3f076dab7..0000000000 Binary files a/content/learning-paths/embedded-and-microcontrollers/raspberry-pi-smart-home/cmd_stats.png and /dev/null differ diff --git a/content/learning-paths/embedded-and-microcontrollers/raspberry-pi-smart-home/deepseek.png b/content/learning-paths/embedded-and-microcontrollers/raspberry-pi-smart-home/deepseek.png deleted file mode 100644 index 1709229026..0000000000 Binary files a/content/learning-paths/embedded-and-microcontrollers/raspberry-pi-smart-home/deepseek.png and /dev/null differ diff --git a/content/learning-paths/embedded-and-microcontrollers/raspberry-pi-smart-home/example-picture.png b/content/learning-paths/embedded-and-microcontrollers/raspberry-pi-smart-home/example-picture.png deleted file mode 100644 index c69844bed4..0000000000 Binary files 
a/content/learning-paths/embedded-and-microcontrollers/raspberry-pi-smart-home/example-picture.png and /dev/null differ diff --git a/content/learning-paths/embedded-and-microcontrollers/raspberry-pi-smart-home/hardware2.png b/content/learning-paths/embedded-and-microcontrollers/raspberry-pi-smart-home/hardware2.png deleted file mode 100644 index 7f05d20089..0000000000 Binary files a/content/learning-paths/embedded-and-microcontrollers/raspberry-pi-smart-home/hardware2.png and /dev/null differ diff --git a/content/learning-paths/embedded-and-microcontrollers/raspberry-pi-smart-home/how-to-1.md b/content/learning-paths/embedded-and-microcontrollers/raspberry-pi-smart-home/how-to-1.md deleted file mode 100644 index 067a273bd1..0000000000 --- a/content/learning-paths/embedded-and-microcontrollers/raspberry-pi-smart-home/how-to-1.md +++ /dev/null @@ -1,99 +0,0 @@ ---- -title: Overview -weight: 2 - -### FIXED, DO NOT MODIFY -layout: learningpathall ---- - -## Overview - -Control your smart home using natural language with no cloud connection, no third-party servers, and no compromises on privacy. With rapid advances in Generative AI and the power of Arm Cortex-A processors, you can now run large language models (LLMs) directly in your home on the Raspberry Pi 5. - -You will create a fully local, privacy-first smart home system that leverages the strengths of Arm Cortex-A architecture. The system achieves 15+ tokens per second inference speeds using optimized models like TinyLlama and Qwen, while maintaining the energy efficiency that makes Arm processors ideal for always-on applications. - -## Why Arm Cortex-A for Edge AI? 
- -The Raspberry Pi 5's Arm Cortex-A76 processor excels at high-performance computing tasks like AI inference through: - -- Superscalar architecture that executes multiple instructions simultaneously -- Advanced SIMD with 128-bit NEON units for matrix operations -- Multi-level cache hierarchy that reduces memory latency -- Thermal efficiency that maintains performance in compact form factors - -Your Arm-powered smart home processes everything locally, providing: - -- **Total Privacy**: Conversations and routines never leave your device -- **Lightning Speed**: Sub-100ms response times with optimized processing -- **Rock-Solid Reliability**: Operation continues when internet connectivity fails -- **Unlimited Customization**: Complete control over AI models and automations -- **Future-Proof Performance**: Continued optimization through Arm's roadmap - -## Performance Benchmarks on Raspberry Pi 5 - -| Model | Tokens/Sec | Avg Latency (ms) | Performance Rating | -| ------------------- | ---------- | ---------------- | -------------------- | -| qwen:0.5b | 17.0 | 8,217 | ⭐⭐⭐⭐⭐ Excellent | -| tinyllama:1.1b | 12.3 | 9,429 | ⭐⭐⭐⭐⭐ Excellent | -| deepseek-coder:1.3b | 7.3 | 22,503 | ⭐⭐⭐⭐ Very Good | -| gemma2:2b | 4.1 | 23,758 | ⭐⭐⭐⭐ Very Good | -| deepseek-r1:7b | 1.6 | 64,797 | ⭐⭐⭐ Good | - -Performance insights: - -- Qwen 0.5B and TinyLlama 1.1B provide optimal speed for real-time smart home commands -- DeepSeek-Coder 1.3B and Gemma2 2B handle complex automation tasks effectively -- DeepSeek-R1 7B offers advanced reasoning capabilities with acceptable latency - -## Arm Ecosystem Advantages - -The Raspberry Pi 5 benefits from the extensive Arm developer ecosystem: - -- Optimized compilers including GCC and Clang with Arm-specific enhancements -- Native libraries such as gpiozero and lgpio optimized for Raspberry Pi -- Community support from millions of developers contributing Arm-optimized code -- Long-term support through Arm's commitment to backward compatibility -- 
Industrial adoption with the same architecture powering smartphones, servers, and embedded systems - -## Supported Arm-Powered Devices - -This learning path focuses on the Raspberry Pi 5, but you can adapt the concepts and code to other Arm-powered devices: - -### Recommended Platforms - -**Raspberry Pi 5 (Primary Focus)** - -- Arm Cortex-A76 quad-core @ 2.4GHz -- Up to 16GB RAM for larger models -- Native lgpio support with optimized GPIO performance - -**Raspberry Pi 4** - -- Arm Cortex-A72 quad-core @ 1.8GHz -- 8GB RAM maximum, suitable for smaller models -- Proven compatibility with gpiozero ecosystem - -### Compatibility Requirements - -Any Arm device can potentially run this project with: - -- Arm Cortex-A processor -- Minimum 4GB RAM (8GB+ recommended) -- GPIO pins for hardware control -- Python 3.8+ support -- Ability to run Ollama - -If your Arm device supports Linux, Python, and has GPIO capabilities, you can adapt this learning path to your specific hardware. - -## What You Will Build - -By completing this learning path, your Raspberry Pi 5 will run: - -- Ultra-fast AI processing with 15+ tokens/second performance -- Complete GPIO control for lights, fans, locks, and sensors via gpiozero + lgpio -- Modern web dashboard with FastAPI-powered interface optimized for mobile -- NEON-accelerated performance using custom ARM assembly for critical paths -- Zero-cloud architecture with everything running locally on your Arm processor -- Intelligent automation with scene-based control using natural language - -You will build a smart home system that demonstrates why Arm processors represent the future of edge computing, combining efficiency, performance, and complete privacy control. 
diff --git a/content/learning-paths/embedded-and-microcontrollers/raspberry-pi-smart-home/how-to-2.md b/content/learning-paths/embedded-and-microcontrollers/raspberry-pi-smart-home/how-to-2.md deleted file mode 100644 index d3392e7b86..0000000000 --- a/content/learning-paths/embedded-and-microcontrollers/raspberry-pi-smart-home/how-to-2.md +++ /dev/null @@ -1,167 +0,0 @@ ---- -title: Installation Guide -weight: 3 - -### FIXED, DO NOT MODIFY -layout: learningpathall ---- - -# Installation Guide - -{{% notice Note %}} -This guide assumes you have set up your Raspberry Pi with Raspberry Pi OS and network connectivity. For Raspberry Pi 5 setup help, see: [Raspberry Pi Getting Started](https://www.raspberrypi.com/documentation/) -{{% /notice %}} - -## Connect to Your Raspberry Pi 5 via SSH - -Ensure your Raspberry Pi 5 connects to the same network as your host computer. Access your device remotely via SSH using the terminal in Visual Studio Code, the built-in terminal, Command Prompt, or any other SSH client. - -Replace `` with your Pi's username (typically `pi`), and `` with your Raspberry Pi 5's IP address. - -```bash -ssh @ -``` - -Create a directory called smart-home in your home directory and navigate into it: - -```bash -mkdir ~/smart-home -cd ~/smart-home -``` - -## Prepare the System - -Update your system and install necessary packages. The Raspberry Pi 5 includes Python 3 pre-installed, but you need additional packages: - -```bash -sudo apt update && sudo apt upgrade -sudo apt install python3 python3-pip python3-venv git curl build-essential gcc python3-lgpio -``` - -## Install Python Dependencies - -Create and activate a Python virtual environment. 
This approach keeps project dependencies isolated and prevents conflicts with system-wide packages: - -```bash -python3 -m venv venv -source venv/bin/activate -``` - -Install all required libraries and dependencies: - -```bash -pip install ollama gpiozero lgpio psutil httpx orjson numpy fastapi uvicorn uvloop -``` - -You can run the pip install commands without creating a virtual environment, but using a virtual environment is recommended for development workflows. - -## Install Ollama - -Install Ollama using the official installation script: - -```bash -curl -fsSL https://ollama.com/install.sh | sh -``` - -Verify the installation: - -```bash -ollama --version -``` - -## Download and Test a Language Model - -Ollama supports various models. This guide uses deepseek-r1:7b as an example, but you can also use tinyllama:1.1b, qwen:0.5b, gemma2:2b, or deepseek-coder:1.3b. - -Pull and run deepseek-r1:7b: - -```bash -ollama run deepseek-r1:7b -``` - -Ollama automatically downloads deepseek-r1:7b before running it. You will see download progress in the terminal, followed by the interactive prompt when ready. - -{{% notice Note %}} -The Raspberry Pi 5 supports up to 16GB of RAM, which is sufficient for running smaller to medium-sized language models. Very large models may require more memory or run slower. -{{% /notice %}} - -## Verify GPIO Functionality - -Test GPIO functionality by connecting an LED. Connect the anode (long leg) of an LED in series with a 220Ω resistor to GPIO 17 (physical pin 11). Connect the cathode (short leg) to a ground (GND) pin. 
- -Create a Python script named `testgpio.py`: - -```bash -nano testgpio.py -``` - -Copy this code into the file: - -```python -#!/usr/bin/env python3 -import time -from gpiozero import Device, LED -from gpiozero.pins.lgpio import LGPIOFactory - -# Set lgpio backend for Raspberry Pi 5 -Device.pin_factory = LGPIOFactory() - -# Setup GPIO pin 17 -pin1 = LED(17) - -try: - while True: - pin1.toggle() # Switch pin 17 state - time.sleep(2) # Wait 2 seconds -except KeyboardInterrupt: # Ctrl+C pressed - pin1.close() # Clean up pin 17 -``` - -Run the script: - -```bash -python testgpio.py -``` - -The LED should blink every two seconds. If you observe this behavior, your GPIO setup works correctly. - -## Raspberry Pi 5 Specific Features - -**GPIO Compatibility**: The Raspberry Pi 5 maintains GPIO compatibility with previous models. Existing GPIO code works without modification. - -**Performance**: The Raspberry Pi 5 features a quad-core Arm Cortex-A76 CPU running at 2.4 GHz. This processor provides significantly improved performance for AI workloads compared to previous Raspberry Pi models. The Arm Cortex-A76 cores support NEON SIMD (Single Instruction, Multiple Data) extensions, enabling efficient parallel processing and accelerating compute-intensive tasks such as machine learning inference and signal processing. - -**Power Requirements**: Use the official Raspberry Pi 5 power supply (5V/5A USB-C) for optimal performance when running AI models. - -## Troubleshooting - -**Missing dependencies** - -Resolve dependency issues: - -```bash -sudo apt-get install -f -``` - -**GPIO permission errors** - -Run Python scripts with sudo or add your user to the gpio group: - -```bash -sudo usermod -a -G gpio $USER -``` - -Log out and back in for changes to take effect. 
- -**Model download issues** - -- Confirm internet access and sufficient storage space on your microSD card -- Check model size before downloading to ensure adequate space -- Try downloading smaller models like `qwen:0.5b` or `tinyllama:1.1b` if you encounter memory issues -- Clear storage or connect to a more stable network if errors occur - -**Hardware not responding** - -- Double-check wiring and pin numbers using the Raspberry Pi 5 pinout diagram -- Ensure proper LED and resistor connections -- Verify GPIO enablement in `raspi-config` if needed diff --git a/content/learning-paths/embedded-and-microcontrollers/raspberry-pi-smart-home/how-to-3.md b/content/learning-paths/embedded-and-microcontrollers/raspberry-pi-smart-home/how-to-3.md deleted file mode 100644 index 67d1532f8f..0000000000 --- a/content/learning-paths/embedded-and-microcontrollers/raspberry-pi-smart-home/how-to-3.md +++ /dev/null @@ -1,169 +0,0 @@ ---- -title: Smart Home Assistant -weight: 4 - -### FIXED, DO NOT MODIFY -layout: learningpathall ---- - -# Run the Smart Home Assistant on Raspberry Pi 5 - -Set up and launch your smart home GenAI assistant on your Raspberry Pi 5. This code has been specifically adapted for the Raspberry Pi 5 with 16GB RAM. - -{{% notice Note %}} -This guide assumes you have installed Python, all required dependencies, and Ollama as described in the previous steps. -{{% /notice %}} - -## Clone the Repository - -Deactivate your virtual environment if you created one earlier: - -```bash -deactivate -``` - -Clone the repository and navigate to the project directory: - -```bash -git clone https://github.com/fidel-makatia/EdgeAI_Raspi5.git -cd EdgeAI_Raspi5 -``` - -## Activate Your Virtual Environment (Optional) - -If you created a Python virtual environment earlier, activate it: - -```bash -source venv/bin/activate -``` - -## Connect Your Hardware - -Configure the following GPIO pin assignments for the Raspberry Pi 5. You do not need to connect all pins for initial testing. 
Using LEDs to simulate each device is acceptable and simplifies setup. - -| Device Name | GPIO Pin | Type | Room | -| ----------------- | -------- | --------- | ----------- | -| living_room_light | 17 | LIGHT | living_room | -| living_room_fan | 27 | FAN | living_room | -| smart_tv | 22 | SMART_TV | living_room | -| bedroom_light | 23 | LIGHT | bedroom | -| bedroom_ac | 24 | AC | bedroom | -| kitchen_light | 5 | LIGHT | kitchen | -| front_door_lock | 26 | DOOR_LOCK | entrance | -| garden_light | 16 | LIGHT | outdoor | - -{{% notice Note %}} -The code uses gpiozero with the lgpio backend for Raspberry Pi 5 compatibility. You can use compatible output devices such as LEDs, relays, or small loads connected to these GPIO pins to represent actual smart home devices. All pin assignments are optimized for the Raspberry Pi 5's GPIO layout. -{{% /notice %}} - -![Hardware Setup](hardware.jpeg "Figure 1. Hardware Setup") - -## Run the Smart Home Assistant - -Run the assistant in different modes depending on your use case. The default model is deepseek-coder:1.3b: - -{{< tabpane code=true >}} -{{< tab header="Default (Web API + CLI)" language="bash">}} -python3 smart_home_assistant.py -{{< /tab >}} -{{< tab header="Specify Model" language="bash">}} -python3 smart_home_assistant.py --model qwen:0.5b -{{< /tab >}} -{{< tab header="Custom Web Port" language="bash">}} -python3 smart_home_assistant.py --port 8080 -{{< /tab >}} -{{< tab header="CLI Only" language="bash">}} -python3 smart_home_assistant.py --no-api -{{< /tab >}} -{{< /tabpane >}} - -**Command Options** - -**Default:** Runs the application using the default model and starts both the web server and the CLI. - -**Specify Model:** Use --model to select a specific model (e.g., tinyllama:1.1b, gemma2:2b, deepseek-coder:1.3b). - -**Custom Web Port:** Use --port to run the web server on a different port (default is 8000). - -**CLI Only:** Use --no-api to disable the web API and use only the command-line interface.
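To see how the pin table above maps to code, here is a hardware-free sketch. The `DummyPin` stub and `set_device` helper are illustrative, not part of the repository; on the Pi, the assistant constructs `gpiozero` output devices with the lgpio backend instead:

```python
# Hardware-free sketch of the pin table above (BCM numbering).
# DummyPin stands in for gpiozero.LED so the mapping can be exercised
# off-device; on the Pi you would construct LED(pin) per device instead.
DEVICES = {
    "living_room_light": 17,
    "living_room_fan": 27,
    "smart_tv": 22,
    "bedroom_light": 23,
    "bedroom_ac": 24,
    "kitchen_light": 5,
    "front_door_lock": 26,
    "garden_light": 16,
}

class DummyPin:
    def __init__(self, pin):
        self.pin = pin
        self.is_lit = False
    def on(self):
        self.is_lit = True
    def off(self):
        self.is_lit = False

pins = {name: DummyPin(bcm) for name, bcm in DEVICES.items()}

def set_device(name, state):
    """Switch a named device on or off; unknown names raise KeyError."""
    pins[name].on() if state else pins[name].off()

set_device("kitchen_light", True)
print(pins["kitchen_light"].is_lit)  # True
```

A name-to-pin dictionary like this is what lets the assistant translate an LLM's structured output ("device: kitchen_light, action: on") into a GPIO call.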
- -![Running in Default Mode](cmd.png "Figure 2. Running the code in default mode") - -## Interact With Your Assistant - -**Web Interface:** Open your browser and navigate to **http://your-raspi5-ip:8000** (or your chosen port). - -**CLI Mode:** Type commands directly in the terminal. - -Sample commands: - -```bash -turn on living room light -I want to watch my favorite show -it's getting late, secure the house -``` - -![Web Interface Interaction](UI3.png "Figure 3. Interacting with the LLM through the web interface") - -![DeepSeek-Coder Interaction](gemma2.png "Figure 4. Interacting with deepseek-coder:1.3b") - -## Model Performance Benchmarks on Raspberry Pi 5 16GB RAM - -![Performance Benchmarks](benchmarks.png "Figure 5. Model Performance Benchmarks on Raspberry Pi 5 16GB RAM") - -| Model | Tokens/Sec | Avg Latency (ms) | Performance Rating | -| ------------------- | ---------- | ---------------- | -------------------- | -| qwen:0.5b | 17.0 | 8,217 | ⭐⭐⭐⭐⭐ Excellent | -| tinyllama:1.1b | 12.3 | 9,429 | ⭐⭐⭐⭐⭐ Excellent | -| deepseek-coder:1.3b | 7.3 | 22,503 | ⭐⭐⭐⭐ Very Good | -| gemma2:2b | 4.1 | 23,758 | ⭐⭐⭐⭐ Very Good | -| deepseek-r1:7b | 1.6 | 64,797 | ⭐⭐⭐ Good | - -**Performance Insights:** - -- Qwen 0.5B and TinyLlama 1.1B provide optimal performance for real-time smart home commands -- DeepSeek-Coder 1.3B and Gemma2 2B handle complex automation tasks effectively -- DeepSeek-R1 7B offers advanced reasoning capabilities with acceptable latency - -The 16GB RAM allows smooth operation of all models while maintaining system responsiveness.
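For context on how figures like those above are derived: Ollama's generate response reports `eval_count` (tokens generated) and `eval_duration` (in nanoseconds), from which tokens per second follows directly. The sample numbers below are illustrative, not measured values:

```python
# Derive tokens/sec from Ollama's eval_count and eval_duration fields.
def tokens_per_second(eval_count: int, eval_duration_ns: int) -> float:
    """Convert Ollama's token count and nanosecond eval time to tokens/sec."""
    return eval_count / (eval_duration_ns / 1e9)

# e.g. 128 tokens generated in 7.5 s of eval time (illustrative numbers):
print(round(tokens_per_second(128, 7_500_000_000), 1))  # 17.1
```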
- -## Raspberry Pi 5 Specific Features - -**16GB RAM Support:** Takes advantage of the full 16GB RAM for enhanced model performance - -**lgpio Backend:** Uses the modern lgpio library for reliable GPIO control - -**Optimized Performance:** Code optimized for the Raspberry Pi 5's Arm Cortex-A76 quad-core processor - -**Enhanced Stability:** Improved GPIO handling prevents conflicts with other system processes - -## Troubleshooting - -**Missing packages:** Ensure you activated your virtual environment and installed requirements: - -```bash -pip install gpiozero lgpio ollama -``` - -**GPIO errors:** Verify lgpio backend installation: - -```bash -sudo apt update -sudo apt install python3-lgpio -``` - -**Model loading issues:** Check Ollama status and available models: - -```bash -ollama list -ollama serve -``` - -**Port conflicts:** Specify a different port with --port if the default port is in use. - -**GPIO permission issues:** Add your user to the gpio group: - -```bash -sudo usermod -a -G gpio $USER -# Log out and log in again for changes to take effect -``` diff --git a/content/learning-paths/embedded-and-microcontrollers/raspberry-pi-smart-home/pin_layout.jpg b/content/learning-paths/embedded-and-microcontrollers/raspberry-pi-smart-home/pin_layout.jpg new file mode 100644 index 0000000000..edf61c88fb Binary files /dev/null and b/content/learning-paths/embedded-and-microcontrollers/raspberry-pi-smart-home/pin_layout.jpg differ diff --git a/content/learning-paths/embedded-and-microcontrollers/raspberry-pi-smart-home/sec1.png b/content/learning-paths/embedded-and-microcontrollers/raspberry-pi-smart-home/sec1.png deleted file mode 100644 index 614250a5cf..0000000000 Binary files a/content/learning-paths/embedded-and-microcontrollers/raspberry-pi-smart-home/sec1.png and /dev/null differ diff --git a/content/learning-paths/embedded-and-microcontrollers/raspberry-pi-smart-home/security.png
b/content/learning-paths/embedded-and-microcontrollers/raspberry-pi-smart-home/security.png deleted file mode 100644 index a78f463b29..0000000000 Binary files a/content/learning-paths/embedded-and-microcontrollers/raspberry-pi-smart-home/security.png and /dev/null differ diff --git a/content/learning-paths/embedded-and-microcontrollers/raspberry-pi-smart-home/security2.png b/content/learning-paths/embedded-and-microcontrollers/raspberry-pi-smart-home/security2.png deleted file mode 100644 index 69b7396965..0000000000 Binary files a/content/learning-paths/embedded-and-microcontrollers/raspberry-pi-smart-home/security2.png and /dev/null differ diff --git a/content/learning-paths/embedded-and-microcontrollers/raspberry-pi-smart-home/stats3.png b/content/learning-paths/embedded-and-microcontrollers/raspberry-pi-smart-home/stats3.png deleted file mode 100644 index 9270ca415b..0000000000 Binary files a/content/learning-paths/embedded-and-microcontrollers/raspberry-pi-smart-home/stats3.png and /dev/null differ