From 42cf265e22422e38693035eaf0b1b73fafa52e28 Mon Sep 17 00:00:00 2001 From: Annie Tallund Date: Fri, 15 Nov 2024 17:29:28 +0100 Subject: [PATCH 1/7] Update YOLOv8 on Himax Learning Path --- .../microcontrollers/yolo-on-himax/_index.md | 32 ++++--- .../microcontrollers/yolo-on-himax/_review.md | 24 +++++- .../yolo-on-himax/how-to-1.md | 85 ++++++++++++------- .../yolo-on-himax/how-to-2.md | 42 +++++---- .../yolo-on-himax/how-to-3.md | 34 ++++---- .../yolo-on-himax/how-to-4.md | 27 ++++-- .../yolo-on-himax/how-to-5.md | 26 +++--- 7 files changed, 163 insertions(+), 107 deletions(-) diff --git a/content/learning-paths/microcontrollers/yolo-on-himax/_index.md b/content/learning-paths/microcontrollers/yolo-on-himax/_index.md index 59d523d0b6..4a5167d254 100644 --- a/content/learning-paths/microcontrollers/yolo-on-himax/_index.md +++ b/content/learning-paths/microcontrollers/yolo-on-himax/_index.md @@ -3,33 +3,37 @@ title: Run a Computer Vision Model on a Himax Microcontroller minutes_to_complete: 90 -who_is_this_for: This is an introduction topic for beginners on how to run a computervision application on an embedded device from Himax. This example uses an off-the-shelf Himax WiseEye2 module which is based on the Arm Cortex-M55 and Ethos-U55. +who_is_this_for: This is an introduction topic for beginners on how to run a computer vision application on an embedded device from Himax. This example uses an off-the-shelf Himax WiseEye2 module which is based on the Arm Cortex-M55 and Ethos-U55. + +learning_objectives: + - Run a you-only-look-once (YOLO) computer vision model on the edge device + - Build the Himax Software Development Kit (SDK) and generate the firmware image file + - Update the firmware on the edge device (Himax WiseEye2) -learning_objectives: - - Run a you-only-look-once (YOLO) computer vision model using off-the-shelf hardware based on the Arm Cortex-M55 and Ethos-U55. - - Learn how to build the Himax SDK and generate firmware image file. 
- - Learn how to update firmware on edge device (Himax WiseEye2).
-
prerequisites:
- - Seeed Grove Vision AI V2 Module
- - OV5647-62 Camera module and included FPC cable
+ - A [Seeed Grove Vision AI Module V2](https://www.seeedstudio.com/Grove-Vision-AI-Module-V2-p-5851.html) development board
+ - An [OV5647-62 Camera Module](https://www.seeedstudio.com/OV5647-69-1-FOV-Camera-module-for-Raspberry-Pi-3B-4B-p-5484.html) and included FPC cable
- A USB-C cable
- - A Linux/Windows-based PC on an x86 archiecture.
+ - An x86-based Linux machine or a machine running Apple Silicon

author_primary: Chaodong Gong, Alex Su, Kieran Hejmadi

### Tags
-skilllevels: Beginner
+skilllevels: Introductory
subjects: ML
armips:
- - Cortex M55
- - Ethos U55
+ - Cortex-M55
+ - Ethos-U55
tools_software_languages:
- Himax SDK
- - Bash
+ - Python
operatingsystems:
- Linux
- - Windows
+ - macOS
+
+draft: true
+cascade:
+  draft: true

### FIXED, DO NOT MODIFY
diff --git a/content/learning-paths/microcontrollers/yolo-on-himax/_review.md b/content/learning-paths/microcontrollers/yolo-on-himax/_review.md
index 27a46683f1..98489df86d 100644
--- a/content/learning-paths/microcontrollers/yolo-on-himax/_review.md
+++ b/content/learning-paths/microcontrollers/yolo-on-himax/_review.md
@@ -2,13 +2,31 @@ review:
    - questions:
        question: >
-            The Grove Vision AI V2 Module can run Yolov8 model in real time?
+            The Grove Vision AI V2 Module can run a YOLOv8 model in real time
        answers:
            - True
            - False
-        correct_answer: 1
+        correct_answer: 1
        explanation: >
-            The Grove Vision AI V2 Module can run object detection in real time using the Cortex-M55 and Ethos-U55.
+            The Grove Vision AI V2 Module can run object detection in real time thanks to its ML-accelerated capabilities.
+    question: >
+        Which of the options is the YOLO model unable to run?
+ answers: + - Pose detection + - Object detection + - Speech-to-text transcription + correct_answer: 3 + explanation: > + The YOLO model is a computer vision model, meaning it runs based on images as input. + question: > + What Arm IP on the Seeed Grove Vision AI Module V2 enables you to run ML workloads efficiently? + answers: + - Ethos-U55 + - Cortex-A72 + - Cortex-X4 + correct_answer: 1 + explanation: > + When paired with the low-power Cortex-M55 processor, the Ethos-U55 provides an uplift in ML performance # ================================================================================ diff --git a/content/learning-paths/microcontrollers/yolo-on-himax/how-to-1.md b/content/learning-paths/microcontrollers/yolo-on-himax/how-to-1.md index da46268655..7af1eb6ceb 100644 --- a/content/learning-paths/microcontrollers/yolo-on-himax/how-to-1.md +++ b/content/learning-paths/microcontrollers/yolo-on-himax/how-to-1.md @@ -1,46 +1,32 @@ --- -title: Set Up Environment +title: Set up environment weight: 2 ### FIXED, DO NOT MODIFY layout: learningpathall --- -## Set up the Development Environment +# Set up the development environment -### Step 1.1. Install Ubuntu +This learning path has been validated on Ubuntu 22.04 LTS and macOS. -If you are running Windows on your host machine, we recommend using Ubuntu through Windows subsystem for Linux 2 (WSL2). Please see [this learning path](https://learn.arm.com/learning-paths/laptops-and-desktops/wsl2/setup/) for assistance +{{% notice %}} +If you are running Windows on your host machine, you can use Ubuntu through Windows subsystem for Linux 2 (WSL2). Check out [this learning path](https://learn.arm.com/learning-paths/laptops-and-desktops/wsl2/setup/) to get started. +{{% /notice %}} -This learning path has been validated on Ubuntu 22.04 LTS. However, we expect other linux distributions to work. To verify the Linux distribution you are using you can run the `cat /etc/*release*` command. 
+## Install Python, pip and git +You will use Python to build the firmware image and pip to install some dependencies. Verify Python is installed by running ```bash -cat /etc/*release* -``` -The top lines from the terminal output will show the distribution version. - -```output -DISTRIB_ID=Ubuntu -DISTRIB_RELEASE=22.04 -DISTRIB_CODENAME=jammy -DISTRIB_DESCRIPTION="Ubuntu 22.04.5 LTS" -... +python3 --version ``` -### Step 1.2. (Optional) Install Microsoft Visual Studio Code - -This is only optional. You can use any text editor you are comfortable with to view or edit code. By typing “wsl” in VS Code terminal, you can switch to Linux environment. - -### Step 1.3. Install python 3 - -Go to website python.org to download and install. -Verify python is installed by -python3 --version You should see an output like the following. ```output Python 3.12.7 ``` -### Step 1.4. Install python-pip + +Install `pip` with the following commands, and check the output to verify it's installed correctly. ```bash sudo apt update @@ -48,21 +34,39 @@ sudo apt install python3-pip -y pip3 --version ``` -If `pip3` is correctly installed you should see an output similar to tht following. +```output +pip 24.2 from //pip (python 3.12) +``` + +You will need to have the git version control system installed. Run the command below to verify that git is installed on your system. + +```bash +git --version +``` + +You should see output similar to that below. ```output -pip 24.2 from /pip (python 3.12) +git version 2.39.3 ``` -### Step 1.5. Install make +## Install make + +Install the make build tool, which is used to build the firmware in the next section. -You will need to install the make build tool in order to build the firmware in the following section. +### Linux ```bash sudo apt update sudo apt install make -y ``` +### macOS + +```console +brew install make +``` + Successful installation of make will show the following when the `make --version` command is run. 
```output
@@ -74,15 +78,32 @@ License GPLv3+: GNU GPL version 3 or later
This is free software: you are free to change and redistribute it.
There is NO WARRANTY, to the extent permitted by law.
```
+{{% notice Note %}}
+To run this learning path on macOS, you need to verify that your installation is for GNU Make, not the BSD version.
+{{% /notice %}}

+## Install Arm GNU toolchain
+
+### Linux

-### Step 1.6. Install ARM GNU toolchain
+The toolchain is used to cross-compile from the host architecture (x86) to the Arm architecture of the embedded device.

```bash
-cd ~
+cd $HOME
wget https://developer.arm.com/-/media/Files/downloads/gnu/13.2.rel1/binrel/arm-gnu-toolchain-13.2.rel1-x86_64-arm-none-eabi.tar.xz
tar -xvf arm-gnu-toolchain-13.2.rel1-x86_64-arm-none-eabi.tar.xz
export PATH="$HOME/arm-gnu-toolchain-13.2.Rel1-x86_64-arm-none-eabi/bin/:$PATH"
```
+### macOS
+```console
+cd $HOME
+wget https://developer.arm.com/-/media/Files/downloads/gnu/13.3.rel1/binrel/arm-gnu-toolchain-13.3.rel1-darwin-arm64-arm-none-eabi.tar.xz
+tar -xvf arm-gnu-toolchain-13.3.rel1-darwin-arm64-arm-none-eabi.tar.xz
+export PATH="$HOME/code/tmp/arm-gnu-toolchain-13.3.rel1-darwin-arm64-arm-none-eabi/bin/:$PATH"
+```
+
+{{% notice %}}
+You can add the above command to the `.bashrc` file. This way, the Arm GNU toolchain is configured in new terminal sessions as well.
+{{% /notice %}}

-Please note: you may want to add the command to your `bashrc` file. This enables the Arm GNU toolchain to be easily accessed from any new terminal session.
+Now that your development environment is set up, move on to the next section where you will generate the firmware image.
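Before moving on, you can sanity-check that the required tools are visible on your `PATH`. The short Python sketch below is an optional helper, not part of the Himax tooling; the tool names listed are assumptions based on the installs above, so adjust them if your toolchain release differs.

```python
import shutil

# Tools installed earlier in this section; adjust the list if your
# setup differs (for example, a different toolchain release).
REQUIRED_TOOLS = ["git", "make", "python3", "arm-none-eabi-gcc"]

def missing_tools(tools):
    """Return the subset of tools that cannot be found on PATH."""
    return [tool for tool in tools if shutil.which(tool) is None]

if __name__ == "__main__":
    missing = missing_tools(REQUIRED_TOOLS)
    if missing:
        print("Missing tools:", ", ".join(missing))
    else:
        print("All required tools found.")
```

If anything is reported missing, revisit the corresponding install step above before continuing.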
\ No newline at end of file diff --git a/content/learning-paths/microcontrollers/yolo-on-himax/how-to-2.md b/content/learning-paths/microcontrollers/yolo-on-himax/how-to-2.md index 388c390f06..714961dc13 100644 --- a/content/learning-paths/microcontrollers/yolo-on-himax/how-to-2.md +++ b/content/learning-paths/microcontrollers/yolo-on-himax/how-to-2.md @@ -1,39 +1,25 @@ --- -title: Build The Firmware +title: Build the firmware weight: 3 ### FIXED, DO NOT MODIFY layout: learningpathall --- -## Build The Firmware +TODO short intro / framing? -Next, we need to build an image that contains the embedded software (firmware). You will need to have the git version control system installed. Run the command below to verify that git is installed on your system. +## Clone the Himax project -```bash -git --version -``` - -You should see output similar to that below. - -```output -git version 2.39.3 -``` - -If not, please follow the steps to install git on your system. - -### Step 2.1. Clone the Himax project - -You will first need to recusively clone the Himax repository. This will also clone the necessary sub repos such as Arm CMSIS. +Himax has set up a repository containing a few examples for the Seeed Grove Vision AI V2 board. By recursively cloning the Himax examples repo, git will include the necessary sub-repositories that have been configured for the project. ```bash git clone --recursive https://github.com/HimaxWiseEyePlus/Seeed_Grove_Vision_AI_Module_V2.git cd Seeed_Grove_Vision_AI_Module_V2 ``` -### Step 2.2. Compile the Firmware +## Compile the firmware -The make build tool is used to compile the source code. This should take up around 2-3 minutes depending on the number of CPU cores available. +The make build tool is used to compile the source code. This should take up to 10 minutes depending on the number of CPU cores available. ```bash cd EPII_CM55M_APP_S @@ -41,20 +27,32 @@ make clean make ``` +## Generate the firmware image -### Step 2.3. 
Generate a Firmware Image +The examples repository contains scripts to generate the image. ```bash cd ../we2_image_gen_local/ cp ../EPII_CM55M_APP_S/obj_epii_evb_icv30_bdv10/gnu_epii_evb_WLCSP65/EPII_CM55M_gnu_epii_evb_WLCSP65_s.elf input_case1_secboot/ +``` + +## Linux + +```bash ./we2_local_image_gen project_case1_blp_wlcsp.json ``` -Your terminal output should end with the following. +## macOS +```console +./we2_local_image_gen_macOS_arm64 project_case1_blp_wlcsp.json +``` +Your terminal output should end with the following. ```output Output image: output_case1_sec_wlcsp/output.img Output image: output_case1_sec_wlcsp/output.img IMAGE GEN DONE ``` + +With this step, you are ready to flash the image onto the Himax development board. \ No newline at end of file diff --git a/content/learning-paths/microcontrollers/yolo-on-himax/how-to-3.md b/content/learning-paths/microcontrollers/yolo-on-himax/how-to-3.md index b8fde69fd0..06d110108b 100644 --- a/content/learning-paths/microcontrollers/yolo-on-himax/how-to-3.md +++ b/content/learning-paths/microcontrollers/yolo-on-himax/how-to-3.md @@ -1,46 +1,48 @@ --- -title: Flash Firmware onto the Microcontroller +title: Flash firmware onto the microcontroller weight: 3 ### FIXED, DO NOT MODIFY layout: learningpathall --- -## Flash the Firmware +Now that we have generated a firmware file on our local machine, we need to flash the microcontroller with this firmware. -Now that we have generated a firmware file on our local machine, we need to flash the microcontroller with this firmware. +## Install xmodem -### Step 3.1. Install xmodem. - -`Xmodem` is a basic file transfer protocol. Run the following command to install the dependencies for xmodem. +`Xmodem` is a basic file transfer protocol. Run the following command to install the dependencies for xmodem. If you cloned the repository to a different location replace $HOME with the path. 
```bash
-cd $HOME/Seeed_Grove_Vision_AI_Module_V2 # If you cloned the repo to a different location replace $HOME with the path.
+cd $HOME/Seeed_Grove_Vision_AI_Module_V2
pip install -r xmodem/requirements.txt
```

-### Step 3.2. Connect the module to PC by USB cable.
+## Connect the module

-You will need to insert the FPC cable cable into the Grove Vision AI V2 module. Lift the dark grey latch on the connector as per the image below.
+Insert the FPC cable cable into the Grove Vision AI V2 module. Lift the dark grey latch on the connector as per the image below.

![unlatched](./unlatched.jpg)

-Then, slide the FPC connector in with the metal pins facing down and close the dark grey latch to fasten the connector.
+Then, slide the FPC connector in with the metal pins facing down and close the dark grey latch to fasten the connector.

![latched](./latched.jpg)

Then connect the Grove Vision AI V2 Module to your computer via the USB-C cable.

-### Step 3.4. Flash the firmware onto the moule.
+### Flash the firmware onto the module

-Run the python script below to flash the firmware.
+Run the python script below to flash the firmware.

-```python
-python xmodem\xmodem_send.py --port=[your COM number] --baudrate=921600 --protocol=xmodem --file=we2_image_gen_local\output_case1_sec_wlcsp\output.img
+```bash
+python xmodem\xmodem_send.py --port=[your COM number] --baudrate=921600 --protocol=xmodem --file=we2_image_gen_local\output_case1_sec_wlcsp\output.img
```

- Note: If running one of the other example models demonstrated in '(Optional) Try Different Models', the command might be slightly different.
+{{% notice Note %}}
+When you run other example models demonstrated in the later section [Object detection and additional models](/learning-paths/microcontrollers/yolo-on-himax/how-to-5/), you need to adapt this command.
+{{% /notice %}}
+
+TODO: how will the command really change? How to find COM number?
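As a starting point for finding the COM port, the illustrative helper below lists device names that USB serial adapters typically get on Linux and macOS. It is not part of the Himax tooling, and the glob patterns are assumptions; your board may enumerate under a different name.

```python
import glob

# Device name patterns that USB serial adapters typically get.
# These patterns are assumptions - your module may show up differently.
PORT_PATTERNS = [
    "/dev/ttyUSB*",         # Linux USB-to-serial adapters
    "/dev/ttyACM*",         # Linux CDC-ACM devices
    "/dev/tty.usbmodem*",   # macOS CDC-ACM devices
    "/dev/tty.usbserial*",  # macOS USB-to-serial adapters
]

def candidate_ports(patterns=PORT_PATTERNS):
    """Return a sorted list of device nodes matching the patterns."""
    matches = []
    for pattern in patterns:
        matches.extend(glob.glob(pattern))
    return sorted(matches)

if __name__ == "__main__":
    print(candidate_ports() or "No candidate ports found - is the module plugged in?")
```

Use the printed device path as the `--port` argument of the flashing command.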
-After the firmware image burning is completed, the message "Do you want to end file transmission and reboot system? (y)" is displayed. Press the reset button on the module as per the image below. +After the firmware image burning is completed, the message `Do you want to end file transmission and reboot system? (y)` is displayed. Press the reset button indicated in the image below. ![reset button](./reset_button.jpg) diff --git a/content/learning-paths/microcontrollers/yolo-on-himax/how-to-4.md b/content/learning-paths/microcontrollers/yolo-on-himax/how-to-4.md index 2aa5632762..522f23f29b 100644 --- a/content/learning-paths/microcontrollers/yolo-on-himax/how-to-4.md +++ b/content/learning-paths/microcontrollers/yolo-on-himax/how-to-4.md @@ -1,5 +1,5 @@ --- -title: Run and View Model Results +title: Run and view model results weight: 3 ### FIXED, DO NOT MODIFY @@ -7,20 +7,29 @@ layout: learningpathall --- -### Step 4.1. Connect module to PC with USB cable. +## Connect the board with USB cable -Exit the terminal session and connect the module to the PC via your USB-C cable. +Exit the terminal session and connect the module to your host machine using the USB-C cable. -### Step 4.2. Download the Himax AI web toolkit. +## Download the Himax AI web toolkit -The Himax AI web toolkit enables a browser-based graphical user interface (GUI) for the live camera feed. +The Himax AI web toolkit enables a browser-based graphical user interface (GUI) for the live camera feed. -Download the Himax AI Web toolkit by clicking on this [link](https://github.com/HimaxWiseEyePlus/Seeed_Grove_Vision_AI_Module_V2/releases/download/v1.1/Himax_AI_web_toolkit.zip) +```bash +wget https://github.com/HimaxWiseEyePlus/Seeed_Grove_Vision_AI_Module_V2/releases/download/v1.1/Himax_AI_web_toolkit.zip +unzip Himax_AI_web_toolkit.zip +``` -Unzip the archived file and double click `index.html`. This will open the GUI within your default browser. 
+Open the unzipped directory in your file browsing system and double click `index.html`. This will open the GUI within your default browser.

-### Step 4.3. Connect to the Grove Vision AI
+## Connect to the Grove Vision AI

-Select 'Grove Vision AI(V2)' in the top-right hand corner and press connect button.
+Select `Grove Vision AI(V2)` in the top-right hand corner and press the `Connect` button.
+Follow the instructions to set up the connection.

![Himax web UI](./himax_web_ui.jpg)
+
+The image will run YOLOv8 on your device. By using the camera to identify things from the [Common Objects in Context (COCO) dataset](https://cocodataset.org/#home), which the model has been trained on, you can put it to the test. Get some common objects ready and move on to the next section.
+
+## View model results
+
+TODO have running section here?
\ No newline at end of file
diff --git a/content/learning-paths/microcontrollers/yolo-on-himax/how-to-5.md b/content/learning-paths/microcontrollers/yolo-on-himax/how-to-5.md
index cb6ad788f2..9f3cfde92d 100644
--- a/content/learning-paths/microcontrollers/yolo-on-himax/how-to-5.md
+++ b/content/learning-paths/microcontrollers/yolo-on-himax/how-to-5.md
@@ -1,21 +1,25 @@
---
-title: (Optional) Try Different Models
+title: Object detection and additional models
weight: 5

### FIXED, DO NOT MODIFY
layout: learningpathall
---
+TODO some more intro here, and showing how to test the model
+Also double check this section

-### Modify the makefile
+## Modify the Makefile

-Change the directory to the where the makefile is located.
+TODO: why are we doing this?
+
+Change the directory to where the Makefile is located. If you cloned the repository to a different location, replace $HOME with the path.
```bash
-cd $HOME/Seeed_Grove_Vision_AI_Module_V2/EPII_CM55M_APP_S/ # replace $HOME with the location of the project
+cd $HOME/Seeed_Grove_Vision_AI_Module_V2/EPII_CM55M_APP_S/
```

-Using a text editor, for example visual studio code or nano, modify the `APP_TYPE` field in the makefile from the default value of `allon_sensor_tflm` to one of the values in the table below
+Modify the `APP_TYPE` field in the Makefile from the default value of `allon_sensor_tflm` to one of the values in the table below

|APP_TYPE =|Description|
@@ -24,18 +28,18 @@ Using a text editor, for example visual studio code or nano, modify the `APP_TYP
|---|---|
|tflm_yolov8_pose|Pose detection|
|tflm_fd_fm|Face detection|

-### Regenerate the Firmware Image
+## Regenerate the firmware image

-Go back to the 'Build The Firmware' section and start from Step 3.2. to regenerate the firmware image.
+Go back to the [Flash firmware onto the microcontroller](/learning-paths/microcontrollers/yolo-on-himax/how-to-3/) section and run the python command to regenerate the firmware image.

The images below are examples images from the model.
-#### Objection Detection +### Objection detection ![object_detection](./object_detection.jpg) -#### Pose Estimation +### Pose estimation ![Pose estimation](./pose_estimation.jpg) -#### Face Detection +### Face detection ![object_detection](./face_detection.jpg) From b8347b3022e3a788fc4a28cc075ec05c73eb1f67 Mon Sep 17 00:00:00 2001 From: Annie Tallund Date: Mon, 25 Nov 2024 14:12:59 +0100 Subject: [PATCH 2/7] Update YOLOv8 on Himax Learning Path --- .../microcontrollers/yolo-on-himax/_index.md | 2 +- .../yolo-on-himax/how-to-1.md | 19 +++++- .../yolo-on-himax/how-to-2.md | 23 +++++-- .../yolo-on-himax/how-to-3.md | 65 ++++++++++++++++--- .../yolo-on-himax/how-to-4.md | 7 +- .../yolo-on-himax/how-to-5.md | 25 ++++--- 6 files changed, 108 insertions(+), 33 deletions(-) diff --git a/content/learning-paths/microcontrollers/yolo-on-himax/_index.md b/content/learning-paths/microcontrollers/yolo-on-himax/_index.md index 4a5167d254..1dcf71ef7a 100644 --- a/content/learning-paths/microcontrollers/yolo-on-himax/_index.md +++ b/content/learning-paths/microcontrollers/yolo-on-himax/_index.md @@ -6,7 +6,7 @@ minutes_to_complete: 90 who_is_this_for: This is an introduction topic for beginners on how to run a computer vision application on an embedded device from Himax. This example uses an off-the-shelf Himax WiseEye2 module which is based on the Arm Cortex-M55 and Ethos-U55. 
learning_objectives:
- - Run a you-only-look-once (YOLO) computer vision model on the edge device
+ - Run a you-only-look-once (YOLO) object detection model on the edge device
- Build the Himax Software Development Kit (SDK) and generate the firmware image file
- Update the firmware on the edge device (Himax WiseEye2)
diff --git a/content/learning-paths/microcontrollers/yolo-on-himax/how-to-1.md b/content/learning-paths/microcontrollers/yolo-on-himax/how-to-1.md
index 7af1eb6ceb..cd5ceb2f83 100644
--- a/content/learning-paths/microcontrollers/yolo-on-himax/how-to-1.md
+++ b/content/learning-paths/microcontrollers/yolo-on-himax/how-to-1.md
@@ -26,11 +26,15 @@ You should see an output like the following.
Python 3.12.7
```
-Install `pip` with the following commands, and check the output to verify it's installed correctly.
+Install `pip` and `venv` with the following commands.

```bash
sudo apt update
-sudo apt install python3-pip -y
+sudo apt install python3-pip python3-venv -y
+```
+
+Check the output to verify `pip` is installed correctly.
+```bash
pip3 --version
```

@@ -38,6 +42,15 @@ pip3 --version
pip 24.2 from //pip (python 3.12)
```
+It is considered good practice to manage `pip` packages through a virtual environment. Create one with the steps below.
+
+```bash
+python3 -m venv $HOME/yolo-venv
+source $HOME/yolo-venv/bin/activate
+```
+
+Your terminal displays `(yolo-venv)` in the prompt indicating the virtual environment is active.
+
You will need to have the git version control system installed. Run the command below to verify that git is installed on your system.
```bash
git --version
```

You should see output similar to that below.
@@ -98,7 +111,7 @@ export PATH="$HOME/arm-gnu-toolchain-13.2.Rel1-x86_64-arm-none-eabi/bin/:$PATH"
cd $HOME
wget https://developer.arm.com/-/media/Files/downloads/gnu/13.3.rel1/binrel/arm-gnu-toolchain-13.3.rel1-darwin-arm64-arm-none-eabi.tar.xz
tar -xvf arm-gnu-toolchain-13.3.rel1-darwin-arm64-arm-none-eabi.tar.xz
-export PATH="$HOME/code/tmp/arm-gnu-toolchain-13.3.rel1-darwin-arm64-arm-none-eabi/bin/:$PATH"
+export PATH="$HOME/arm-gnu-toolchain-13.3.rel1-darwin-arm64-arm-none-eabi/bin/:$PATH"
```

{{% notice %}}
diff --git a/content/learning-paths/microcontrollers/yolo-on-himax/how-to-2.md b/content/learning-paths/microcontrollers/yolo-on-himax/how-to-2.md
index 714961dc13..6d5377363e 100644
--- a/content/learning-paths/microcontrollers/yolo-on-himax/how-to-2.md
+++ b/content/learning-paths/microcontrollers/yolo-on-himax/how-to-2.md
@@ -6,11 +6,11 @@ weight: 3
layout: learningpathall
---
-TODO short intro / framing?
+This section will walk you through the process of generating the firmware image file.

## Clone the Himax project

-Himax has set up a repository containing a few examples for the Seeed Grove Vision AI V2 board. By recursively cloning the Himax examples repo, git will include the necessary sub-repositories that have been configured for the project.
+Himax has set up a repository containing a few examples for the Seeed Grove Vision AI V2 board. It contains third-party software and scripts to build and flash the image with the object detection application. By recursively cloning the Himax examples repo, git will include the necessary sub-repositories that have been configured for the project.

```bash
git clone --recursive https://github.com/HimaxWiseEyePlus/Seeed_Grove_Vision_AI_Module_V2.git
cd Seeed_Grove_Vision_AI_Module_V2
```

## Compile the firmware

-The make build tool is used to compile the source code. This should take up to 10 minutes depending on the number of CPU cores available.
+For the object detection to activate, you need to edit the project's `makefile`, located in the `EPII_CM55M_APP_S` directory. ```bash cd EPII_CM55M_APP_S +``` +Open the file and scroll down until you find the `APP_TYPE` attribute. Update the value to `tflm_yolov8_od`. + +```output +APP_TYPE = tflm_yolov8_od +``` +Use the `make` build tool to compile the source code. This should take up to 10 minutes depending on the number of CPU cores available on your host machine. The result is an `.elf` file written to the directory below. + +```bash make clean make ``` ## Generate the firmware image -The examples repository contains scripts to generate the image. +Copy the `.elf` file to the `input_case1_secboot` directory. ```bash cd ../we2_image_gen_local/ cp ../EPII_CM55M_APP_S/obj_epii_evb_icv30_bdv10/gnu_epii_evb_WLCSP65/EPII_CM55M_gnu_epii_evb_WLCSP65_s.elf input_case1_secboot/ ``` +The examples repository contains scripts to generate the image. Run the script corresponding to the OS of your host machine. -## Linux +### Linux ```bash ./we2_local_image_gen project_case1_blp_wlcsp.json ``` -## macOS +### macOS ```console ./we2_local_image_gen_macOS_arm64 project_case1_blp_wlcsp.json ``` Your terminal output should end with the following. + ```output Output image: output_case1_sec_wlcsp/output.img Output image: output_case1_sec_wlcsp/output.img diff --git a/content/learning-paths/microcontrollers/yolo-on-himax/how-to-3.md b/content/learning-paths/microcontrollers/yolo-on-himax/how-to-3.md index 06d110108b..67a3f530ab 100644 --- a/content/learning-paths/microcontrollers/yolo-on-himax/how-to-3.md +++ b/content/learning-paths/microcontrollers/yolo-on-himax/how-to-3.md @@ -1,16 +1,16 @@ --- title: Flash firmware onto the microcontroller -weight: 3 +weight: 4 ### FIXED, DO NOT MODIFY layout: learningpathall --- -Now that we have generated a firmware file on our local machine, we need to flash the microcontroller with this firmware. 
+Now that you have generated an image file on the local host machine, you are ready to flash the microcontroller with this firmware.

## Install xmodem

-`Xmodem` is a basic file transfer protocol. Run the following command to install the dependencies for xmodem. If you cloned the repository to a different location replace $HOME with the path.
+`Xmodem` is a basic file transfer protocol which is easily installed using the Himax examples repository. Run the following command to install the dependency. If you cloned the repository to a different location, replace $HOME with the path.

```bash
cd $HOME/Seeed_Grove_Vision_AI_Module_V2
pip install -r xmodem/requirements.txt
```

## Connect the module

-Insert the FPC cable cable into the Grove Vision AI V2 module. Lift the dark grey latch on the connector as per the image below.
+Insert the flexible printed circuit (FPC) cable into the Grove Vision AI V2 module. Lift the dark grey latch on the connector as per the image below.

![unlatched](./unlatched.jpg)
@@ -29,20 +29,67 @@ Then, slide the FPC connector in with the metal pins facing down and close the d
![latched](./latched.jpg)

Then connect the Grove Vision AI V2 Module to your computer via the USB-C cable.

+{{% notice Note %}}
+The development board may have two USB-C connectors. If you are running into issues connecting the board in the next step, make sure you are using the right one.
+{{% /notice %}}

+## Find the COM port
+
+You need to provide the communication (COM) port that the board is connected to in order to flash the image. There are commands to list all COM ports available on your machine. Once your board is connected through USB, it will show up in this list. The COM identifier will start with **tty**, which may help you determine which one it is. You can run the command before and after plugging in the board if you are unsure.
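One way to automate that before-and-after comparison is sketched below. This is an illustrative helper, not part of the Himax tooling, and it assumes Linux or macOS style device naming under `/dev`.

```python
import glob

def tty_snapshot():
    """Return the set of tty device nodes currently present."""
    return set(glob.glob("/dev/tty*"))

def new_devices(before, after):
    """Return device nodes present in 'after' but not in 'before', sorted."""
    return sorted(after - before)

# Usage sketch:
#   before = tty_snapshot()
#   ... plug the board in ...
#   print(new_devices(before, tty_snapshot()))
```

Any node reported by `new_devices` is a strong candidate for the board's COM port.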
### Linux

```bash
-python xmodem\xmodem_send.py --port=[your COM number] --baudrate=921600 --protocol=xmodem --file=we2_image_gen_local\output_case1_sec_wlcsp\output.img
+sudo grep -i 'tty' /var/log/dmesg
```

### macOS

```console
ls /dev/tty.*
```

{{% notice Note %}}
-When you run other example models demonstrated in the later section [Object detection and additional models](/learning-paths/microcontrollers/yolo-on-himax/how-to-5/), you need to adapt this command.
+If the port seems unavailable, try changing the permissions temporarily using the `chmod` command. Be sure to reset them afterwards, as this may pose a computer security vulnerability.
+
+```bash
+chmod 0777 <your COM port>
+```
{{% /notice %}}

-TODO: how will the command really change? How to find COM number?
+The full path to the port is needed in the next step, so be sure to note it down.
+
+## Flash the firmware onto the module
+
+Run the python script below to flash the firmware.
+
+```bash
+python xmodem/xmodem_send.py --port=<your COM port> --baudrate=921600 --protocol=xmodem --file=we2_image_gen_local/output_case1_sec_wlcsp/output.img
+```
+
+{{% notice Note %}}
+When you run other example models demonstrated in the later section [Object detection and additional models](/learning-paths/microcontrollers/yolo-on-himax/how-to-5/), you need to adapt this command with the right image file.
+{{% /notice %}}

After the firmware image burning is completed, the message `Do you want to end file transmission and reboot system? (y)` is displayed. Press the reset button indicated in the image below.

![reset button](./reset_button.jpg)
+
+## Run the model
+
+After the reset button is pressed, the board will start inference with the object detection automatically. Observe the output in the terminal to verify that the image is built correctly. If a person is in front of the camera, you should see the `person_score` value go over `100`.
+ +```output +b'SENSORDPLIB_STATUS_XDMA_FRAME_READY 240' +b'write frame result 0, data size=15284,addr=0x340e04e0' +b'invoke pass' +b'person_score:113' +b'EVT event = 10' +b'SENSORDPLIB_STATUS_XDMA_FRAME_READY 241' +b'write frame result 0, data size=15296,addr=0x340e04e0' +b'invoke pass' +b'person_score:112' +b'EVT event = 10' +``` + +You have now verified that the model works correctly in the firmware. The next step is to view it using a web toolkit. \ No newline at end of file diff --git a/content/learning-paths/microcontrollers/yolo-on-himax/how-to-4.md b/content/learning-paths/microcontrollers/yolo-on-himax/how-to-4.md index 522f23f29b..b34e3e9e5c 100644 --- a/content/learning-paths/microcontrollers/yolo-on-himax/how-to-4.md +++ b/content/learning-paths/microcontrollers/yolo-on-himax/how-to-4.md @@ -1,15 +1,12 @@ --- title: Run and view model results -weight: 3 +weight: 5 ### FIXED, DO NOT MODIFY layout: learningpathall --- - -## Connect the board with USB cable - -Exit the terminal session and connect the module to your host machine using the USB-C cable. +In this section, you will view a live camera feed with the ML application running. ## Download the Himax AI web toolkit diff --git a/content/learning-paths/microcontrollers/yolo-on-himax/how-to-5.md b/content/learning-paths/microcontrollers/yolo-on-himax/how-to-5.md index 9f3cfde92d..bc84d98418 100644 --- a/content/learning-paths/microcontrollers/yolo-on-himax/how-to-5.md +++ b/content/learning-paths/microcontrollers/yolo-on-himax/how-to-5.md @@ -1,26 +1,22 @@ --- title: Object detection and additional models -weight: 5 +weight: 6 ### FIXED, DO NOT MODIFY layout: learningpathall --- -TODO some more intro here, and showing how to test the model -Also double check this section +There are other computer vision applications to try. In this section, you will re-flash the module with a different one and check the results. ## Modify the Makefile -TODO: why are we doing this? 
- Change the directory to the where the Makefile is located. If you cloned the repository to a different location, replace $HOME with the path. ```bash cd $HOME/Seeed_Grove_Vision_AI_Module_V2/EPII_CM55M_APP_S/ ``` -Modify the `APP_TYPE` field in the Makefile from the default value of `allon_sensor_tflm` to one of the values in the table below - +The table shows the different options available to use with the web toolkit. Modify the `APP_TYPE` field in the Makefile to one of the values in the table. |APP_TYPE =|Description| |---|---| @@ -28,11 +24,22 @@ Modify the `APP_TYPE` field in the Makefile from the default value of `allon_sen |tflm_folov8_pose|Pose detection| |tflm_fd_fm|Face detection| + ## Regenerate the firmware image -Go back to the [Flash firmware onto the microcontroller](/learning-paths/microcontrollers/yolo-on-himax/how-to-3/) section and run the python command to regenerate the firmware image. +Now you can run `make` to re-generate the `.elf` file. + +```bash +make clean +make +``` +Use the command from [Flash firmware onto the microcontroller](/learning-paths/microcontrollers/yolo-on-himax/how-to-3/) section to run re-generate the firmware image. + +```bash +python xmodem\xmodem_send.py --port= --baudrate=921600 --protocol=xmodem --file=we2_image_gen_local\output_case1_sec_wlcsp\output.img +``` -The images below are examples images from the model. +The images below are captured images from the models run in the toolkit. 
### Objection detection ![object_detection](./object_detection.jpg) From 8f790bfadd9c7832676944ebfe087c042121288f Mon Sep 17 00:00:00 2001 From: Annie Tallund Date: Thu, 28 Nov 2024 18:23:10 +0100 Subject: [PATCH 3/7] Update YOLOv8 on Himax Learning Path --- .../{how-to-2.md => build-firmware.md} | 4 - .../yolo-on-himax/{how-to-1.md => dev-env.md} | 0 .../{how-to-3.md => flash-and-run.md} | 2 +- .../yolo-on-himax/how-to-4.md | 32 ------- .../yolo-on-himax/how-to-5.md | 52 ----------- .../yolo-on-himax/web-toolkit.md | 92 +++++++++++++++++++ 6 files changed, 93 insertions(+), 89 deletions(-) rename content/learning-paths/microcontrollers/yolo-on-himax/{how-to-2.md => build-firmware.md} (92%) rename content/learning-paths/microcontrollers/yolo-on-himax/{how-to-1.md => dev-env.md} (100%) rename content/learning-paths/microcontrollers/yolo-on-himax/{how-to-3.md => flash-and-run.md} (96%) delete mode 100644 content/learning-paths/microcontrollers/yolo-on-himax/how-to-4.md delete mode 100644 content/learning-paths/microcontrollers/yolo-on-himax/how-to-5.md create mode 100644 content/learning-paths/microcontrollers/yolo-on-himax/web-toolkit.md diff --git a/content/learning-paths/microcontrollers/yolo-on-himax/how-to-2.md b/content/learning-paths/microcontrollers/yolo-on-himax/build-firmware.md similarity index 92% rename from content/learning-paths/microcontrollers/yolo-on-himax/how-to-2.md rename to content/learning-paths/microcontrollers/yolo-on-himax/build-firmware.md index 6d5377363e..d2f5604b99 100644 --- a/content/learning-paths/microcontrollers/yolo-on-himax/how-to-2.md +++ b/content/learning-paths/microcontrollers/yolo-on-himax/build-firmware.md @@ -24,11 +24,7 @@ For the object detection to activate, you need to edit the project's `makefile`, ```bash cd EPII_CM55M_APP_S ``` -Open the file and scroll down until you find the `APP_TYPE` attribute. Update the value to `tflm_yolov8_od`. 
-```output -APP_TYPE = tflm_yolov8_od -``` Use the `make` build tool to compile the source code. This should take up to 10 minutes depending on the number of CPU cores available on your host machine. The result is an `.elf` file written to the directory below. ```bash diff --git a/content/learning-paths/microcontrollers/yolo-on-himax/how-to-1.md b/content/learning-paths/microcontrollers/yolo-on-himax/dev-env.md similarity index 100% rename from content/learning-paths/microcontrollers/yolo-on-himax/how-to-1.md rename to content/learning-paths/microcontrollers/yolo-on-himax/dev-env.md diff --git a/content/learning-paths/microcontrollers/yolo-on-himax/how-to-3.md b/content/learning-paths/microcontrollers/yolo-on-himax/flash-and-run.md similarity index 96% rename from content/learning-paths/microcontrollers/yolo-on-himax/how-to-3.md rename to content/learning-paths/microcontrollers/yolo-on-himax/flash-and-run.md index 67a3f530ab..71e16134fd 100644 --- a/content/learning-paths/microcontrollers/yolo-on-himax/how-to-3.md +++ b/content/learning-paths/microcontrollers/yolo-on-himax/flash-and-run.md @@ -92,4 +92,4 @@ b'person_score:112' b'EVT event = 10' ``` -You have now verified that the model works correctly in the firmware. The next step is to view it using a web toolkit. \ No newline at end of file +This means the image works correctly on the device, and the end-to-end flow is complete. \ No newline at end of file diff --git a/content/learning-paths/microcontrollers/yolo-on-himax/how-to-4.md b/content/learning-paths/microcontrollers/yolo-on-himax/how-to-4.md deleted file mode 100644 index b34e3e9e5c..0000000000 --- a/content/learning-paths/microcontrollers/yolo-on-himax/how-to-4.md +++ /dev/null @@ -1,32 +0,0 @@ ---- -title: Run and view model results -weight: 5 - -### FIXED, DO NOT MODIFY -layout: learningpathall ---- - -In this section, you will view a live camera feed with the ML application running. 
- -## Download the Himax AI web toolkit - -The Himax AI web toolkit enables a browser-based graphical user interface (GUI) for the live camera feed. - -```bash -wget https://github.com/HimaxWiseEyePlus/Seeed_Grove_Vision_AI_Module_V2/releases/download/v1.1/Himax_AI_web_toolkit.zip -unzip Himax_AI_web_toolkit.zip -``` - -Open the unzipped directory in your file browsing system and double click `index.html`. This will open the GUI within your default browser. - -## Connect to the Grove Vision AI - -Select `Grove Vision AI(V2)` in the top-right hand corner and press `Connect` button. Follow the instructions to set up the connection. - -![Himax web UI](./himax_web_ui.jpg) - -The image will run the YOLOv8 on your device. By using the camera to identify things from the [Common Objects in Context (COCO) dataset](https://cocodataset.org/#home), which the model has been trained on, you can put the it to the test. Get some common objects ready and move on to the next section. - -## View model results - -TODO have running section here? \ No newline at end of file diff --git a/content/learning-paths/microcontrollers/yolo-on-himax/how-to-5.md b/content/learning-paths/microcontrollers/yolo-on-himax/how-to-5.md deleted file mode 100644 index bc84d98418..0000000000 --- a/content/learning-paths/microcontrollers/yolo-on-himax/how-to-5.md +++ /dev/null @@ -1,52 +0,0 @@ ---- -title: Object detection and additional models -weight: 6 - -### FIXED, DO NOT MODIFY -layout: learningpathall ---- - -There are other computer vision applications to try. In this section, you will re-flash the module with a different one and check the results. - -## Modify the Makefile - -Change the directory to the where the Makefile is located. If you cloned the repository to a different location, replace $HOME with the path. - -```bash -cd $HOME/Seeed_Grove_Vision_AI_Module_V2/EPII_CM55M_APP_S/ -``` - -The table shows the different options available to use with the web toolkit. 
Modify the `APP_TYPE` field in the Makefile to one of the values in the table.
-
-|APP_TYPE =|Description|
-|---|---|
-|tflm_folov8_od|Object detection|
-|tflm_folov8_pose|Pose detection|
-|tflm_fd_fm|Face detection|
-
-
-## Regenerate the firmware image
-
-Now you can run `make` to re-generate the `.elf` file.
-
-```bash
-make clean
-make
-```
-Use the command from [Flash firmware onto the microcontroller](/learning-paths/microcontrollers/yolo-on-himax/how-to-3/) section to run re-generate the firmware image.
-
-```bash
-python xmodem\xmodem_send.py --port= --baudrate=921600 --protocol=xmodem --file=we2_image_gen_local\output_case1_sec_wlcsp\output.img
-```
-
-The images below are captured images from the models run in the toolkit.
-
-### Objection detection
-![object_detection](./object_detection.jpg)
-
-### Pose estimation
-![Pose estimation](./pose_estimation.jpg)
-
-### Face detection
-![object_detection](./face_detection.jpg)
-
diff --git a/content/learning-paths/microcontrollers/yolo-on-himax/web-toolkit.md b/content/learning-paths/microcontrollers/yolo-on-himax/web-toolkit.md
new file mode 100644
index 0000000000..ce39140aa4
--- /dev/null
+++ b/content/learning-paths/microcontrollers/yolo-on-himax/web-toolkit.md
@@ -0,0 +1,92 @@
+---
+title: (optional) Run additional models in the web toolkit
+weight: 6
+
+draft: true
+
+### FIXED, DO NOT MODIFY
+layout: learningpathall
+---
+
+In this section, you will view a live camera feed with the ML application running.
+
+## Modify the Makefile
+
+Change the directory to where the Makefile is located. If you cloned the repository to a different location, replace $HOME with the path.
+
+```bash
+cd $HOME/Seeed_Grove_Vision_AI_Module_V2/EPII_CM55M_APP_S/
+```
+
+The table shows the different options available to use with the web toolkit. Modify the `APP_TYPE` field in the `makefile` to one of the values in the table. 
+
+|APP_TYPE |Description |
+|--- |--- |
+|tflm_yolov8_od |Object detection |
+|tflm_yolov8_pose |Pose detection |
+|tflm_fd_fm |Face detection |
+
+
+## Regenerate the firmware image
+
+Now you can run `make` to regenerate the `.elf` file.
+
+```bash
+make clean
+make
+```
+
+Use the commands from the [Flash firmware onto the microcontroller](/learning-paths/microcontrollers/yolo-on-himax/flash-and-run/) section to regenerate the firmware image.
+
+```bash
+cd ../we2_image_gen_local/
+cp ../EPII_CM55M_APP_S/obj_epii_evb_icv30_bdv10/gnu_epii_evb_WLCSP65/EPII_CM55M_gnu_epii_evb_WLCSP65_s.elf input_case1_secboot/
+```
+
+### Linux
+
+```bash
+./we2_local_image_gen project_case1_blp_wlcsp.json
+```
+
+### macOS
+```console
+./we2_local_image_gen_macOS_arm64 project_case1_blp_wlcsp.json
+```
+
+Finally, use `xmodem` to flash the image.
+
+```bash
+python xmodem\xmodem_send.py --port= --baudrate=921600 --protocol=xmodem --file=we2_image_gen_local\output_case1_sec_wlcsp\output.img
+```
+
+Press the reset button when prompted before moving on.
+
+## Download the Himax AI web toolkit
+
+The Himax AI web toolkit enables a browser-based graphical user interface (GUI) for the live camera feed.
+
+```bash
+wget https://github.com/HimaxWiseEyePlus/Seeed_Grove_Vision_AI_Module_V2/releases/download/v1.1/Himax_AI_web_toolkit.zip
+unzip Himax_AI_web_toolkit.zip
+```
+
+Open the unzipped directory in your file browser and double-click `index.html`. This will open the GUI within your default browser.
+
+## Connect to the Grove Vision AI
+
+Select `Grove Vision AI(V2)` in the top right-hand corner and press the `Connect` button. Follow the instructions to set up the connection. Now you should see a video feed with a bounding box showing identified objects, poses, or faces.
+
+![Himax web UI](./himax_web_ui.jpg)
+
+The images below are captured images from the models run in the toolkit. 
+
+### Object detection
+![object_detection](./object_detection.jpg)
+
+### Pose estimation
+![Pose estimation](./pose_estimation.jpg)
+
+### Face detection
+![face_detection](./face_detection.jpg)

From e68e7dbc651e1f67e72ad4a77bacb6ed597b1ab3 Mon Sep 17 00:00:00 2001
From: Annie Tallund
Date: Thu, 28 Nov 2024 20:06:54 +0100
Subject: [PATCH 4/7] Temporarily disable draft

---
 .../learning-paths/microcontrollers/yolo-on-himax/_index.md | 6 +++---
 1 file changed, 3 insertions(+), 3 deletions(-)

diff --git a/content/learning-paths/microcontrollers/yolo-on-himax/_index.md b/content/learning-paths/microcontrollers/yolo-on-himax/_index.md
index 1dcf71ef7a..7666663b81 100644
--- a/content/learning-paths/microcontrollers/yolo-on-himax/_index.md
+++ b/content/learning-paths/microcontrollers/yolo-on-himax/_index.md
@@ -31,9 +31,9 @@ operatingsystems:
 - Linux
 - macOS
 
-draft: true
-cascade:
-  draft: true
+#draft: true
+#cascade:
+#  draft: true
 
 ### FIXED, DO NOT MODIFY

From d82be5f61918924f344acb2b748ac1565953c954 Mon Sep 17 00:00:00 2001
From: Annie Tallund
Date: Thu, 28 Nov 2024 20:08:44 +0100
Subject: [PATCH 5/7] Temporarily disable draft

---
 .../microcontrollers/yolo-on-himax/web-toolkit.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/content/learning-paths/microcontrollers/yolo-on-himax/web-toolkit.md b/content/learning-paths/microcontrollers/yolo-on-himax/web-toolkit.md
index ce39140aa4..6df3680264 100644
--- a/content/learning-paths/microcontrollers/yolo-on-himax/web-toolkit.md
+++ b/content/learning-paths/microcontrollers/yolo-on-himax/web-toolkit.md
@@ -2,7 +2,7 @@
 title: (optional) Run additional models in the web toolkit
 weight: 6
 
-draft: true
+#draft: true
 
 ### FIXED, DO NOT MODIFY
 layout: learningpathall

From a87b4e62b781a8775516201f7493fbde343cafd9 Mon Sep 17 00:00:00 2001
From: Annie Tallund
Date: Fri, 29 Nov 2024 13:26:00 +0100
Subject: [PATCH 6/7] Update web-toolkit.md

---
 .../yolo-on-himax/web-toolkit.md | 22 +++++++++++++------
 1 
file changed, 15 insertions(+), 7 deletions(-) diff --git a/content/learning-paths/microcontrollers/yolo-on-himax/web-toolkit.md b/content/learning-paths/microcontrollers/yolo-on-himax/web-toolkit.md index 6df3680264..0790b4ff35 100644 --- a/content/learning-paths/microcontrollers/yolo-on-himax/web-toolkit.md +++ b/content/learning-paths/microcontrollers/yolo-on-himax/web-toolkit.md @@ -18,13 +18,18 @@ Change the directory to the where the Makefile is located. If you cloned the rep cd $HOME/Seeed_Grove_Vision_AI_Module_V2/EPII_CM55M_APP_S/ ``` -The table shows the different options available to use with the web toolkit. Modify the `APP_TYPE` field in the `makefile` to one of the values in the table. +The table shows the different options available to use with the web toolkit. Modify the `APP_TYPE` field in the `makefile` to one of the values in the table. Then, to the xmodem argument, pass the `--model` argument. + +|APP_TYPE |Description | --model argument | +|--- |--- |--- +|tflm_yolov8_od |Object detection | model_zoo\tflm_yolov8_od\yolov8n_od_192_delete_transpose_0xB7B000.tflite 0xB7B000 0x00000 | +|tflm_yolov8_pose |Pose detection | model_zoo\tflm_yolov8_pose\yolov8n_pose_256_vela_3_9_0x3BB000.tflite 0x3BB000 0x00000 | +|tflm_fd_fm |Face detection | model_zoo\tflm_fd_fm\0_fd_0x200000.tflite 0x200000 0x00000 model_zoo\tflm_fd_fm\1_fm_0x280000.tflite 0x280000 0x00000 model_zoo\tflm_fd_fm\2_il_0x32A000.tflite 0x32A000 0x00000 | + +{{% notice Note %}} +For `tflm_fd_fm`, you need to pass all three models as separate `--model` arguments. +{{% /notice %}} -|APP_TYPE |Description | -|--- |--- | -|tflm_yolov8_od |Object detection | -|tflm_yolov8_pose |Pose detection | -|tflm_fd_fm |Face detection | ## Regenerate the firmware image @@ -57,7 +62,10 @@ cp ../EPII_CM55M_APP_S/obj_epii_evb_icv30_bdv10/gnu_epii_evb_WLCSP65/EPII_CM55M_ Finally, use `xmodem` to flash the image. 
```bash -python xmodem\xmodem_send.py --port= --baudrate=921600 --protocol=xmodem --file=we2_image_gen_local\output_case1_sec_wlcsp\output.img +python xmodem\xmodem_send.py --port= \ +--baudrate=921600 --protocol=xmodem \ +--file=we2_image_gen_local\output_case1_sec_wlcsp\output.img \ +--model= ``` Press the reset button when prompted before moving on. From b783561e0ad2d3a8b8292ed8e1840e9c71a5261c Mon Sep 17 00:00:00 2001 From: Annie Tallund Date: Mon, 2 Dec 2024 12:22:04 +0100 Subject: [PATCH 7/7] Update web-toolkit.md - Additional use-cases verified - Misc minor updates - Remove _review.md --- .../microcontrollers/yolo-on-himax/_index.md | 6 +-- .../microcontrollers/yolo-on-himax/_review.md | 38 ------------------- .../yolo-on-himax/build-firmware.md | 18 ++++----- .../microcontrollers/yolo-on-himax/dev-env.md | 32 +++++++--------- .../yolo-on-himax/flash-and-run.md | 25 ++++++------ .../yolo-on-himax/web-toolkit.md | 30 ++++++--------- 6 files changed, 51 insertions(+), 98 deletions(-) delete mode 100644 content/learning-paths/microcontrollers/yolo-on-himax/_review.md diff --git a/content/learning-paths/microcontrollers/yolo-on-himax/_index.md b/content/learning-paths/microcontrollers/yolo-on-himax/_index.md index 7666663b81..1dcf71ef7a 100644 --- a/content/learning-paths/microcontrollers/yolo-on-himax/_index.md +++ b/content/learning-paths/microcontrollers/yolo-on-himax/_index.md @@ -31,9 +31,9 @@ operatingsystems: - Linux - macOS -#draft: true -#cascade: -# draft: true +draft: true +cascade: + draft: true ### FIXED, DO NOT MODIFY diff --git a/content/learning-paths/microcontrollers/yolo-on-himax/_review.md b/content/learning-paths/microcontrollers/yolo-on-himax/_review.md deleted file mode 100644 index 98489df86d..0000000000 --- a/content/learning-paths/microcontrollers/yolo-on-himax/_review.md +++ /dev/null @@ -1,38 +0,0 @@ ---- -review: - - questions: - question: > - The Grove Vision AI V2 Module can run YOLOv8 model in real time - answers: - - True - - 
False - correct_answer: 1 - explanation: > - The Grove Vision AI V2 Module can run object detection in real time thanks to it's ML accelerated capabilites. - question: > - Which of the options is the YOLO model unable to run? - answers: - - Pose detection - - Object detection - - Speech-to-text transcription - correct_answer: 3 - explanation: > - The YOLO model is a computer vision model, meaning it runs based on images as input. - question: > - What Arm IP on the Seeed Grove Vision AI Module V2 enables you to run ML workloads efficiently? - answers: - - Ethos-U55 - - Cortex-A72 - - Cortex-X4 - correct_answer: 1 - explanation: > - When paired with the low-power Cortex-M55 processor, the Ethos-U55 provides an uplift in ML performance - - -# ================================================================================ -# FIXED, DO NOT MODIFY -# ================================================================================ -title: "Review" # Always the same title -weight: 20 # Set to always be larger than the content in this path -layout: "learningpathall" # All files under learning paths have this same wrapper ---- diff --git a/content/learning-paths/microcontrollers/yolo-on-himax/build-firmware.md b/content/learning-paths/microcontrollers/yolo-on-himax/build-firmware.md index d2f5604b99..ef89acd6d3 100644 --- a/content/learning-paths/microcontrollers/yolo-on-himax/build-firmware.md +++ b/content/learning-paths/microcontrollers/yolo-on-himax/build-firmware.md @@ -34,24 +34,24 @@ make ## Generate the firmware image -Copy the `.elf` file to the `input_case1_secboot` directory. +The examples repository contains scripts to generate the image file. Copy the `.elf` file to the `input_case1_secboot` directory. ```bash cd ../we2_image_gen_local/ cp ../EPII_CM55M_APP_S/obj_epii_evb_icv30_bdv10/gnu_epii_evb_WLCSP65/EPII_CM55M_gnu_epii_evb_WLCSP65_s.elf input_case1_secboot/ ``` -The examples repository contains scripts to generate the image. 
Run the script corresponding to the OS of your host machine. -### Linux +Run the script corresponding to the OS of your host machine. This will create a file named `output.img` in the `output_case1_sec_wlcsp` directory. -```bash -./we2_local_image_gen project_case1_blp_wlcsp.json -``` -### macOS -```console +{{< tabpane code=true >}} + {{< tab header="Linux" language="shell">}} +./we2_local_image_gen project_case1_blp_wlcsp.json + {{< /tab >}} + {{< tab header="MacOS" language="shell">}} ./we2_local_image_gen_macOS_arm64 project_case1_blp_wlcsp.json -``` + {{< /tab >}} +{{< /tabpane >}} Your terminal output should end with the following. diff --git a/content/learning-paths/microcontrollers/yolo-on-himax/dev-env.md b/content/learning-paths/microcontrollers/yolo-on-himax/dev-env.md index cd5ceb2f83..450297b752 100644 --- a/content/learning-paths/microcontrollers/yolo-on-himax/dev-env.md +++ b/content/learning-paths/microcontrollers/yolo-on-himax/dev-env.md @@ -67,18 +67,15 @@ git version 2.39.3 Install the make build tool, which is used to build the firmware in the next section. -### Linux - -```bash +{{< tabpane code=true >}} + {{< tab header="Linux" language="shell">}} sudo apt update sudo apt install make -y -``` - -### macOS - -```console + {{< /tab >}} + {{< tab header="MacOS" language="shell">}} brew install make -``` + {{< /tab >}} +{{< /tabpane >}} Successful installation of make will show the following when the `make --version` command is run. @@ -96,26 +93,25 @@ To run this learning path on macOS, you need to verify that your installation is {{% /notice %}} ## Install Arm GNU toolchain -### Linux - -The toolchain is used to cross-compile from the host architecture (x86) to the embedded device architecture (AArch64). +The toolchain is used to compile code from the host to the embedded device architecture. 
-```bash +{{< tabpane code=true >}} + {{< tab header="Linux" language="shell">}} cd $HOME wget https://developer.arm.com/-/media/Files/downloads/gnu/13.2.rel1/binrel/arm-gnu-toolchain-13.2.rel1-x86_64-arm-none-eabi.tar.xz tar -xvf arm-gnu-toolchain-13.2.rel1-x86_64-arm-none-eabi.tar.xz export PATH="$HOME/arm-gnu-toolchain-13.2.Rel1-x86_64-arm-none-eabi/bin/:$PATH" -``` -### macOS -```console + {{< /tab >}} + {{< tab header="MacOS" language="shell">}} cd $HOME wget https://developer.arm.com/-/media/Files/downloads/gnu/13.3.rel1/binrel/arm-gnu-toolchain-13.3.rel1-darwin-arm64-arm-none-eabi.tar.xz tar -xvf arm-gnu-toolchain-13.3.rel1-darwin-arm64-arm-none-eabi.tar.xz export PATH="$HOME/arm-gnu-toolchain-13.3.rel1-darwin-arm64-arm-none-eabi/bin/:$PATH" -``` + {{< /tab >}} +{{< /tabpane >}} {{% notice %}} -You can add the above command to the `.bashrc` file. This was, the Arm GNU toolchain is configured from new terminal sessions as well. +You can add the `export` command to the `.bashrc` file. This was, the Arm GNU toolchain is configured from new terminal sessions as well. {{% /notice %}} diff --git a/content/learning-paths/microcontrollers/yolo-on-himax/flash-and-run.md b/content/learning-paths/microcontrollers/yolo-on-himax/flash-and-run.md index 71e16134fd..78d03dcf5b 100644 --- a/content/learning-paths/microcontrollers/yolo-on-himax/flash-and-run.md +++ b/content/learning-paths/microcontrollers/yolo-on-himax/flash-and-run.md @@ -19,7 +19,7 @@ pip install -r xmodem/requirements.txt ## Connect the module -Insert the Flexible printed circuit (FPC) into the Grove Vision AI V2 module. Lift the dark grey latch on the connector as per the image below. +To prepare for the next steps, it's time to get the board set up. Insert the Flexible printed circuit (FPC) into the Grove Vision AI V2 module. Lift the dark grey latch on the connector as per the image below. 
![unlatched](./unlatched.jpg)
 
@@ -27,7 +27,7 @@ Then, slide the FPC connector in with the metal pins facing down and close the d
 
 ![latched](./latched.jpg)
 
-Then connect the Groove Vision AI V2 Module to your computer via the USB-C cable.
+Now you can connect the Grove Vision AI V2 Module to your computer via the USB-C cable.
 
 {{% notice Note %}}
 The development board may have two USB-C connectors. If you are running into issues connecting the board in the next step, make sure you are using the right one.
 {{% /notice %}}
 
@@ -37,17 +37,16 @@ You'll need to provide the communication port (COM) which the board is connected
 
-### Linux
-
-```bash
+{{< tabpane code=true >}}
+  {{< tab header="Linux" language="shell">}}
 sudo grep -i 'tty' /var/log/dmesg
-```
-
-### MacOS
-
-```console
+  {{< /tab >}}
+  {{< tab header="MacOS" language="shell">}}
 ls /dev/tty.*
-```
+  {{< /tab >}}
+{{< /tabpane >}}
+
 
 {{% notice Note %}}
 If the port seems unavailable, try changing the permissions temporarily using the `chmod` command. Be sure to reset them afterwards, as this may pose a computer security vulnerability.
 
@@ -64,11 +63,13 @@ The full path to the port is needed in the next step, so be sure to note it down
 
 ## Flash the firmware onto the module
 
 Run the python script below to flash the firmware.
```bash -python xmodem\xmodem_send.py --port= --baudrate=921600 --protocol=xmodem --file=we2_image_gen_local\output_case1_sec_wlcsp\output.img +python xmodem\xmodem_send.py --port= \ +--baudrate=921600 --protocol=xmodem \ +--file=we2_image_gen_local\output_case1_sec_wlcsp\output.img ``` {{% notice Note %}} -When you run other example models demonstrated in the later section [Object detection and additional models](/learning-paths/microcontrollers/yolo-on-himax/how-to-5/), you need to adapt this command with the right image file. +When you run other example models demonstrated in the later section [Run additional models in the web toolkit](/learning-paths/microcontrollers/yolo-on-himax/web-toolkit/), you need to adapt this command with `--model` argument. {{% /notice %}} After the firmware image burning is completed, the message `Do you want to end file transmission and reboot system? (y)` is displayed. Press the reset button indicated in the image below. diff --git a/content/learning-paths/microcontrollers/yolo-on-himax/web-toolkit.md b/content/learning-paths/microcontrollers/yolo-on-himax/web-toolkit.md index 0790b4ff35..f3ec67106b 100644 --- a/content/learning-paths/microcontrollers/yolo-on-himax/web-toolkit.md +++ b/content/learning-paths/microcontrollers/yolo-on-himax/web-toolkit.md @@ -1,14 +1,12 @@ --- -title: (optional) Run additional models in the web toolkit +title: Run additional models in the web toolkit weight: 6 -#draft: true - ### FIXED, DO NOT MODIFY layout: learningpathall --- -In this section, you will view a live camera feed with the ML application running. +In this section, you will view a live camera feed with a computer vision application running. ## Modify the Makefile @@ -18,12 +16,11 @@ Change the directory to the where the Makefile is located. If you cloned the rep cd $HOME/Seeed_Grove_Vision_AI_Module_V2/EPII_CM55M_APP_S/ ``` -The table shows the different options available to use with the web toolkit. 
Modify the `APP_TYPE` field in the `makefile` to one of the values in the table. Then pass the `--model` argument to the Python `xmodem` command.
 
 |APP_TYPE |Description | Model argument |
 |--- |--- |---
 |tflm_yolov8_od |Object detection | model_zoo\tflm_yolov8_od\yolov8n_od_192_delete_transpose_0xB7B000.tflite 0xB7B000 0x00000 |
-|tflm_yolov8_pose |Pose detection | model_zoo\tflm_yolov8_pose\yolov8n_pose_256_vela_3_9_0x3BB000.tflite 0x3BB000 0x00000 |
 |tflm_fd_fm |Face detection | model_zoo\tflm_fd_fm\0_fd_0x200000.tflite 0x200000 0x00000 model_zoo\tflm_fd_fm\1_fm_0x280000.tflite 0x280000 0x00000 model_zoo\tflm_fd_fm\2_il_0x32A000.tflite 0x32A000 0x00000 |
 
 {{% notice Note %}}
 For `tflm_fd_fm`, you need to pass all three models as separate `--model` arguments.
 {{% /notice %}}
 
@@ -47,17 +44,17 @@ Use the commands from [Flash firmware onto the microcontroller](/learning-paths/
 cd ../we2_image_gen_local/
 cp ../EPII_CM55M_APP_S/obj_epii_evb_icv30_bdv10/gnu_epii_evb_WLCSP65/EPII_CM55M_gnu_epii_evb_WLCSP65_s.elf input_case1_secboot/
 ```
+Run the script corresponding to the OS of your host machine.
 
-### Linux
-
-```bash
+{{< tabpane code=true >}}
+  {{< tab header="Linux" language="shell">}}
 ./we2_local_image_gen project_case1_blp_wlcsp.json
-```
-
-### macOS
-```console
+  {{< /tab >}}
+  {{< tab header="MacOS" language="shell">}}
 ./we2_local_image_gen_macOS_arm64 project_case1_blp_wlcsp.json
-```
+  {{< /tab >}}
+{{< /tabpane >}}
+
 
 Finally, use `xmodem` to flash the image.
 
@@ -92,9 +89,6 @@ The images below are captured images from the models run in the toolkit.
 
 ### Object detection
 ![object_detection](./object_detection.jpg)
 
-### Pose estimation
-![Pose estimation](./pose_estimation.jpg)
-
 ### Face detection
 ![face_detection](./face_detection.jpg)
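Reviewer note on the transfer step: the flashing commands in these patches invoke `xmodem_send.py` with `--protocol=xmodem`. As background only — this is an illustrative sketch of the classic 128-byte checksum variant of XMODEM framing, not the code in the Seeed `xmodem` folder, which may use the CRC-16 or 1K-block variants instead:

```python
# Sketch of classic XMODEM (128-byte, arithmetic-checksum variant) packet framing.
# Assumption: this illustrates the protocol in general, not the Himax script itself.

SOH = 0x01  # start-of-header byte for 128-byte blocks
PAD = 0x1A  # CPMEOF padding used to fill the final short block

def xmodem_packet(block_num: int, data: bytes) -> bytes:
    """Build one 132-byte packet: SOH, block number, its complement,
    128 data bytes, and a 1-byte arithmetic checksum."""
    if len(data) > 128:
        raise ValueError("XMODEM blocks carry at most 128 bytes")
    payload = data.ljust(128, bytes([PAD]))
    header = bytes([SOH, block_num & 0xFF, 0xFF - (block_num & 0xFF)])
    checksum = sum(payload) & 0xFF
    return header + payload + bytes([checksum])

pkt = xmodem_packet(1, b"firmware bytes")
print(len(pkt))  # 132
```

The receiver ACKs each packet before the next block number is sent, which is why the flashing step waits for the board before printing the reboot prompt.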