weight: 7 # 1 is first, 2 is second, etc.
layout: "learningpathall"
---

With your environment ready, you can create a simple program to test the setup.

This example defines a small feedforward neural network for a classification task. The model consists of 2 linear layers with ReLU activation in between.
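A model matching this description can be sketched as follows. The input and hidden sizes below are illustrative assumptions; only the two-class output is implied by the run log later on this page.

```python
import torch
import torch.nn as nn

class SimpleNN(nn.Module):
    # Two linear layers with a ReLU activation in between, as described above.
    # input_size and hidden_size are assumptions for illustration;
    # num_classes=2 matches the [1, 2] output tensor shown later.
    def __init__(self, input_size: int = 10, hidden_size: int = 16, num_classes: int = 2):
        super().__init__()
        self.fc1 = nn.Linear(input_size, hidden_size)
        self.relu = nn.ReLU()
        self.fc2 = nn.Linear(hidden_size, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.fc2(self.relu(self.fc1(x)))

model = SimpleNN().eval()
example_input = torch.randn(1, 10)
print(model(example_input).shape)  # torch.Size([1, 2])
```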

Run the model from the Linux command line:

```bash
python3 simple_nn.py
```

The model is saved as a .pte file, which is the format used by ExecuTorch for deployment.

To run the ExecuTorch version, first build the executable:

```bash
# Clean and configure the build system
(rm -rf cmake-out && mkdir cmake-out && cd cmake-out && cmake ..)

# Build the executor_runner target
cmake --build cmake-out --target executor_runner -j$(nproc)
```

You will see the build output, which ends with:

```output
[100%] Linking CXX executable executor_runner

When the build is complete, run the executor_runner with the model as an argument:

```bash
./cmake-out/executor_runner --model_path simple_nn.pte
```

```output
Output 0: tensor(sizes=[1, 2], [-0.105369, -0.178723])
```

When the model execution completes successfully, you’ll see confirmation messages similar to those above, indicating successful loading, inference, and output tensor shapes.
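Because the network is untrained, the exact numbers will differ from run to run. As a sketch, the two values in the output tensor can be read as class logits; softmax turns them into probabilities and argmax picks the predicted class:

```python
import math

# Raw logits from the example run log above (one row, two classes).
logits = [-0.105369, -0.178723]

# Softmax converts logits into class probabilities.
exps = [math.exp(v) for v in logits]
total = sum(exps)
probs = [v / total for v in exps]

# Argmax picks the predicted class: the larger logit wins.
predicted_class = max(range(len(probs)), key=probs.__getitem__)

print(predicted_class)  # 0
```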

## Running the model on the Corstone-300 FVP


Run the model using:

```bash
FVP_Corstone_SSE-300_Ethos-U55 -a simple_nn.pte -C mps3_board.visualisation.disable-visualisation=1
```

{{% notice Note %}}

The `-C mps3_board.visualisation.disable-visualisation=1` option disables the FVP GUI, which can speed up launch time for the FVP.

The FVP can be terminated with Ctrl+C.
{{% /notice %}}





You've now set up your environment for TinyML development and tested a neural network with both PyTorch and ExecuTorch.

If you don't have the Grove AI vision board, use the Corstone-300 FVP and proceed to [Environment Setup Corstone-300 FVP](/learning-paths/microcontrollers/introduction-to-tinyml-on-arm/env-setup-6-fvp/)

If you have the Grove board proceed to [Setup on Grove - Vision AI Module V2](/learning-paths/embedded-and-microcontrollers/introduction-to-tinyml-on-arm/setup-7-grove/)
Test that the setup was successful by running the `run.sh` script.
./run.sh
```

You will see a number of examples run on the FVP. This means you can proceed to the next section [Build a Simple PyTorch Model](/learning-paths/embedded-and-microcontrollers/introduction-to-tinyml-on-arm/build-model-8/) to test your environment setup.
Grove Vision V2 [Edge Impulse Firmware](https://cdn.edgeimpulse.com/firmware/see

![Board connection](Connect.png)

{{% notice Note %}}
Ensure the board is properly connected and recognized by your computer.
{{% /notice %}}

3. In the extracted Edge Impulse firmware folder, locate and run the installation script to flash your device:

```bash
./flash_linux.sh
```

4. Configure Edge Impulse for the board. In your terminal, run:

```bash
edge-impulse-daemon
```
Follow the prompts to log in.

5. If successful, you should see your Grove - Vision AI Module V2 under 'Devices' in Edge Impulse.


## Next Steps
1. Go to [Build a Simple PyTorch Model](/learning-paths/embedded-and-microcontrollers/introduction-to-tinyml-on-arm/build-model-8/) to test your environment setup.
