This is a WIP implementation/exploration of using modern Rust technologies for more convenient programming of Bluepill MCUs, with potential for porting to different boards with minimal effort. The idea is to have an easy-to-use Rust client for performing remote procedure calls (RPC) with the MCU, which is also what the `packet` package does in the original bluepill-dds project. There are 4 core dependencies used in the project, and it is useful to get to know them, along with some related reading:
- Embassy is a Rust framework for MCUs providing an async executor for bare-metal software. Programs are structured into tasks, with abstractions/HALs for the STM32, RP2040, nRF and ESP families.
- The Embedded Rust Book - a great collection of information about using Rust for embedded development. I recommend at least reading about the basic abstractions for peripherals, `#[no_std]`, and, if you have used C before, the tips for C developers and C interoperability.
- Embassy website and Embassy Book - most of the material above applies to pure embedded Rust, but we also use Embassy, which provides a very convenient set of abstractions. Checking out their examples on GitHub is also beneficial.
- Asynchronous Programming in Rust - both the firmware and the host client rely on async code, so some basics about how async works in Rust are also of use.
- Postcard RPC - a framework for efficient, type-safe communication with the MCU. Their repo is probably the best place to start, with a very good overview that also explains the project structure.
- PyO3 - a library for generating Python bindings from Rust code (see their guide).
- probe-rs - used for flashing firmware and debugging code running on the MCU.
Other interesting resources:
- Bluepill HAL docs
- Bluepill PAC docs
- Bluepill datasheet
- Workbook for Embedded Workshops
- The Embedonomicon
In comparison to C, the Rust ecosystem offers much more abstraction in order to deliver convenience and safety. The Embassy framework offers its own `embassy-stm32` HAL, as well as a crate with structs generated from the actual hardware description - the PAC. You can read more here (note that this book uses a different HAL and PAC). Usually, you should rely on the abstractions offered by the HAL, and resort to `embassy_stm32::pac` or the `cortex-m` crate only when necessary. The best place to start is the documentation linked above.
- Install Rust via rustup and `probe-rs` version `0.28.0` via:

  ```powershell
  powershell -ExecutionPolicy ByPass -c "irm https://github.com/probe-rs/probe-rs/releases/download/v0.28.0/probe-rs-tools-installer.ps1 | iex"
  ```
- Install the `uv` package manager for building Python bindings: https://docs.astral.sh/uv/getting-started/installation/
- If developing via WSL or a Dev Container, you need to bind and attach the USB bus via usbipd-win, but everything should also work directly on Windows.
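For the WSL route, the flow usually looks roughly like this (run in an elevated PowerShell on Windows; the bus ID below is an example, and the exact flags depend on your usbipd-win version):

```powershell
# List USB devices and note the bus ID of the ST-LINK / Bluepill
usbipd list
# Share the device and attach it to the running WSL distro
usbipd bind --busid 1-2
usbipd attach --wsl --busid 1-2
```

Afterwards, `probe-rs list` inside WSL should show the probe.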
This repo is structured into five distinct crates that are managed together in a Cargo workspace. To run commands in one of them, just use the `-p` flag, for example to compile the firmware:

```sh
cargo build -p firmware --release
```
Unfortunately, due to limitations of Cargo, the workspace uses nightly features for multi-target integration. Some things do not work perfectly, so we use `xtasks` for running commands rather than pure Cargo.
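For context, the xtask pattern is just a normal Rust binary invoked through a Cargo alias (something like `xtask = "run -p xtasks --"` in `.cargo/config.toml`). The sketch below is illustrative only; none of the task names, chip names or flags are taken from this repo:

```rust
// xtasks/src/main.rs - illustrative sketch of the cargo-xtask pattern.
use std::process::Command;

fn main() {
    let mut args = std::env::args().skip(1);
    match args.next().as_deref() {
        Some("flash") => {
            // Build the selected firmware binary, then flash it with probe-rs.
            let bin = args.next().unwrap_or_else(|| "minimal".to_string());
            run("cargo", &["build", "-p", "firmware", "--release", "--bin", bin.as_str()]);
            let elf = format!("target/thumbv7m-none-eabi/release/{bin}");
            // Chip name may need adjusting for your exact part.
            run("probe-rs", &["run", "--chip", "STM32F103C8", elf.as_str()]);
        }
        Some("pygen") => {
            // Build the Python bindings with maturin (exact flags are illustrative).
            run("maturin", &["develop", "--release"]);
        }
        _ => eprintln!("usage: cargo xtask <flash|pygen> [bin]"),
    }
}

fn run(cmd: &str, args: &[&str]) {
    let status = Command::new(cmd).args(args).status().expect("failed to spawn command");
    assert!(status.success(), "{cmd} {args:?} failed");
}
```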
Here is a brief description of each crate:
- `protocol` - here we define statically typed message formats, endpoints and topics (streamed data); a sketch is shown after this list. It is built automatically when building the dependent host and firmware crates.
- `firmware` - MCU code with different functionalities. Each binary is one firmware, and you select which one to flash by adding `--bin <name>` to the flash command.
- `host` - the host crate where the Rust clients are defined and Python bindings are then generated on top of them.
- `xtasks` - build scripts written in Rust, kind of like `make` (more info).
- `macros` - custom Rust macros (codegen tools), stored in a separate crate for convenience.
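To give an idea of what the `protocol` definitions look like, here is a hedged sketch. It assumes a postcard-rpc version that provides the `endpoint!` macro and the `Schema` derive from `postcard-schema`; the type names and path are made up rather than taken from this repo:

```rust
// Illustrative only - not the repo's actual schema.
use postcard_rpc::endpoint;
use postcard_schema::Schema;
use serde::{Deserialize, Serialize};

/// Request sent from the host to the MCU.
#[derive(Serialize, Deserialize, Schema, Debug, PartialEq)]
pub struct SetLed {
    pub on: bool,
}

/// Response sent back from the MCU.
#[derive(Serialize, Deserialize, Schema, Debug, PartialEq)]
pub struct LedState {
    pub on: bool,
}

// Ties the request and response types to a path, so the firmware handler
// and the host caller agree on one statically typed endpoint.
endpoint!(SetLedEndpoint, SetLed, LedState, "led/set");
```

The firmware then registers a handler for the endpoint and the host client calls it, with postcard taking care of the wire format.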
- Connect the Bluepill board via ST-LINK, but do not connect the 3V pin (bend it to the side)
- Connect the Bluepill board to the PC via USB.
- Flash it with `cargo xtask flash minimal` or any other binary.
- Build the Python bindings with the Maturin build tool: `cargo xtask pygen`
- Test the commands in the `test.py` file. Make sure you use the local virtual environment created by `uv`.
- Create copies of the `minimal.rs` files in the `protocol`, `firmware` and `host` crates.
- Rename them all to the same name, which will be the name of your binary from now on.
- Initialize the modules: add new module declarations for your binary in `host` and `protocol` (see the sketch after this list).
- Rename your host client struct and add it to the Python module
- Update the `protocol` imports in `firmware` and `host`.
- Now you can start developing: create a communication schema in `protocol`, then proceed to implementing the logic and handlers in `firmware` and the callers in `host`.
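For the module-initialization and Python-registration steps, the edits are usually as small as the following hypothetical sketch of `host/src/lib.rs` (the `protocol` crate needs an analogous `pub mod` line; all names here are made up, not the repo's actual ones):

```rust
// host/src/lib.rs - hypothetical sketch of registering a new binary's client.
use pyo3::prelude::*;

pub mod minimal;
pub mod my_new_binary; // the module you just copied and renamed

#[pymodule]
fn host(m: &Bound<'_, PyModule>) -> PyResult<()> {
    m.add_class::<minimal::MinimalClient>()?;
    m.add_class::<my_new_binary::MyNewBinaryClient>()?; // the renamed client struct
    Ok(())
}
```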
Install the probe-rs VS Code extension and set breakpoints in the code. Open the firmware binary code file, for example `minimal.rs`, and run the probe-rs debugger, or simply press F5.
There are still some issues in the config or the tool itself that need to be ironed out, though.
If you want to play with raw binaries (for example, to use the ST-LINK companion software or to analyze binary size), you can use some custom utilities from cargo-binutils:
```sh
# Install needed only once (cargo-binutils relies on the llvm-tools rustup component)
rustup component add llvm-tools
cargo install cargo-binutils
# Example commands include size, nm, objdump, strip...
cargo <cmd> -p firmware --bin <binary> --release
```
- expand the firmware and port more Cube code
- another crate for RP Pico firmware
- create a global defmt logger that sends data via a topic
- fix the `firmware` runner, which is ignored when using `forced-target`, i.e. find a way to make the Cargo workspace behave nicer (see the related cargo issue). Another option might be to exclude `firmware` from the workspace and use `linkedProjects` in Rust Analyzer
- test the scenario with multiple connected boards
Currently, we run a Tokio event loop inside the Rust binary, which is responsible for all of the Rust async logic. As our Python code does not really rely on `asyncio`, all the exposed Python bindings are synchronous; they are made so by wrapping every Rust async function body in `pyo3_async_runtimes::tokio::get_runtime().block_on(async move { /* code */ })`.
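Hand-expanded, a generated binding looks roughly like the following sketch (the function name and body are made up; only the wrapping pattern reflects the description above):

```rust
use pyo3::prelude::*;

// Roughly what the wrapping produces: the async body is driven to completion
// on the shared Tokio runtime before control returns to Python.
#[pyfunction]
fn ping(value: u32) -> PyResult<u32> {
    pyo3_async_runtimes::tokio::get_runtime().block_on(async move {
        // In the real client this would await an RPC round-trip to the MCU.
        Ok(value)
    })
}
```

A truly async variant would instead hand the future back to Python (e.g. via `pyo3_async_runtimes::tokio::future_into_py`), which is exactly where the limitations listed below come in.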
This is taken care of by the `blocking_async` macro, which makes all of the Python clients blocking. In the future (hehe), we might want to have non-blocking calls as well, which could be very useful for streams of data. It is also worth noting that IPython supports top-level await. However, the state of async in PyO3 is not ready for that yet:
- tracking issue PyO3/pyo3#1632
- async constructors not supported PyO3/pyo3#5176
- type stub generator ignores async https://github.com/Jij-Inc/pyo3-stub-gen
- there is no convenient API for streams PyO3/pyo3-async-runtimes#35, awestlake87/pyo3-asyncio#17 and tracking issue
- the `pyo3-asyncio` and `pyo3-async-runtimes` crates seem dead; development seems to be continuing in the main repo, under the `experimental-async` flag.
In conclusion, it might be better to wait until we really need async in Python. It is doable, but it makes sense to let the upstream design stabilize first.