Rust for IoT: Is It Time?

Exploring Rust on the esp32-c3

Mattia Fiumara
Better Programming


Rust has been around for some time now, and we’ve all heard about the benefits of the language, in particular with regard to memory safety. I’ve written a fair amount of C code over the past years, but I’ve always wanted to use a more modern language, so when I first heard about Rust I hoped it would be the game changer. The language now seems to be picking up momentum, with Microsoft Azure’s CTO even going so far as to say we should not be starting new projects in C or C++ anymore.

I had some questions:

  • What does it look like in practice to start a new Rust-based project in the world of IoT, and specifically in bare-metal embedded development?
  • Is now the right time? Can we start using this promising language for new projects in a production environment, or do we still lack support for critical features? More importantly, how is the development experience in 2022 compared to programming in C?
  • Does Rust make us develop more maintainable applications faster while keeping our code safe?

In this article, I’ll explore these questions and gather my thoughts on the state of Rust for IoT in 2022 by zooming in on one of the most popular hardware platforms of today and (trying) to create a simple IoT application for it.

If you’d like to follow along with the code, the full code listing can be found here: rust-iot-2022.

Hardware and Setup

First things first: we need some hardware. Of all the IoT hardware platforms of today, Espressif’s ESP32 series has got to be one of the most popular around. With a large following in the hobbyist community, it has also gained solid ground in industrial IoT environments. I decided to try out the ESP32-C3 development kit, which is built around a RISC-V core and comes with Wi-Fi and Bluetooth capabilities.

When you start looking around for support, you’ll quickly end up with the ESP-IDF (IoT Development Framework) as the main development framework, which has some solid Rust bindings written for it by the open-source esp-rs community. To get set up, I simply followed the guide from this template project (make sure you have the dependencies listed there installed on your system) and ran the following commands:
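
The commands were embedded as a gist; assuming the template’s guide was followed as-is, they boil down to something like the sketch below. Exact tool versions and the generated project name will differ.

# Nightly toolchain plus the rust-src component, needed to build std for the ESP32-C3 target
rustup toolchain install nightly --component rust-src
# Tools used by the template: project generation, linking and flashing
cargo install cargo-generate ldproxy espflash espmonitor
# Generate a project from the esp-idf-template, then build and flash it
cargo generate --git https://github.com/esp-rs/esp-idf-template cargo
cd <your-project-name>
cargo build
espflash target/riscv32imc-esp-espidf/debug/<your-project-name>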

Rust setup and project generation

This bag of commands installs Rust’s nightly toolchain and all the dependencies needed to compile and run our application. Once all dependencies are installed, we generate the demo project, compile, flash, and finally run the application, all within a time span of six minutes. The ease of this setup compared to some project setups I’ve done in C or C++, where I had to manually google and install the right toolchain, is the first win for Rust and goes to show how well Rust’s build system works.

Development Workflow

For the sake of simplicity, I decided to start off with VS Code and installed the rust-analyzer extension, which gave me an out-of-the-box autocompleter. No more generating a compilation database from compile commands and always missing one of the system libraries: the Rust language server simply does the job and helps you along while you get used to the Rust syntax.

Now, let’s see what it’s like to navigate the Rust bindings for the ESP-IDF. I read through the documentation and let the Rust language server do its work, suggesting what to fill in for each function argument. To connect to my local Wi-Fi network, I ended up with something along the lines of the following:
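
The snippet was embedded as a gist, so what follows is a minimal sketch of a station connection with the esp-idf-svc bindings of that era. The SSID and password are placeholders, and constructor signatures have shifted between esp-idf-svc releases, so treat this as an illustration rather than the exact original code.

use std::sync::Arc;

use embedded_svc::wifi::{ClientConfiguration, Configuration, Wifi};
use esp_idf_svc::{netif::EspNetifStack, nvs::EspDefaultNvs, sysloop::EspSysLoopStack, wifi::EspWifi};

// System services required by the Wi-Fi driver.
let netif_stack = Arc::new(EspNetifStack::new()?);
let sys_loop_stack = Arc::new(EspSysLoopStack::new()?);
let default_nvs = Arc::new(EspDefaultNvs::new()?);

// Bring up the Wi-Fi driver and configure it as a station on the local network.
let mut wifi = EspWifi::new(netif_stack, sys_loop_stack, default_nvs)?;
wifi.set_configuration(&Configuration::Client(ClientConfiguration {
    ssid: "my-ssid".into(),         // placeholder
    password: "my-password".into(), // placeholder
    ..Default::default()
}))?;
// The driver now starts and associates; real code would wait until the
// connection is up (e.g. by polling wifi.get_status()) before continuing.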

Fairly straightforward. Let’s see how big our binary becomes and how fast the compile -> flash -> run loop is:

$ time cargo build
Compiling wireprobe-rs v0.1.0 (/Users/mfiumara/repos/esp-rust)
Finished dev [optimized + debuginfo] target(s) in 1.27s
1.31 real 1.15 user 0.15 sys
$ time espflash --speed 406800 target/riscv32imc-esp-espidf/debug/wireprobe-rs
Serial port: /dev/tty.usbserial-10
Connecting...
WARN setting baud rate higher than 115200 can cause issues.
Chip type: ESP32-C3 (revision 3)
Crystal frequency: 40MHz
Flash size: 4MB
Features: WiFi
MAC address: 10:91:a8:36:53:04
App/part. size: 1037024/4128768 bytes, 25.12%
[00:00:00] ######################################## 12/12 segment 0x0
[00:00:00] ######################################## 1/1 segment 0x8000
[00:00:25] ######################################## 584/584 segment 0x10000
Flashing has completed!
33.33 real 0.14 user 0.16 sys

Our binary comes out at approximately 1MB and takes around 40 seconds to compile and flash. That’s a pretty big binary image, which is explained by the Wi-Fi stack we linked in. As for the speed of flashing, this is limited by the maximum speed of the UART interface of 406800 bits/second. That’s not amazing, but it’s what we have to work with: this doesn’t seem to be a limitation of Rust but of the hardware.

Doing Something Useful

Now that we’ve managed to run our code on the target, let’s try to do something that’s actually useful, for instance making a connection to an MQTT broker and sending and receiving some messages. For this purpose, I added a couple of lines of code after initialising the Wi-Fi stack to set up and start an MQTT client:
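
The original lines were embedded as a gist; below is a sketch of what the client setup looks like with the esp-idf-svc MQTT bindings. The client id is a placeholder and the constructor name has varied between releases, so this illustrates the idea rather than reproducing the original code.

use esp_idf_svc::mqtt::client::{EspMqttClient, MqttClientConfiguration};

// Client id is a placeholder; the public EMQX broker requires no credentials.
let conf = MqttClientConfiguration {
    client_id: Some("esp32-c3-demo"),
    ..Default::default()
};

// We get back the client (for subscribing and publishing) and a connection
// handle that delivers incoming MQTT events.
let (mut client, mut connection) =
    EspMqttClient::new_with_conn("mqtt://broker.emqx.io:1883", &conf)?;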

I’m using the public broker by emqx.io, which has a nice online client where we can watch our messages come in and send messages to our device through a simple GUI to test our functionality.

Receiving and sending MQTT messages on emqx.io

Now, to actually send and receive messages, I had to do some digging around, since I couldn’t really figure out how to implement this from the documentation alone. Luckily, there is a pretty active community chat around the esp-rs bindings where this topic had come up in the chat history. In my first implementation, I tried to publish messages without handling any incoming events from the MQTT stack, which completely froze my application. With some reference material, I came up with a piece of code along the following lines:
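
Again, the listing lived in a gist; the sketch below reconstructs the approach described in the points that follow, using the embedded-svc MQTT traits of the time. Topic names and the payload are placeholders, and trait and method names have moved around between releases.

use std::{thread, time::Duration};

use embedded_svc::mqtt::client::{Client, Connection, Event, Message, Publish, QoS};

// 1. Spawn a thread and move the connection into it to handle incoming events.
thread::spawn(move || {
    while let Some(msg) = connection.next() {
        match msg {
            // 2. Pick out Received events and print the payload as a UTF-8 string.
            Ok(Event::Received(m)) => {
                println!("Received: {}", std::str::from_utf8(&m.data()).unwrap());
            }
            Ok(event) => println!("MQTT event: {:?}", event),
            Err(e) => println!("MQTT error: {:?}", e),
        }
    }
});

// 3. Back in the main thread: subscribe, then publish once per second.
client.subscribe("rust-iot-2022/command", QoS::AtMostOnce)?;
loop {
    client.publish(
        "rust-iot-2022/hello",
        QoS::AtMostOnce,
        false,
        "Hello from the esp32-c3!".as_bytes(),
    )?;
    thread::sleep(Duration::from_secs(1));
}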

This code looks a bit daunting at first, but once you work through the syntax, it showcases how powerful Rust can be compared to a more traditional C programming workflow:

  1. First, we spawn a thread using std::thread in which we listen for incoming MQTT events. No code is spent on defining stack sizes, configuring thread priorities, defining scheduling behavior, writing an entry function, or passing variables: we just call thread::spawn, pass a closure, move our connection variable into the thread, and we’re set.
  2. In the newly spawned thread, we handle MQTT events and look specifically for a Received event using match. We print the contents as a string to the console, this time taking advantage of std::str. In C, we’d have to do a manual memcpy and remember the string termination character, or risk printing past the end of our buffer, which illustrates some of Rust’s safety features.
  3. Finally, back in our main thread, we subscribe to a topic and create an infinite loop in which we continuously publish to an MQTT topic, with pauses of one second in between.

There is room for improvement here: we aren’t handling any unforeseen errors (simply unwrap()-ing them), and our nested match statement could do with a refactor, but as a proof of concept this suffices and shows how much we can do in very few lines of code. It almost feels surreal to realize that we’re still programming on a microcontroller; it feels more like we’re writing a cloud service. This is a good thing.
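
For instance, the event handling could be flattened with if let, which sidesteps the nested match and the unwrap() in one go. Again just a sketch, reusing the names from the snippet above:

// Inside the spawned thread: react only to Received events, ignore the rest,
// and fall back to a placeholder when the payload isn't valid UTF-8.
while let Some(msg) = connection.next() {
    if let Ok(Event::Received(m)) = msg {
        let payload = std::str::from_utf8(&m.data()).unwrap_or("<non-UTF-8 payload>");
        println!("Received: {}", payload);
    }
}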

Is It Time?

So… we’ve set up a simple Rust application on an esp32-c3 that successfully sends messages to and receives messages from a cloud application, in a pretty short period of time. We used Rust’s standard library to develop code faster while maintaining safety. What we didn’t do: discuss over-the-air updates, security and encryption, debugging capabilities, or memory optimisations, all topics that need to be evaluated before going all-in on Rust (not to mention testing, but that’s an article in itself). Looking back at the experience so far, however, and extrapolating from this humble project, I do think these topics could be tackled without excessive effort. So I’m calling it:

It is time.

Now, this doesn’t mean you need to start rewriting your whole code base from C/C++ into Rust tomorrow. It depends greatly on the kind of support that’s available for your chipset: is there an active community, is the source code actively maintained, is there official backing of the open-source efforts by the manufacturer, and are there good bindings? If the answer is yes, then by all means go for it.

There might be cases where you will need to resolve issues on your own and contribute back to the open-source community, but, if you don’t mind that, then adopting Rust now will prove a good return on investment in the long run.

Thanks for reading until the end!


I’m Tech Lead at Agurotech, where we create innovative solutions for the agriculture industry. I’m interested in embedded systems, IoT and Rust.