cargo-cloud: Building Rust in the cloud for low-end devices
So far, all of the blog posts on this site have been written in Markdown, using Vim, from Termux, running as an app on Android, on my low-end Honor Pad tablet. The specs on this thing aren't too great, but for light code/text editing in Vim with a Bluetooth keyboard and headboard mount, it works just fine. After all, you are reading a finished blog post, so it must work okay.
But then I tried to compile a Rust application, in particular Zola (a blogging framework written in Rust). The tablet spent over 20 minutes trying to compile dependencies before it ran out of memory and crashed... As it turns out, building Rust programs requires a lot of RAM, which this machine just doesn't have.
That's fine, just download a precompiled binary from GitHub, you say?
Let's just say that not a lot of repositories compile specifically for an `aarch64-linux-android` target.
I tried to build a couple of my own projects, and they yielded similar results. Even `hello_world` took around 10 seconds to compile.
Building for aarch64-linux-android
I quickly realised if I wanted to build for my tablet, it definitely wasn't going to be on my tablet.
So I had a few options:
- Build on my MacBook through docker and transfer the files via SSH to the tablet. 🥱 ❌
- Fork the repository and use a GitHub action to build and release the compiled binary. 🥱 ❌
- Build a custom CLI extension for `cargo` to allow building and installing a Rust codebase in the cloud, using S3, a Docker-based Lambda and API Gateway WebSockets. 👀 ✅
Backend
This fully serverless and event-driven architecture costs nothing to run. All usage falls under the free tier provided by AWS.
The processing starts when `Cargo.toml`, `Cargo.lock` and everything under `src/` are zipped and uploaded to S3.
The Lambda function that performs the compilation is a Docker container with Rust, the Android SDK and cargo-ndk installed. It is configured with 10GB of RAM and 10GB of local storage (the local storage comes in handy for caching cargo dependencies locally).
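As a rough sketch of what the build stage inside that container might look like (assuming `cargo-ndk` is available on the `PATH`; the cache path and function name are hypothetical), the Lambda can shell out to `cargo ndk` and point `CARGO_HOME` at the ephemeral volume so dependencies survive across warm invocations:

```rust
use std::process::Command;

/// Run the cross-compile step inside the Lambda's working directory.
/// Pointing CARGO_HOME at the ephemeral storage under /tmp lets crates.io
/// downloads be reused by later invocations of the same warm container.
fn run_build(project_dir: &str, target: &str) -> std::io::Result<bool> {
    let status = Command::new("cargo")
        .args(["ndk", "--target", target, "build", "--release"])
        .current_dir(project_dir)
        .env("CARGO_HOME", "/tmp/cargo-home") // hypothetical cache path on local storage
        .status()?;
    Ok(status.success())
}
```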
When the source zip is successfully uploaded to S3, the Compile Lambda downloads it. It parses various build arguments out of the object's metadata before moving on to the build stage and subsequently zipping the output for upload to the S3 output bucket.
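The download-and-parse step could look roughly like this, assuming a recent version of the official `aws-sdk-s3` crate (the function name is illustrative):

```rust
use std::collections::HashMap;

use aws_sdk_s3::Client;

/// Fetch the uploaded source archive plus the build arguments that the CLI
/// attached as user-defined metadata on the S3 object.
async fn fetch_source(
    client: &Client,
    bucket: &str,
    key: &str,
) -> Result<(Vec<u8>, HashMap<String, String>), aws_sdk_s3::Error> {
    let resp = client.get_object().bucket(bucket).key(key).send().await?;

    // `target`, `features` and `binary` come back here (S3 lower-cases the keys).
    let build_args = resp.metadata().cloned().unwrap_or_default();

    let bytes = resp
        .body
        .collect()
        .await
        .expect("failed to read object body")
        .into_bytes();

    Ok((bytes.to_vec(), build_args))
}
```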
Throughout each stage of the Lambda's lifecycle it is constantly pushing status updates and `cargo build` output onto EventBridge; this enables asynchronous communication with the CLI by proxying through an API Gateway WebSocket.
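A minimal sketch of pushing one of those status updates, assuming a recent `aws-sdk-eventbridge` crate (the event source name and detail shape are made up for illustration):

```rust
use aws_sdk_eventbridge::{types::PutEventsRequestEntry, Client};

/// Publish a build status update; an EventBridge rule forwards it to the
/// API Gateway WebSocket connection that the CLI is listening on.
async fn push_status(
    client: &Client,
    build_id: &str,
    message: &str,
) -> Result<(), aws_sdk_eventbridge::Error> {
    let entry = PutEventsRequestEntry::builder()
        .source("cargo-cloud.compiler") // hypothetical event source name
        .detail_type("BuildStatus")
        .detail(format!(
            r#"{{"buildId":"{build_id}","message":"{message}"}}"#
        ))
        .build();

    client.put_events().entries(entry).send().await?;
    Ok(())
}
```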
Cargo Cloud CLI
When the CLI binary is installed, it has the name `cargo-cloud`; this allows it to be run as an extension of cargo: `cargo cloud build`.
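This works because cargo looks for executables named `cargo-<subcommand>` on the `PATH` and invokes them with the subcommand name as the first argument, so a stripped-down entry point could look like this (argument handling is simplified):

```rust
// main.rs of the `cargo-cloud` binary.
// `cargo cloud build` ends up invoking `cargo-cloud cloud build`,
// so the first argument after the program name is the subcommand itself.
fn main() {
    let mut args = std::env::args().skip(1);

    // Skip the leading "cloud" that cargo passes through.
    if args.next().as_deref() != Some("cloud") {
        eprintln!("this binary is meant to be invoked as `cargo cloud ...`");
        std::process::exit(1);
    }

    match args.next().as_deref() {
        Some("build") => println!("would zip, upload and stream the remote build here"),
        other => eprintln!("unknown subcommand: {:?}", other),
    }
}
```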
The CLI zips the files (adhering to the .gitignore) and pushes them to the input S3 bucket. Alongside the zipped data it also writes some metadata onto the S3 object:
- `target`: the architecture we are trying to build for.
- `features`: the feature flags we want to compile with.
- `binary`: the binary we want to build and install.
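A minimal sketch of that zip-and-upload step, assuming the `ignore`, `zip` and `aws-sdk-s3` crates (the object key layout and function signature are illustrative):

```rust
use aws_sdk_s3::{primitives::ByteStream, Client};
use ignore::WalkBuilder;
use std::io::Write;
use zip::{write::FileOptions, ZipWriter};

/// Zip the project (respecting .gitignore via the `ignore` crate) and
/// upload it with the build arguments attached as object metadata.
async fn upload_source(
    client: &Client,
    bucket: &str,
    build_id: &str,
    target: &str,
    features: &str,
    binary: &str,
) -> anyhow::Result<()> {
    // Build the archive in memory; a real implementation might stream it.
    let mut zip = ZipWriter::new(std::io::Cursor::new(Vec::new()));
    for entry in WalkBuilder::new(".").build().filter_map(Result::ok) {
        let path = entry.path();
        if path.is_file() {
            zip.start_file(path.to_string_lossy(), FileOptions::default())?;
            zip.write_all(&std::fs::read(path)?)?;
        }
    }
    let bytes = zip.finish()?.into_inner();

    client
        .put_object()
        .bucket(bucket)
        .key(format!("{build_id}.zip")) // hypothetical key layout
        .metadata("target", target)
        .metadata("features", features)
        .metadata("binary", binary)
        .body(ByteStream::from(bytes))
        .send()
        .await?;
    Ok(())
}
```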
Once the upload is successful, the client establishes a WebSocket connection to the API Gateway, providing the `id` of the build as a query string parameter.
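Listening for those messages could look something like this, assuming the `tokio-tungstenite` and `futures-util` crates (the endpoint URL is a placeholder):

```rust
use futures_util::StreamExt;
use tokio_tungstenite::connect_async;

/// Stream build status messages until the server closes the connection.
/// The build id travels as a query string parameter on the WebSocket URL.
async fn stream_build_logs(build_id: &str) -> anyhow::Result<()> {
    let url = format!(
        "wss://example.execute-api.eu-west-1.amazonaws.com/prod?id={build_id}"
    );
    let (mut socket, _response) = connect_async(url).await?;

    while let Some(message) = socket.next().await {
        let message = message?;
        if message.is_text() {
            // Each frame carries a status update or a chunk of cargo output.
            println!("{}", message.into_text()?);
        }
    }
    Ok(())
}
```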
The client is now in constant communication with the Lambda that is building the binary. Once the Lambda communicates that the build is finished, the client attempts to download the zipped file from the output bucket, then installs the binary into `~/.cargo/bin` and runs `chmod +x` to make it executable.
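The install step boils down to a copy plus a permissions change, sketched here for a Unix-like environment such as Termux (paths and names are illustrative):

```rust
use std::os::unix::fs::PermissionsExt;
use std::path::Path;

/// Copy the downloaded binary into ~/.cargo/bin and mark it executable,
/// the equivalent of running `chmod +x` on it.
fn install_binary(extracted: &Path, name: &str) -> std::io::Result<()> {
    let home = std::env::var("HOME").expect("HOME is not set");
    let dest = Path::new(&home).join(".cargo").join("bin").join(name);

    std::fs::copy(extracted, &dest)?;

    let mut perms = std::fs::metadata(&dest)?.permissions();
    perms.set_mode(0o755); // rwxr-xr-x
    std::fs::set_permissions(&dest, perms)?;
    Ok(())
}
```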
Security Concerns
As Rust allows projects to supply a `build.rs` file, which runs custom code at build time, a malicious party could potentially steal credentials or worse. Because of this concern, I will not be providing this as a service. I will, however, provide all of the code used in both the backend and the CLI to allow others to host this service themselves.
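To make the risk concrete, a hostile `build.rs` needs only a few lines to go looking for secrets on whatever machine compiles it; this is purely illustrative:

```rust
// build.rs: cargo compiles and runs this on the build host before the crate itself.
fn main() {
    // An attacker-controlled dependency could read credentials that happen
    // to live on the build machine...
    let home = std::env::var("HOME").unwrap_or_default();
    if let Ok(creds) = std::fs::read_to_string(format!("{home}/.aws/credentials")) {
        // ...and exfiltrate them over the network (omitted here).
        let _ = creds;
    }
}
```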