Creating serverless applications is an exciting realm of software development, promising scalable and cost-effective solutions. In this article, we will build a serverless Lambda function using Rust, a language known for its performance and reliability. We will containerize the function with Docker and deploy it with Terraform, a popular infrastructure-as-code tool. Along the way, you will see how to integrate Rust with AWS Lambda, package the function as a container image, and orchestrate the deployment with Terraform.
Introduction
Serverless computing allows developers to build and run applications without managing servers. AWS Lambda is one of the most popular platforms for deploying serverless functions, supporting various programming languages. Rust, known for its safety and performance, is an excellent choice for creating reliable and efficient serverless functions.
Why a “Hello World” is Important
Starting with a “Hello World” program is a long-standing tradition in software development. It serves as the simplest version of an application, verifying that the basic setup works correctly. For our serverless function, it ensures that our environment is correctly configured to run Rust code in a Lambda context.
Starting the Project
Prerequisites:
- Rust toolchain installed (rustup)
- Docker and Docker Buildx set up
- AWS CLI configured
- Terraform CLI installed (we will run terraform init later)
- Basic knowledge of Cargo (Rust’s package manager)
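A quick way to confirm the toolchain is in place (output will differ from machine to machine) is to check each tool's version and your AWS identity:
rustc --version && cargo --version
docker --version && docker buildx version
aws --version
aws sts get-caller-identity   # confirms the CLI can reach your AWS account
terraform version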
To begin, we will create a new Rust project called lambda-hello-world:
cargo new lambda-hello-world
Next, we will add the core functionality in the src/main.rs file:
use lambda_runtime::{service_fn, Error, LambdaEvent};
use serde_json::{json, Value};
#[tokio::main]
async fn main() -> Result<(), Error> {
let func = service_fn(func);
lambda_runtime::run(func).await?;
Ok(())
}
async fn func(event: LambdaEvent<Value>) -> Result<Value, Error> {
let (event, _context) = event.into_parts();
let first_name = event["firstName"].as_str().unwrap_or("world");
Ok(json!({ "message": format!("Hello, {}!", first_name) }))
}
This handler accepts an arbitrary JSON event, reads the optional firstName field (falling back to "world" when it is missing), and responds with a personalized greeting.
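Before containerizing anything, you can exercise the handler locally with a small unit test. This is a minimal sketch that assumes LambdaEvent::new and Context::default are available, as they are in recent lambda_runtime releases; run it with cargo test:
#[cfg(test)]
mod tests {
    use super::*;
    use lambda_runtime::Context;

    #[tokio::test]
    async fn greets_by_first_name() {
        // Build a fake event; Context::default() stands in for the real Lambda context.
        let event = LambdaEvent::new(json!({ "firstName": "Mati" }), Context::default());
        let response = func(event).await.unwrap();
        assert_eq!(response["message"], "Hello, Mati!");
    }

    #[tokio::test]
    async fn falls_back_to_world() {
        // No firstName in the payload, so the handler should default to "world".
        let event = LambdaEvent::new(json!({}), Context::default());
        let response = func(event).await.unwrap();
        assert_eq!(response["message"], "Hello, world!");
    }
}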
Dependencies
We need to add several dependencies to our Cargo.toml to support our Lambda function:
[dependencies]
lambda_runtime = "0.11.1"
tokio = { version = "1", features = ["full"] }
serde_json = "1.0"
These dependencies include the AWS Lambda runtime for Rust, asynchronous runtime support with Tokio, and JSON handling with serde_json.
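If you prefer to add these from the command line, cargo add (built into Cargo since 1.62) produces equivalent entries; resolved versions may differ slightly from those shown above:
cargo add lambda_runtime@0.11.1
cargo add tokio --features full
cargo add serde_json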
The Dockerfile
Containerizing our Rust application ensures that it can run in the AWS Lambda environment. Here’s how to set up our Dockerfile:
FROM public.ecr.aws/lambda/provided:al2 as builder
# Install compiler and Rust
RUN yum install -y gcc gcc-c++ openssl-devel
RUN curl https://sh.rustup.rs -sSf | sh -s -- -y
ENV PATH="/root/.cargo/bin:${PATH}"
# Copy the project sources into the build stage
WORKDIR /usr/src/app
COPY . .
# Build the release
RUN rustup target add x86_64-unknown-linux-musl && \
cargo build --release --target x86_64-unknown-linux-musl
# Copy the built executable to the Lambda base image
FROM public.ecr.aws/lambda/provided:al2
COPY --from=builder /usr/src/app/target/x86_64-unknown-linux-musl/release/lambda-hello-world /var/runtime/bootstrap
CMD [ "bootstrap" ]
Here, musl is used to produce a fully statically-linked binary, which ensures compatibility with AWS Lambda’s minimal environment (based on Amazon Linux 2).
How to Build the Image
To build our Docker image, use the following command:
docker build -t lambda-hello-world .
If you are on an Apple silicon (ARM) Mac like me, build for the linux/amd64 platform instead, since the Dockerfile above targets x86_64:
docker buildx build --platform=linux/amd64 -t lambda-hello-world .
Pushing the Image
After building the image, we need to push it to AWS Elastic Container Registry (ECR):
aws ecr create-repository --repository-name lambda-hello-world
aws ecr get-login-password --region us-east-1 | docker login --username AWS --password-stdin <YOUR-ACCOUNT-ID>.dkr.ecr.us-east-1.amazonaws.com
docker tag lambda-hello-world:latest <YOUR-ACCOUNT-ID>.dkr.ecr.us-east-1.amazonaws.com/lambda-hello-world:latest
docker push <YOUR-ACCOUNT-ID>.dkr.ecr.us-east-1.amazonaws.com/lambda-hello-world:latest
You can also manage the ECR repository using Terraform for full IaC consistency. We used the CLI here for brevity and clarity.
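If you do want the repository itself under Terraform as well, a minimal sketch could look like the following (the resource name and scanning settings are illustrative choices, not requirements):
resource "aws_ecr_repository" "lambda_hello_world" {
  name                 = "lambda-hello-world"
  image_tag_mutability = "MUTABLE"

  image_scanning_configuration {
    scan_on_push = true
  }
}
You would then tag and push against aws_ecr_repository.lambda_hello_world.repository_url instead of typing the registry URI by hand.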
Writing Terraform
To deploy our function with Terraform, we define the AWS provider, the Lambda function, and an IAM role for execution:
provider "aws" {
region = "us-east-1"
}
resource "aws_lambda_function" "test" {
function_name = "rust_hello_world"
package_type = "Image"
image_uri = "<YOUR-ACCOUNT-ID>.dkr.ecr.us-east-1.amazonaws.com/lambda-hello-world:latest"
role = aws_iam_role.lambda_exec.arn
}
resource "aws_iam_role" "lambda_exec" {
name = "lambda_exec_role"
assume_role_policy = jsonencode({
Version = "2012-10-17"
Statement = [
{
Action = "sts:AssumeRole"
Principal = {
Service = "lambda.amazonaws.com"
}
Effect = "Allow"
},
]
})
}
output "lambda_function_name" {
value = aws_lambda_function.test.function_name
}
Suggestion: Consider using input variables for values like the Lambda function name and the image tag (latest) to improve flexibility and support environment-specific configurations.
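As a sketch of that suggestion (the variable names are illustrative), you could declare the values as input variables and interpolate them into the resources above:
variable "function_name" {
  type    = string
  default = "rust_hello_world"
}

variable "image_tag" {
  type    = string
  default = "latest"
}

# Then, inside aws_lambda_function.test:
#   function_name = var.function_name
#   image_uri     = "<YOUR-ACCOUNT-ID>.dkr.ecr.us-east-1.amazonaws.com/lambda-hello-world:${var.image_tag}"
A new image tag can then be rolled out with terraform apply -var="image_tag=v2" rather than by editing the configuration.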
Deploy the infrastructure:
terraform init
terraform apply
Invoke the Function with AWS CLI
Once deployed, you can invoke the Lambda function using the AWS CLI to see it in action:
aws lambda invoke \
--function-name "$(terraform output -raw lambda_function_name)" \
--cli-binary-format raw-in-base64-out \
--payload '{"firstName":"Mati"}' \
response.json
cat response.json
{"message":"Hello, Mati!"}
Conclusion
In this guide, we walked through creating a serverless Lambda function using the Rust programming language, containerizing it with Docker, and deploying it using Terraform. We also covered how to build and push the Docker image to Amazon ECR, how to define the Lambda function and its IAM execution role in Terraform, and how to invoke the deployed function with the AWS CLI.
Using Rust for AWS Lambda functions offers a robust foundation for building efficient and scalable serverless applications. This setup combines Rust’s performance benefits with the serverless architecture’s scalability and cost-effectiveness, letting developers deploy high-performance applications with little operational overhead.