Ryan Dsouza

Rust on Lambda using the CDK

Introduction

AWS introduced a Rust runtime for Lambda, and since then you can run Rust code directly in your Lambda functions.

In this post, we will explore how to run a Rust function on Lambda, deployed using the CDK.

Prerequisites
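
To follow along, you'll need the Rust toolchain with the x86_64-unknown-linux-musl target added (for example via rustup target add x86_64-unknown-linux-musl), Node.js with yarn or npm, and the AWS CDK CLI with AWS credentials configured for deployment. You can clone the project from the repo below: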

Repo: ryands17 / rust-lambda

This is a CDK project that deploys a Rust function using the Rust runtime for Lambda. This creates a basic handler that logs some data sent as input

Constructs

In this application, we just have a single construct in which we create a Lambda function with a custom runtime.

// lib/rust-lambda-stack.ts

new lambda.Function(this, 'rust-hello', {
  description: 'Deploying a Rust function on Lambda using the custom runtime',
  code: lambda.Code.fromAsset(
    'resources/target/x86_64-unknown-linux-musl/release/lambda'
  ),
  runtime: lambda.Runtime.PROVIDED_AL2,
  handler: 'not.required',
  environment: {
    RUST_BACKTRACE: '1',
  },
  logRetention: RetentionDays.ONE_WEEK,
})

The above snippet creates a new Lambda function whose code is loaded from a specific folder (it assumes the usual lambda and RetentionDays imports from the CDK's aws-lambda and aws-logs modules). We will see how this folder is produced in the section where we deploy the application.

The runtime here is PROVIDED_AL2, the custom runtime provided by Lambda (based on Amazon Linux 2) that lets us run our Rust application.

Our code will be an executable built by Rust, so we do not need a real handler: the custom runtime simply executes the bootstrap file in the deployment package. That's why we pass a placeholder like not.required to handler, as it's a required prop.

The remaining options are environment and logRetention. The RUST_BACKTRACE variable is set in environment so that we get the entire stack trace of where an error occurred if there was one, and logRetention keeps the function's CloudWatch logs for one week.

Let's move on to defining the Rust handler where we install necessary dependencies and create the function.

Rust setup

The Cargo.toml file is where we declare the required dependencies and some other metadata. Think of this as a package.json for Rust.

# resources/Cargo.toml

[dependencies]
lambda_runtime = "0.3.0"
log = "0.4.14"
serde_json = "1.0.64"
simple_logger = "1.11.0"
tokio = {version = "1", features = ["full"]}

[[bin]]
name = "bootstrap"
path = "src/main.rs"

In this file, we specify the dependencies our application needs. The lambda_runtime crate is the one that will run the handler we pass to it; we will see the rest of the dependencies in use inside the function.

In the [[bin]] section, we set the name our executable will have after it's built. We name it bootstrap because that's the file the Lambda custom runtime expects to execute.

Now let's look at the file which contains the function that will be executed.

// resources/src/main.rs

use lambda_runtime::{handler_fn, Context, Error};
use log::LevelFilter;
use serde_json::{json, Value};
use simple_logger::SimpleLogger;

#[tokio::main]
async fn main() -> Result<(), Error> {
  SimpleLogger::new()
    .with_level(LevelFilter::Info)
    .init()
    .unwrap();

  let func = handler_fn(handler);
  lambda_runtime::run(func).await?;
  Ok(())
}

This is the main function that will be the entry point of our Lambda. It returns either an empty result (()) or a lambda_runtime Error, as defined in the return type.

We have also added the #[tokio::main] attribute on top of our main function, which lets us make main asynchronous by running it on the Tokio runtime.

Then we initialise a logger via simple_logger, which lets us log our messages in a nicer format. These logs will be stored in CloudWatch Logs.

The final lines wrap our handler function (which we will look at below) in handler_fn and pass it to lambda_runtime::run, which accepts this handler and runs our function for each incoming event.

Notice the await? after the run call and the async before our main function. You can compare this with the async/await flow in Node.js that lets us wait for Promises in a synchronous-looking style. In Rust this is built on the Future trait, and the ? propagates any error that run returns.
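As a standalone sketch (not part of the project code), here is how .await and ? compose in an async function; lookup_name is a made-up helper that can fail:

// a minimal, hypothetical example of the `.await?` pattern
use lambda_runtime::Error;

// pretend this is some asynchronous work that may fail
async fn lookup_name(id: u32) -> Result<String, Error> {
  if id == 0 {
    return Err("no such id".into());
  }
  Ok(format!("user-{}", id))
}

async fn greet(id: u32) -> Result<String, Error> {
  // `.await` waits for the future to resolve; `?` returns early with the error
  // if it resolved to an Err, much like awaiting a rejected Promise in Node.js
  let name = lookup_name(id).await?;
  Ok(format!("Hello, {}!", name))
}

#[tokio::main]
async fn main() -> Result<(), Error> {
  let greeting = greet(17).await?; // would bail out here if lookup_name failed
  println!("{}", greeting);
  Ok(())
}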

Finally, let's look at the handler that we pass above to handler_fn.

// resources/src/main.rs

async fn handler(event: Value, _: Context) -> Result<Value, Error> {
  let message = event["message"].as_str().unwrap_or("world");
  let first_name = event["firstName"].as_str().unwrap_or("Anonymous");

  let response = format!("Hello, {}! Your name is {}", message, first_name);
  log::info!("{}", response);

  Ok(json!({ "response": response }))
}

This function returns a Value, which is simply JSON containing a response field. The handler takes the same parameters any normal Lambda handler does: the event and the context.

We use the serde_json library to work with JSON. The event parameter is of type Value, which means it can be any valid JSON value, so we can index into it without defining a struct for the input.

We set the message and first_name variables to the message and firstName values received from event respectively. as_str() returns the value only if it actually is a string, and unwrap_or provides a default in case the field is missing or not a string.
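Here is a small standalone sketch (not part of the handler) showing how those two calls behave for a present, a wrongly typed, and a missing field:

// standalone sketch of as_str() + unwrap_or on a serde_json::Value
use serde_json::json;

fn main() {
  let event = json!({ "message": "hi", "firstName": 42 });

  // field present and a string: we get it back
  assert_eq!(event["message"].as_str().unwrap_or("world"), "hi");

  // field present but not a string: fall back to the default
  assert_eq!(event["firstName"].as_str().unwrap_or("Anonymous"), "Anonymous");

  // field missing entirely: indexing yields Null, so we also fall back
  assert_eq!(event["lastName"].as_str().unwrap_or("n/a"), "n/a");
}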

We then build the response via the format! macro, which creates a string from the values obtained, and log it with log::info!. This will be visible in CloudWatch under our function's log group.

The final line returns a response object via the json! macro which constructs a JSON object from the values we pass to it.

Deploying the application

And that's it for our application, so let's deploy it via yarn deploy or npm run deploy.

This command runs cdk deploy, but before that it builds our Rust application via the following script:

#!/bin/bash

cd resources
cargo build --release --target x86_64-unknown-linux-musl
(cd target/x86_64-unknown-linux-musl/release && mkdir -p lambda && cp bootstrap lambda/)

This runs cargo build with the --release flag so that our executable is optimised, targeting x86_64-unknown-linux-musl so the binary runs on Amazon Linux, and then copies the bootstrap executable into a folder named lambda. This lambda folder is the one referenced when creating our Lambda construct as follows:

// lib/rust-lambda-stack.ts

code: lambda.Code.fromAsset(
  'resources/target/x86_64-unknown-linux-musl/release/lambda'
)
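For reference, after the build script runs, the folder passed to Code.fromAsset should contain just the single executable:

resources/target/x86_64-unknown-linux-musl/release/lambda/bootstrap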

Once yarn deploy completes, we will be able to see our function in the Lambda console.

Testing the Lambda

Let's run the function by creating a sample test event.
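A minimal event like this will do (the values are just examples):

{
  "message": "from the console",
  "firstName": "Ryan"
}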

Creating a test event

We pass in the message and firstName and create the test event. Let's invoke this function and check if it has succeeded.

Running the above test event

We can see that the response is returned to us correctly. It's also logged in the Log Output, and the same line will be visible in CloudWatch Logs for this function.

Let's edit the test event and remove the message field.

Editing the test event

On invoking the function, we will see that it falls back to the default value we specified for the message field and returns the output as follows:

Running the edited test event
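Given the handler's defaults, the response should look something like this (the firstName value here is just whatever was left in the test event):

{
  "response": "Hello, world! Your name is Ryan"
}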

Conclusion

This is how we can run a Rust application on Lambda with the help of a custom runtime, deployed using the CDK.

If you have deployed this stack, do not forget to delete it via yarn cdk destroy.

Repo: ryands17 / rust-lambda

Here's the repo again in case you haven't cloned it yet, and let me know your thoughts in the comments. Thanks for reading!

Top comments (2)

mosen

Thanks Ryan this was pretty great, lambda/rust articles are pretty thin and I'm just weighing up SAM v serverless v CDK deployments as a beginner, so the timing was excellent.

Ryan Dsouza

Thanks a lot! Will be trying to add a couple more posts on Rust with the CDK :)