Dixit R Jain for AWS Community Builders

Unlocking Next-Gen Serverless Performance: A Deep Dive into AWS LLRT

Introduction

The transition to serverless computing signifies a pivotal shift in application deployment, focusing more on code development than on managing infrastructure. This shift underscores the need for innovations that not only reduce latency but also enhance performance. AWS LLRT emerges as a significant advancement in this domain, offering a lightweight JavaScript runtime that is more efficient and uses fewer resources than traditional runtimes like Node.js. Designed specifically to address the 'cold start' problem in serverless computing, LLRT ensures rapid startup times, enhancing user experience and aligning cost-effectiveness with strong performance. This marks a shift in how applications respond to operational demands, prioritizing speed and efficiency in serverless architectures.

Key Features

Swift Cold-Starts: LLRT provides a significant boost in performance, offering over 10x faster startup times, which is crucial for serverless applications where cold starts can impact the user experience.

Efficiency: By optimizing for low latency and efficient memory usage, LLRT ensures that applications are not only fast but also cost-effective, with up to 2x lower overall costs.

Specialized for Serverless Workloads: LLRT is designed specifically for serverless computing, in contrast to general-purpose runtimes such as Node.js, Bun, or Deno.

LLRT vs. Node.js

The primary distinction between LLRT and Node.js lies in their architecture and design principles. Built on QuickJS, LLRT is significantly lighter than Node.js, contributing to its quicker cold start times and reduced initialization periods. This optimization benefits serverless applications in terms of both performance and cost.

LLRT's design focuses on supporting key Node APIs without aiming for full Node.js API parity. It prioritizes APIs crucial for serverless functions while offering full or partial support for built-in Node libraries, such as buffer, stream, and fetch.
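For instance, a function that relies only on these supported built-ins needs no extra dependencies. Here is a minimal, illustrative sketch (the URL is a placeholder, not a real endpoint):

// Illustrative LLRT handler using built-ins the runtime supports: fetch and Buffer
export const handler = async (event) => {
  // fetch is available globally, as in modern Node.js
  const response = await fetch("https://api.example.com/status"); // placeholder URL
  const text = await response.text();

  // Buffer works for common encode/decode tasks
  const encoded = Buffer.from(text).toString("base64");

  return { statusCode: response.status, body: encoded };
};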

Technically, LLRT distinguishes itself by employing Rust for its performance and safety features and leveraging Tokio for efficient, event-driven, non-blocking I/O operations. This unique combination allows for concurrent task management without the complexities of multithreading, thereby enhancing system resource efficiency.

At its core, LLRT utilizes QuickJS—a compact and fast JavaScript engine that eschews complex JIT compilation in favor of immediate script execution. This characteristic is especially beneficial for serverless functions, which necessitate fast, sporadic execution times.

Benchmarking

Cold-Start Comparison: LLRT demonstrates significantly faster startup times than Node.js, showing a 12x to 18x improvement. The benchmark below was captured for a Lambda function performing a DynamoDB PUT operation.

LLRT Dynamo PUT Lambda Cold-start

Node.js 20 Dynamo PUT Lambda Cold-start

Warm-Start Comparison: LLRT also shows faster warm-start times than Node.js, with a 2x to 3x improvement. Again, the benchmark below was captured for the same Lambda function performing a DynamoDB PUT operation.

LLRT Dynamo PUT Lambda Warm-start

Node.js 20 Dynamo PUT Lambda Warm-start

These benchmarks suggest that LLRT offers a promising alternative to Node.js for use cases that require fast startup times and efficient handling of HTTP requests, especially in cost-sensitive environments, thanks to its lower resource consumption and potentially lower operational costs on AWS Lambda.

For a detailed exploration of the benchmarks and their implications for AWS Lambda users, you can refer to this article on learnaws.io.

AWS SDK V3 with LLRT

LLRT incorporates many AWS SDK clients and utilities, finely tuned for optimal performance. These bundled SDK clients are engineered for top performance without compromising compatibility, replacing some JavaScript dependencies with native implementations for tasks like hash calculations and XML parsing. SDK packages that are not bundled with LLRT must be shipped as part of your source code, while the bundled packages should be marked as external when bundling. Here is the list of AWS SDK packages bundled within the LLRT runtime.
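For example, assuming the DynamoDB client (@aws-sdk/client-dynamodb) is among the bundled packages, a handler similar to the DynamoDB PUT function used in the benchmarks above might look like this sketch (table name and item shape are placeholders):

import { DynamoDBClient, PutItemCommand } from "@aws-sdk/client-dynamodb";

// Client is created outside the handler so warm invocations can reuse it
const client = new DynamoDBClient({});

export const handler = async (event) => {
  await client.send(
    new PutItemCommand({
      TableName: "my-table", // placeholder table name
      Item: {
        pk: { S: event.id ?? "demo" }, // placeholder key attribute
        createdAt: { S: new Date().toISOString() },
      },
    })
  );

  return { statusCode: 200, body: "ok" };
};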

Use Cases

AWS Low Latency Runtime (LLRT) enhances serverless applications by offering rapid startup times and efficient execution, critical for performance-sensitive environments. It's ideal for:

Microservices Architecture: Facilitates scalable, flexible cloud applications by enabling individual Lambda functions with quick startup times for each microservice.

Real-Time Data Processing: Essential for platforms like financial trading or live gaming, where LLRT minimizes latency in streaming data processing.

IoT and Edge Computing: Supports environments with strict resource and response time requirements, allowing for immediate data processing from sensors or device control.

API Endpoints: Improves responsiveness of client applications by reducing latency in serverless API requests and responses, enhancing user experience.

Batch Processing: Enables timely execution of time-sensitive batch jobs, such as media processing tasks, by ensuring serverless functions start and complete quickly without significant delays.

Configure Lambda functions to use LLRT

There are five ways to configure LLRT as the runtime for a Lambda function:

Custom runtime (recommended)

Choose Custom Runtime on Amazon Linux 2023 and package the LLRT bootstrap binary together with your JS code.

For most use cases, utilizing LLRT as a custom runtime provides the optimal balance between performance and flexibility. Here's how to set it up:

  • Download the Latest LLRT Release: Download the latest LLRT release from here.
  • Package Your Function with LLRT: Create a deployment package by including your JavaScript code and the LLRT binary. Ensure that your project structure adheres to the AWS Lambda requirements for custom runtimes.
  • Deploy Your Lambda Function: Use the AWS Management Console, AWS CLI, or AWS SDKs to create a new Lambda function. Specify the runtime as Custom Runtime on Amazon Linux 2023, and upload your deployment package. Set the handler information according to your function’s entry point.

Using a layer

Choose Custom Runtime on Amazon Linux 2023, upload llrt-lambda-arm64.zip or llrt-lambda-x64.zip as a layer, and add it to your function.

Bundle LLRT in a container image

You can package the LLRT runtime into a container image using the Dockerfile below:

# Minimal arm64 image that ships the LLRT binary alongside the function code
FROM --platform=arm64 busybox
WORKDIR /var/task/
COPY app.mjs ./
# Download the LLRT container binary and make it executable
ADD https://github.com/awslabs/llrt/releases/latest/download/llrt-container-arm64 /usr/bin/llrt
RUN chmod +x /usr/bin/llrt

# Tell LLRT which exported function to invoke (file.export)
ENV LAMBDA_HANDLER "app.handler"

CMD [ "llrt" ]
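The image above expects an app.mjs whose export matches the LAMBDA_HANDLER value; a minimal sketch could look like this:

// app.mjs - handler matching LAMBDA_HANDLER "app.handler"
export const handler = async (event) => {
  return {
    statusCode: 200,
    body: JSON.stringify({ message: "Hello from LLRT" }),
  };
};

Build and push the image to Amazon ECR, then create the Lambda function from that container image as you would for any other image-based function.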

Using AWS SAM

You can refer to the example project to set up a Lambda function instrumented with a layer containing the LLRT runtime using AWS SAM.

Using AWS CDK

You can use the cdk-lambda-llrt construct library to deploy LLRT Lambda functions with AWS CDK:

import { LlrtFunction } from "cdk-lambda-llrt";

const handler = new LlrtFunction(this, "Handler", {
  entry: "lambda/index.ts",
});

See Construct Hub and its examples for more details.

Compatibility

LLRT supports ES2020, making it a modern runtime capable of running contemporary JavaScript code.

Note
Although LLRT supports ES2020, it's NOT a drop-in replacement for Node.js. Consult the compatibility matrix and API documentation for more details.
All dependencies should be bundled for a browser platform, and the included @aws-sdk packages should be marked as external, as in the sketch below.
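As a rough illustration of that bundling guidance, an esbuild script along these lines (entry path, output path, and script name are assumptions) bundles for a browser platform while leaving the bundled SDK packages external:

// build.mjs - illustrative esbuild configuration for an LLRT function
import { build } from "esbuild";

await build({
  entryPoints: ["src/index.js"],          // assumed entry point
  bundle: true,
  platform: "browser",                    // bundle for a browser platform
  format: "esm",
  target: "es2020",                       // matches the ES2020 support noted above
  outfile: "dist/index.mjs",
  minify: true,
  external: ["@aws-sdk/*", "@smithy/*"],  // keep the included SDK packages external
});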

Limitations and Considerations

While LLRT offers numerous advantages, it's important to recognize its limitations. LLRT is most effective for smaller serverless functions focused on tasks like data transformation, real-time processing, and AWS service integrations. It's designed to complement existing components rather than serve as a comprehensive replacement for all use cases. Additionally, LLRT's experimental status means it is subject to change and is intended for evaluation purposes.

Conclusion

AWS LLRT represents a significant step forward in serverless computing, offering developers a powerful tool for building high-performance, efficient serverless applications. By addressing the specific needs of serverless environments, LLRT enables faster application startups and lower operational costs, making it an attractive option for developers looking to optimize their serverless applications.

If you liked this blog and found it helpful, do check out my other blogs on AWS.

For those interested in exploring LLRT further, the AWS Labs GitHub repository provides detailed documentation, examples, and resources to get started.

See you next time. Happy coding!

References

LLRT GitHub Repository
LLRT Lambda Tutorial
LLRT Benchmarking
