Steve Bjorg for LambdaSharp

Baseline Performance for AWS Lambda .NET using Top-Level Statements

.NET 6 introduced top-level statements, which simplify the entry point of the application code. Unlike the previous style of Lambda definitions, this project produces an executable instead of a class library. That means we have to provide our own Lambda host implementation. Fortunately, AWS already provides one in the Amazon.Lambda.RuntimeSupport package.

Minimal Top-Level Lambda Function

The MinimalTopLevel project defines a Lambda function that takes a stream and returns an empty response. It has no business logic, includes only the required libraries, and performs no payload deserialization. This is the top-level-statements Lambda function with the least amount of overhead.

```csharp
using Amazon.Lambda.Core;
using Amazon.Lambda.RuntimeSupport;
using Amazon.Lambda.Serialization.SystemTextJson;

await LambdaBootstrapBuilder.Create(Handler, new DefaultLambdaJsonSerializer())
    .Build()
    .RunAsync();

Task Handler(Stream request, ILambdaContext context)
    => Task.CompletedTask;
```

Benchmark Data for .NET 6 on x86-64

Again, we see that the duration of the INIT phase is not impacted until we exceed the 3,008 MB threshold, which also drives up cost.

However, compared to the Minimal baseline project, this Lambda function has 20% to 100% longer cold start durations.

| Memory Size | Init (ms) | Cold Used (ms) | Total Cold Start (ms) | Total Warm Used (100) (ms) | Cost (µ$) |
|------------:|----------:|----------:|------------------:|----------------------:|----------:|
| 128 MB  | 235.420 | 1,484.898 | 1,720.318 | 367.174 | 24.05849 |
| 256 MB  | 236.617 |   744.375 |   980.992 | 151.328 | 23.93210 |
| 512 MB  | 236.002 |   353.587 |   589.589 | 120.821 | 24.15341 |
| 1024 MB | 238.403 |   163.425 |   401.828 | 115.392 | 24.84696 |
| 1769 MB | 234.304 |    96.894 |   331.198 | 115.843 | 26.32520 |
| 5120 MB | 216.870 |    92.632 |   309.502 | 117.367 | 37.69996 |
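As a side note, the Cost column is consistent with billing only the Used durations (one cold invocation plus 100 warm ones) at roughly $0.0000166667 per GB-second plus $0.20 per million requests; this reconstruction is my own assumption, and AWS pricing varies by region and changes over time. A sketch of the calculation:

```python
# Assumed x86-64 on-demand pricing; verify against current AWS Lambda pricing.
GB_SECOND_PRICE = 16.6667  # micro-dollars per GB-second
REQUEST_PRICE = 0.2        # micro-dollars per request ($0.20 per 1M requests)

def invocation_cost(memory_mb, cold_used_ms, warm_total_ms, requests=101):
    """Estimate the cost in micro-dollars of 1 cold + 100 warm invocations."""
    gb_seconds = (cold_used_ms + warm_total_ms) / 1000 * (memory_mb / 1024)
    return gb_seconds * GB_SECOND_PRICE + requests * REQUEST_PRICE

# Reproduces the 128 MB row of the table above.
print(round(invocation_cost(128, 1484.898, 367.174), 5))  # 24.05849
```

This also explains why cost rises with memory size even though the duration barely improves: past the sweet spot, the extra GB-seconds dominate.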


Minimum Cold Start Duration

Fortunately, the situation improves considerably once we look at the optimal configuration for minimum cold start duration. Using top-level statements shifts some of the overhead from the INIT phase to the first INVOKE phase, but otherwise the total duration is very close to what was measured for the Minimal baseline project.

| Architecture | Memory Size | Tiered | Ready2Run | PreJIT | Init (ms) | Cold Used (ms) | Total Cold Start (ms) |
|--------------|------------:|--------|-----------|--------|----------:|---------------:|----------------------:|
| arm64  | 5120 MB | yes | yes | no | 178.743 | 70.947 | 249.690 |
| x86_64 | 5120 MB | yes | no  | no | 190.382 | 67.500 | 257.882 |
| x86_64 | 1769 MB | yes | yes | no | 189.152 | 58.625 | 247.777 |
| x86_64 | 5120 MB | yes | yes | no | 176.368 | 57.039 | 233.407 |
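For context, the Tiered and Ready2Run columns correspond to standard .NET publish properties, while PreJIT is controlled at runtime through the AWS_LAMBDA_DOTNET_PREJIT environment variable; whether the benchmark toggled exactly these settings is my assumption. A sketch of the project-file portion:

```xml
<!-- Sketch: publish properties that plausibly map to the Tiered and
     Ready2Run columns above (standard .NET property names). -->
<PropertyGroup>
  <TieredCompilation>true</TieredCompilation>
  <PublishReadyToRun>true</PublishReadyToRun>
</PropertyGroup>
```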


Minimum Execution Cost

Again, the ARM64 architecture is the most cost-effective option. ReadyToRun, Tiered Compilation, and the PreJIT setting all help reduce cost a bit further for the minimal top-level project. That said, the minimum execution cost is ~8.5% higher when using top-level statements, most likely because of the higher memory configuration required to compensate for the increased overhead of the INIT and first INVOKE phases.

| Architecture | Memory Size | Tiered | Ready2Run | PreJIT | Init (ms) | Cold Used (ms) | Total Warm Used (100) (ms) | Cost (µ$) |
|--------------|------------:|--------|-----------|--------|----------:|---------------:|---------------------------:|----------:|
| arm64 | 1024 MB | no  | yes | no  | 234.381 | 168.534 | 122.581 | 24.08155 |
| arm64 | 1024 MB | no  | yes | yes | 252.593 | 167.558 | 124.272 | 24.09109 |
| arm64 | 1024 MB | yes | yes | no  | 202.952 | 124.406 | 152.314 | 23.88962 |
| arm64 | 1024 MB | yes | yes | yes | 218.905 | 118.831 | 152.727 | 23.82079 |


What's Next

It's been fun diving into the fundamentals of AWS Lambda for .NET functions. This post summarizes my findings, along with thoughts on future projects that might be interesting to explore.
