Ben Watts
AWS Serverless Menu - Small to Big

Whenever I package and deploy a new microservice to AWS, I classify it as one of the following:

AWS Lambda - Zip

This is my go-to for the simplest workloads, and the service I rely on 80% of the time.

If I can, I'll use TypeScript, and then bundle with webpack, which deals neatly with external dependencies and minifies the function so the package is really small. For dev builds, I'll skip minification to simplify debugging in the AWS console.

If I want to do something more data-heavy or complex, I'll sometimes go for Python instead. Having a mixture of languages is one of the big advantages of microservices.

Packaging dependencies is a bit trickier for Python: for simple packages I'll usually include a step in my build script that copies the relevant directories from my venv's site-packages. One neat alternative is to use a publicly available precompiled Lambda layer from Klayers. These slot into my Terraform config in one line and let me use popular Python packages like numpy and pandas without worrying about C extensions, build environments, or having to mess with Amazon Linux.
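That copy-from-site-packages build step can be sketched roughly as below. Everything here is illustrative rather than from the post: the venv path, the dependency list, `handler.py`, and the `package` helper are placeholders to adapt to your own layout.

```python
"""Rough sketch of a zip-packaging build step for a Python Lambda."""
import shutil
from pathlib import Path

def package(site_packages: Path, deps: list, handler: Path,
            build_dir: Path) -> Path:
    """Copy the handler plus selected site-packages dirs, then zip them."""
    if build_dir.exists():
        shutil.rmtree(build_dir)
    build_dir.mkdir(parents=True)
    shutil.copy(handler, build_dir / handler.name)
    for dep in deps:
        src = site_packages / dep
        if src.is_dir():
            shutil.copytree(src, build_dir / dep)
    # Produces build_dir.zip, ready to point a Lambda deployment at.
    return Path(shutil.make_archive(str(build_dir), "zip", build_dir))
```

The resulting zip keeps dependencies at the archive root, which is where the Lambda Python runtime expects them alongside the handler.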

AWS Lambda - Docker

This is my next step up. I rarely need it for TypeScript, but for Python it often lets me ship services that exceed the 50MB zip limit by packaging them instead as the black box of a Docker container. It also allows me to base the image on plain Ubuntu, rather than Amazon Linux.

Spin-up time (often 5-10 seconds) means I don't generally recommend this for public-facing APIs; instead I'll feed it from an SQS queue.
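A queue-fed Lambda like this just receives the standard SQS event shape, one batch of records per invocation. A minimal handler sketch (the body of `process` is a placeholder for whatever the container actually does):

```python
import json

def process(message: dict) -> dict:
    # Placeholder for the real work the service performs.
    return {"status": "done", "input": message}

def handler(event, context=None):
    """Entry point for an SQS-triggered Lambda: one event, many records."""
    results = []
    for record in event["Records"]:
        # SQS delivers each message body as a string; here it's JSON.
        body = json.loads(record["body"])
        results.append(process(body))
    return results
```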

When deploying with Terraform, you have to create a separate ECR repository for your image, and then update the Lambda function's code with an API call to point it at the new image.
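That API call is a one-liner with boto3. A hedged sketch, where the function name, account ID, region, and repository are all placeholder values:

```python
def image_uri(account_id: str, region: str, repo: str, tag: str) -> str:
    """Build the ECR image URI in the form Lambda's ImageUri expects."""
    return f"{account_id}.dkr.ecr.{region}.amazonaws.com/{repo}:{tag}"

def deploy(function_name: str, uri: str) -> None:
    # boto3 imported lazily so the pure helper above stays dependency-free.
    import boto3
    client = boto3.client("lambda")
    client.update_function_code(FunctionName=function_name,
                                ImageUri=uri, Publish=True)

# e.g. deploy("my-service",
#             image_uri("123456789012", "eu-west-1", "my-service", "v1.2.3"))
```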

Like standard zip-based Lambda, this option is subject to the 15-minute runtime limit.

Fargate

When all else fails, I'll opt for a Fargate task. This is my workhorse for standard machine learning (if something is really big or needs a GPU, I'll go for SageMaker, but be warned: it's much less flexible!).

As above, this gives the blank canvas of a Docker container, making local development and testing that bit easier. Crucially, there's no 15-minute limit, and the configurable CPU/memory limits can handle all but the biggest ML-type workloads.

There are some decent Terraform modules out there that package your Fargate workload as either a task or a service.

If relevant, I can then launch such a task on demand from within one of my simpler Lambda functions using boto3 (Python) or the AWS SDK (TypeScript), passing a task-number environment variable (which the Fargate container can then use to fetch any state it needs from S3/Dynamo or wherever).
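In boto3 that on-demand launch is `ecs.run_task` with an environment override. A sketch under assumptions: the cluster, task definition, subnet, and the container name `worker` are all hypothetical stand-ins for your own resources, and `TASK_NUMBER` is the environment variable the container reads.

```python
def run_task_kwargs(cluster: str, task_definition: str, subnets: list,
                    task_number: int, container: str = "worker") -> dict:
    """Arguments for ECS run_task, injecting TASK_NUMBER as an env override."""
    return {
        "cluster": cluster,
        "taskDefinition": task_definition,
        "launchType": "FARGATE",
        "networkConfiguration": {
            "awsvpcConfiguration": {"subnets": subnets,
                                    "assignPublicIp": "DISABLED"},
        },
        "overrides": {
            "containerOverrides": [{
                "name": container,
                "environment": [{"name": "TASK_NUMBER",
                                 "value": str(task_number)}],
            }],
        },
    }

def launch(task_number: int):
    import boto3
    ecs = boto3.client("ecs")
    # Placeholder cluster/task/subnet identifiers.
    return ecs.run_task(**run_task_kwargs(
        "my-cluster", "my-ml-task", ["subnet-0abc"], task_number))
```

Keeping the keyword-argument construction separate from the API call makes the launch logic easy to unit-test without AWS credentials.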

Conclusion

Overall, I'll try to fit services into the smaller categories as much as possible, and refactor to accommodate this wherever I can. So if I have two large Python packages which together exceed 50MB, but not individually, I might consider deploying two Lambda functions and handing state from one to the other.
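That handoff can be as simple as an async Lambda-to-Lambda invoke. A minimal sketch, assuming a hypothetical downstream function name and payload shape (neither is from the post):

```python
import json

def handoff_payload(state: dict, step: str) -> bytes:
    """Serialize the state the second function will pick up."""
    return json.dumps({"step": step, "state": state}).encode()

def hand_off(state: dict, next_function: str = "second-stage") -> None:
    import boto3
    # InvocationType="Event" = async fire-and-forget invoke.
    boto3.client("lambda").invoke(
        FunctionName=next_function,
        InvocationType="Event",
        Payload=handoff_payload(state, "stage-two"),
    )
```

For anything larger than the invoke payload limit, the first function would instead write the state to S3 and pass only a key.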
