This article is part of a series on sls-mentor, an open-source, highly configurable, automated best-practice audit tool for AWS serverless architectures.
⬇️ I post serverless content regularly; follow along if you want more ⬇️
Lambda versioning is great and can come in handy, but do you really need to keep dozens of outdated copies of your functions' code? Here are two reasons to get rid of them.
It certainly helps to have a backup version of your code to roll back to in an emergency, but when that day comes, you will feel a lot more comfortable choosing between two or three documented, aliased versions than being overwhelmed by dozens of anonymous, forgotten ones.
AWS Lambda enforces a 75 GB limit per region on all your uploaded packages. This threshold might seem far away and harmless, but it counts every version of every Lambda you have deployed: dozens of versions per function, multiplied by a large bundle size (see our previous article), will take you over this threshold in no time.
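A quick back-of-the-envelope calculation makes the point. The numbers below (30 functions, 50 versions each, 50 MB bundles) are illustrative assumptions, not values reported by AWS, but they show how a modest stack can creep up on the limit:

```python
# Back-of-the-envelope estimate of Lambda code storage consumption.
# All inputs are illustrative assumptions, not AWS-reported values.

REGIONAL_LIMIT_GB = 75  # default soft limit on Lambda code storage per region


def storage_used_gb(functions: int, versions_per_function: int, bundle_mb: float) -> float:
    """Total storage consumed if every version keeps its own package."""
    return functions * versions_per_function * bundle_mb / 1024


# 30 functions, 50 retained versions each, 50 MB bundles:
used = storage_used_gb(functions=30, versions_per_function=50, bundle_mb=50)
print(f"{used:.1f} GB used out of {REGIONAL_LIMIT_GB} GB")  # → 73.2 GB used out of 75 GB
```

Keeping only the three most recent versions in the same scenario would use about 4.4 GB instead.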
It is possible to raise this soft limit by contacting AWS Support, but that's not a sustainable workaround, and you don't want to be stuck unable to upload a new version when there's an important fix to deploy.
sls-mentor now offers a new rule that flags functions whose version count is getting out of hand.
sls-mentor also comes with many other rules to help you make the best decisions for your Serverless project. It will help you identify where your deployed resources can be optimized to achieve better performance at a lower cost.
npx sls-mentor -p <your_aws_profile> -c <your_stack_name>
sls-mentor is available on NPM, and its documentation includes instructions to run it in your CI.
There are also many other tools that help you manage Lambda versioning. One of them is the Serverless Prune Plugin, which integrates with the Serverless Framework and lets you keep only a set number of recent versions for each of your functions.
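As a sketch of what that looks like in practice, a typical serverless-prune-plugin setup might resemble the following (the retention count of 3 is an arbitrary example, not a recommendation from the plugin):

```yaml
# serverless.yml
plugins:
  - serverless-prune-plugin

custom:
  prune:
    automatic: true  # prune old versions after every deployment
    number: 3        # keep the 3 most recent versions of each function
```

Check the plugin's README for the full list of options before relying on it in production.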
To go deeper into AWS Lambda deployment quotas and how to deal with them, check out this article from Yan Cui.