There are two primary cloud-native environments that you see in today’s world:
- Kubernetes/containers
- Serverless
End users want to focus more on the application that is deployed rather than the infrastructure powering it.
With Argonaut, you’ll find that both of these cloud-native environments are easy to set up, configure, and deploy to.
In this blog post, you’ll learn about:
- What serverless is
- The stack set up by Argonaut’s AWS Lambda-based app deployments
Refer to our other blog post for setting up Kubernetes on AWS.
What is Serverless?
Before starting with Argonaut, you may be wondering, “What is serverless?”
Since computers became mainstream (and even before that), there has been a standard workflow:
Create the infrastructure → build the app (before or alongside the infrastructure work) → deploy the app to that infrastructure.
A critical side effect of this workflow is the infrastructure management overhead.
According to Cloudflare, “Serverless computing is a method of providing backend services on an as-used basis. Servers are still used, but a company that gets backend services from a serverless vendor is charged based on usage, not a fixed amount of bandwidth or number of servers.”
Of course, there is a server behind the scenes managed by the serverless vendor. You just don’t see it or interact with it.
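To make that concrete, here is a minimal, hypothetical Node.js Lambda handler - the entire “application” a developer writes in this model. The file name, function name, and greeting are placeholders for illustration only.

```javascript
// index.js - a complete "application" in the serverless model.
// There is no server, port, or process to manage; the cloud vendor
// invokes this function on demand and bills per invocation.
exports.handler = async (event) => {
  const name = (event && event.name) || "world";
  return { message: `Hello, ${name}!` };
};
```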
Popular Serverless Services
Although this article focuses on AWS Lambda, there are a few other serverless services that you should think about depending on the type of environment you’re in:
- Application runtimes: Azure Functions, Azure Web Apps, AWS Elastic Beanstalk, Google Cloud Functions, Cloudflare Workers, etc.
- Managed services: AWS Lightsail (databases), MongoDB Atlas Serverless, etc.
All of these services, in one way or another, do the same thing - they abstract away the infrastructure so engineers can focus on functionality.
Argonaut Implementation
Argonaut’s integration with AWS Lambda has three primary pieces, each customized to ensure a fully featured developer experience.
The first is the SAM framework.
SAM Framework
The AWS Serverless Application Model (SAM) is an open-source framework to build serverless apps.
The great thing about SAM is that it makes getting a serverless application up and running much easier.
If you’re wondering how Argonaut builds the Lambda function itself once you pass in your code via the browser, the answer is the SAM framework.
The example code below shows what a SAM template looks like. It starts by specifying the template information, followed by the resources that make up the application. The important pieces are the handler and the runtime: the handler tells Lambda which function in your code to invoke (here, the exported handler function in index.js), and the runtime tells Lambda which language runtime to execute it on.
AWSTemplateFormatVersion: '2010-09-09'
Transform: AWS::Serverless-2016-10-31
Resources:
  HelloWorldFunction:
    Type: AWS::Serverless::Function
    Properties:
      Handler: index.handler
      Runtime: nodejs14.x
Argonaut uses SAM not only to deploy the Lambda function itself, but also to deploy the API Gateway in front of it.
API Gateway
An API gateway is a management layer that handles the connection between clients and backend services.
Argonaut uses an API gateway to give the Lambda function an HTTPS endpoint; incoming HTTPS requests hit that endpoint and are routed to the Lambda function. The gateway also enables secure access to the API through authentication and authorization.
TLS (HTTPS) certificate management is automatically done by Argonaut when custom domains are specified.
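As a rough illustration of that routing (a sketch, not code that Argonaut generates), the handler below shows the common REST-style proxy contract: API Gateway wraps the incoming HTTPS request into an event object with fields like httpMethod and path, and expects a statusCode/headers/body object back that it turns into the HTTPS response. The /health route is made up.

```javascript
// Sketch of a handler sitting behind an API Gateway HTTPS endpoint
// (REST-style Lambda proxy integration, shown for illustration).
exports.handler = async (event) => {
  const { httpMethod, path } = event;

  // Illustrative route: answer GET /health directly.
  if (httpMethod === "GET" && path === "/health") {
    return { statusCode: 200, body: "ok" };
  }

  // Echo the request details back as JSON for any other route.
  return {
    statusCode: 200,
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ method: httpMethod, path }),
  };
};
```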
Internal Proxy
The internal proxy is custom code that lets applications written with frameworks like Next.js, Django, and others run directly on AWS Lambda. It’s a super lightweight layer, and it requires no modifications to your code.
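To give a feel for the idea, here is a heavily simplified, hypothetical sketch - it is not Argonaut’s actual internal proxy. It assumes the framework’s own HTTP server is already running inside the Lambda container on a made-up port, rebuilds the original request from the API Gateway event, forwards it to the framework, and maps the framework’s HTTP response back into the shape Lambda expects.

```javascript
// Illustrative sketch only - not Argonaut's internal proxy code.
// Assumes the web framework's HTTP server (Next.js, Django, etc.) is
// already listening on localhost inside the Lambda container.
const http = require("http");

const FRAMEWORK_PORT = 3000; // hypothetical port the framework listens on

exports.handler = (event) =>
  new Promise((resolve, reject) => {
    // Rebuild the original HTTP request from the API Gateway proxy event.
    const query = event.queryStringParameters
      ? "?" + new URLSearchParams(event.queryStringParameters).toString()
      : "";

    const req = http.request(
      {
        host: "127.0.0.1",
        port: FRAMEWORK_PORT,
        path: event.path + query,
        method: event.httpMethod,
        headers: event.headers,
      },
      (res) => {
        // Map the framework's HTTP response back into the
        // statusCode/headers/body shape Lambda and API Gateway expect.
        let body = "";
        res.on("data", (chunk) => (body += chunk));
        res.on("end", () =>
          resolve({ statusCode: res.statusCode, headers: res.headers, body })
        );
      }
    );

    req.on("error", reject);
    if (event.body) req.write(event.body);
    req.end();
  });
```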
When to use AWS Lambda
Lambdas are great: they provide an extremely quick way to get started in the cloud while scaling almost indefinitely. The pricing is also very friendly for early-stage startups, since it is usage-based with zero fixed costs - serving just a few requests per hour is free.
However, there are disadvantages. If multiple services need to communicate with each other, the Lambda paradigm becomes inefficient, and addressing other services is cumbersome; beyond two or three services, management becomes practically intractable.
In such cases, a Kubernetes-based deployment would be preferable. Argonaut makes Kubernetes as easy to deal with as Lambdas.
Wrapping Up
This post is part of a series of explainers on how Argonaut works under the hood, written to demystify the platform and provide transparency. We also lay out the pros and cons so that you can make the right decision for your use case. Check out how you can deploy your first application to AWS Lambda in less than 5 minutes at https://ship.argonaut.dev
Thanks to Michael Levan for putting this article together.