Serverless Computing is one of the newest cloud hosting and execution models offered by cloud vendors. It is widely adopted by cloud architects to build highly scalable, new-generation cloud solutions. It is inexpensive and comes with useful features that classic cloud hosting models may not offer. Cloud giants like Microsoft and Amazon use the Serverless Computing model to build their own in-house products.
Serverless Computing does not mean that no servers host the application code. Application code cannot be hosted or executed without a server, and in Serverless Computing the code still always runs on one. However, the server infrastructure, operating system, hosting, and execution environment are abstracted away from you. You build the application code and deploy it to the Serverless environment; you need not worry about the underlying hosting and execution mechanism, which the cloud vendor handles.
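To make this concrete, here is a minimal sketch of what "just the application code" can look like, written as an AWS Lambda-style Node.js/TypeScript handler. The event shape and response format are illustrative assumptions, not a prescription for any particular platform.

```typescript
// A minimal sketch of the code you deploy to a Serverless environment.
// Servers, OS patching, and the runtime are provisioned by the cloud vendor;
// only the business logic lives here.
export const handler = async (event: { name?: string }) => {
  return {
    statusCode: 200,
    body: JSON.stringify({ message: `Hello, ${event.name ?? "world"}!` }),
  };
};
```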
It is essential to understand the characteristics of Serverless Computing before adopting it. Serverless Computing exhibits the following characteristics:
- On-Demand execution
- Elastic
- No host
- Distributed
- Shorter execution time
Each of these characteristics has an impact on the Serverless solution you are developing, so you should take them into account as you design and build it.
On-Demand execution
Serverless solutions are event-driven. They remain idle until an event invokes them, execute on demand, and then carry out their task. A wide range of events can trigger a Serverless solution; a few of the popular ones are listed below (a sketch of a queue-triggered function follows the list):
- HTTP Triggers
- Cloud storage and queue-based events, like adding an item to a queue or deleting an item from a queue.
- Database operations, like adding or deleting a record. Operations in modern databases like AWS DynamoDB or Azure Cosmos DB can invoke Serverless solutions.
- External social media triggers, like an incoming message to a Twitter handle.
- Another Serverless solution.
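As a rough illustration of the queue-based trigger mentioned above, the sketch below shows a function that runs only when messages arrive on a queue. It loosely follows the AWS SQS event shape (`Records` entries with a `body` field), but the exact event structure depends on the vendor and the trigger you choose.

```typescript
// Hedged sketch of a queue-triggered Serverless function.
// The event shape is modelled loosely on AWS SQS; treat it as illustrative.
interface QueueEvent {
  Records: Array<{ body: string }>;
}

export const onQueueMessage = async (event: QueueEvent): Promise<void> => {
  for (const record of event.Records) {
    const item = JSON.parse(record.body);
    // The function only runs when messages arrive; it stays idle otherwise.
    console.log("Processing queued item:", item);
  }
};
```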
Elastic
Serverless solutions scale on demand. You need not configure either manual or automatic scaling while building them; the underlying cloud platform, managed by the cloud vendor, does the scaling. The cloud vendor makes sure that all the necessary infrastructure and servers are in place, so the solution can scale out automatically during peak hours and scale back in when there is less load. However, dynamic scaling is not as simple as it appears. Every cloud vendor limits the number of instances a solution can scale out to. While designing a Serverless solution, you should take that maximum into account; otherwise, the limit can become a bottleneck and degrade the performance of the solution. Usually, cloud vendors provide multiple pricing tiers for Serverless services based on the number of instances they can scale to. In a nutshell, you cannot control scaling yourself; you depend on the underlying platform to scale the cloud solution. A client-side sketch of coping with that limit follows.
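Because the caller cannot control how far the platform scales, a common defensive pattern is to retry throttled requests with backoff on the client side. The sketch below assumes a hypothetical HTTPS endpoint in front of a Serverless function and assumes the platform signals throttling with HTTP 429; the exact status codes and limits vary by vendor.

```typescript
// Minimal client-side sketch: retry with exponential backoff when the
// Serverless backend is throttled because its instance limit was reached.
async function callWithBackoff(url: string, payload: unknown, maxAttempts = 5): Promise<Response> {
  for (let attempt = 1; attempt <= maxAttempts; attempt++) {
    const response = await fetch(url, {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify(payload),
    });
    // 429 (Too Many Requests) is a common throttling signal; vendors differ.
    if (response.status !== 429) return response;
    await new Promise((resolve) => setTimeout(resolve, 200 * 2 ** (attempt - 1)));
  }
  throw new Error(`Request to ${url} was throttled after ${maxAttempts} attempts`);
}
```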
No host
No host does not mean that Serverless solutions are not deployed to a hosting environment on a server. It means that the cloud vendor abstracts the underlying server and hosting environment from you, and you have no control over that infrastructure. You deploy your code to the Serverless environment, and the cloud vendor takes care of the underlying hosting infrastructure.
You should design Serverless applications very carefully because you cannot tune the hosting environment to match the application's requirements. You have to be sure that the application runs as-is on the underlying infrastructure, without needing to modify the hosting environment. Otherwise, your application may not be the right candidate for Serverless hosting.
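One practical consequence is that any tuning has to live in the application itself, typically through settings such as environment variables, rather than in the host. The variable names below (BATCH_SIZE, TARGET_TABLE) are purely illustrative assumptions.

```typescript
// Sketch: configure behaviour through application-level settings instead of
// host tuning, since the underlying machine is not yours to modify.
const config = {
  batchSize: Number(process.env.BATCH_SIZE ?? "25"),
  targetTable: process.env.TARGET_TABLE ?? "orders",
};

export const handler = async (): Promise<void> => {
  console.log(`Processing up to ${config.batchSize} items into ${config.targetTable}`);
};
```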
Distributed
Serverless solutions are distributed by design and follow the Single Responsibility principle: each component of a Serverless solution performs one well-defined task. For example, one component reads the data from the database, another massages the data and pushes it to a queue, and a third processes the data in the queue (see the sketch below). This keeps Serverless solutions clean and makes them a good fit for architectural styles like Microservices.
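The sketch below illustrates that split into single-responsibility components. The in-memory "database" and "queue" simply stand in for vendor-specific services, and all names are hypothetical; in a real deployment each function would be deployed separately and connected through database and queue triggers rather than direct calls.

```typescript
// Illustrative single-responsibility components of a Serverless pipeline.
type Order = { id: string; total: number };

const database: Order[] = [{ id: "o-1", total: 100 }];
const queue: Array<{ id: string; totalWithTax: number }> = [];

// Component 1: only reads records from the data store.
async function extractOrders(): Promise<Order[]> {
  return database;
}

// Component 2: only massages the data and pushes it onto a queue.
async function transformAndEnqueue(orders: Order[]): Promise<void> {
  for (const order of orders) {
    queue.push({ id: order.id, totalWithTax: order.total * 1.1 });
  }
}

// Component 3: only processes items from the queue.
async function processQueue(): Promise<void> {
  while (queue.length > 0) {
    const item = queue.shift()!;
    console.log(`Processed order ${item.id}: ${item.totalWithTax}`);
  }
}

extractOrders().then(transformAndEnqueue).then(processQueue);
```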
Data consistency can be a concern for a Serverless application, just as in any other distributed system. When one Serverless component is writing to the database, a component performing a read should wait for the write operation to complete; otherwise it may read stale or inconsistent data.
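One common way to handle this is optimistic concurrency: the writer bumps a version number, and the reader retries until it sees the version it needs. The sketch below uses an in-memory map as a stand-in for a real data store, so the names and the simple retry policy are assumptions for illustration only.

```typescript
// Sketch of a version-based read-after-write check across two components.
interface OrderRecord {
  version: number;
  status: string;
}

// Hypothetical in-memory stand-in for a shared data store.
const store = new Map<string, OrderRecord>();

async function writeOrder(id: string, status: string): Promise<void> {
  const current = store.get(id);
  store.set(id, { version: (current?.version ?? 0) + 1, status });
}

async function readOrderAtLeast(id: string, minVersion: number, attempts = 5): Promise<OrderRecord> {
  for (let i = 0; i < attempts; i++) {
    const record = store.get(id);
    if (record && record.version >= minVersion) return record;
    // Back off briefly before re-reading; in a real system this might be a
    // queue retry or an eventual-consistency-aware read instead.
    await new Promise((resolve) => setTimeout(resolve, 100 * (i + 1)));
  }
  throw new Error(`Record ${id} not yet at version ${minVersion}`);
}
```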
Shorter execution time
Application code hosted on Serverless services must complete execution within a short span of time; it should not keep running for long. The code executes whenever an event invokes the service, and each execution must finish quickly.
Cloud vendors impose this execution restriction at the platform level: they set a time limit beyond which the application code execution times out, so the code must complete before that limit is reached.
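A typical way to stay within the limit is to check how much time is left and stop early, handing unfinished work to a later invocation. The sketch below models this on the AWS Lambda context's getRemainingTimeInMillis(); the event shape and the 5-second safety margin are illustrative assumptions.

```typescript
// Sketch: process items until the platform's time budget is nearly exhausted,
// then stop and leave the remainder for a later invocation.
interface LambdaLikeContext {
  getRemainingTimeInMillis(): number;
}

export const handler = async (event: { items: string[] }, context: LambdaLikeContext) => {
  const pending = [...event.items];
  while (pending.length > 0) {
    if (context.getRemainingTimeInMillis() < 5_000) {
      console.log(`Time budget nearly exhausted; ${pending.length} items left for a later run`);
      break;
    }
    const item = pending.shift()!;
    console.log(`Processed ${item}`);
  }
};
```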
Cloud vendors offer a range of pricing tiers based on the maximum allowed code execution time. You should choose the pricing tier that is best suited for your application.
Hope this was helpful.