Gursimar Singh

Serverless Kubernetes on Google Cloud Platform

Kubernetes is enormously popular and is being adopted by organizations of all sizes. Deploying and operating Kubernetes, however, requires hiring skilled engineers, and organizations, particularly bootstrapped start-ups and small-scale ones, may view this as an unnecessary overhead expense. Let’s look at the ways this problem can be addressed.

Serverless computing: why bother?

Serverless computing has been a topic of considerable interest for some time, and it is not likely to go out of favour any time soon. When developers are relieved of managing infrastructure, they can focus on building and refining the product, which shortens the time to market.

Cloud Run is a serverless platform developed by Google. It is powered by Knative, a runtime environment that extends Kubernetes for serverless applications, and the Functions Framework. Unlike other serverless services, which require code written specifically to run as a function and be triggered by events, Cloud Run lets us package our existing code in a Docker container.

This container runs in the fully managed serverless environment that Cloud Run provides, but because Cloud Run is built on Knative, the same container can also run on Google Kubernetes Engine. That gives us the flexibility to add pay-per-use, on-demand workloads to our existing Kubernetes clusters. Even though it is not always a fully fledged solution on its own, Cloud Run makes it possible to run serverless apps on Kubernetes.

In addition to a runtime environment, Knative offers an open API. This lets us run serverless applications wherever we see fit: fully managed on Google Cloud, on Anthos on Google Kubernetes Engine (GKE), or on our own Kubernetes cluster. Knative makes it easy to start with Cloud Run and later switch to Cloud Run for Anthos, or to start on our own Kubernetes cluster and later move to Cloud Run. Because everything is built on Knative, we can migrate workloads between platforms without paying large switching costs.

Some of the major benefits of Cloud Run:

  • Rapid autoscaling

  • Split traffic

  • Automatic redundancy

  • No vendor lock-in

Cloud Run requires the code to be a stateless HTTP container that is compiled for 64-bit Linux, starts an HTTP server within four minutes (240 seconds), and replies to each request within the request timeout. The container should listen on the port defined by the PORT environment variable, which defaults to 8080; the port number itself should not be hardcoded.
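
As a minimal sketch of that contract, using only the Python standard library (the handler name and response text are illustrative, not part of the original post), the server below listens on whatever port Cloud Run passes in via the PORT environment variable:

```python
import os
from http.server import BaseHTTPRequestHandler, HTTPServer


class HelloHandler(BaseHTTPRequestHandler):
    """Illustrative handler: replies to every GET with a short plain-text body."""

    def do_GET(self):
        body = b"Hello from Cloud Run!\n"
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)


if __name__ == "__main__":
    # Cloud Run injects the listening port via the PORT environment variable
    # (8080 by default); read it instead of hardcoding the port number.
    port = int(os.environ.get("PORT", "8080"))
    HTTPServer(("0.0.0.0", port), HelloHandler).serve_forever()
```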

Note that, by default, every Cloud Run service gets a stable HTTPS endpoint, with TLS termination taken care of for us.

Some of the use cases include:

  • Websites

  • REST API backend

  • Lightweight data transformation

  • Scheduled document generation, such as PDF generation

  • Workflow with webhooks

Now that we have covered the ground needed to get started with Cloud Run, let’s move on and see how to deploy a container to it.

Before getting started, make sure the appropriate Google Cloud project is selected and billing is enabled for it.

There are multiple ways to deploy to Cloud Run:

  • Deploying a pre-built container

  • Building and deploying a container from source code

We can create both services and jobs using Cloud Run.

Deploying a pre-built container

  1. Go to Cloud Run either from the navigation menu or from the search menu.

  2. Click Create service:

  • Select Deploy one revision from an existing container image.

  • Either provide our container image URL or click Test with a sample container.

  • In the Region pulldown menu, select the region where we want the service to be located.

  • Under Authentication, select Allow unauthenticated invocations. We can modify permissions as per our use case.

  • Finally, click Create to deploy the container image to Cloud Run and wait for the deployment to finish.

Note: We can configure CPU allocation and autoscaling as needed, and we can also specify security settings and reference secrets.
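
The same deployment can also be scripted. Below is a minimal sketch assuming the google-cloud-run (Cloud Run Admin API v2) Python client; the project ID, region, service name, and image are placeholder values, and granting public access (the equivalent of Allow unauthenticated invocations) would additionally require an IAM binding for roles/run.invoker, which is omitted here:

```python
# Requires: pip install google-cloud-run
# A minimal sketch, assuming the Cloud Run Admin API (v2) Python client;
# the project, region, service name, and image below are placeholders.
from google.cloud import run_v2

PROJECT_ID = "my-project"      # placeholder
REGION = "us-central1"         # placeholder
SERVICE_ID = "hello-service"   # placeholder
IMAGE = "us-docker.pkg.dev/cloudrun/container/hello"  # sample image; any container image URL works


def deploy_service() -> run_v2.Service:
    client = run_v2.ServicesClient()
    parent = f"projects/{PROJECT_ID}/locations/{REGION}"

    # Describe one revision running a single container image.
    service = run_v2.Service(
        template=run_v2.RevisionTemplate(
            containers=[run_v2.Container(image=IMAGE)],
        ),
    )

    # create_service returns a long-running operation; result() blocks
    # until the first revision is deployed and ready.
    operation = client.create_service(
        parent=parent,
        service=service,
        service_id=SERVICE_ID,
    )
    deployed = operation.result()
    print(f"Deployed: {deployed.uri}")
    return deployed


if __name__ == "__main__":
    deploy_service()
```

Because create_service returns a long-running operation, calling result() hands back the deployed service, including the HTTPS endpoint mentioned earlier.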

Conclusion

Even though Kubernetes can appear complicated, serverless offerings such as Cloud Run can simplify and speed up the process of building and shipping applications.
