Serverless was a big shift for organizations when it came out. The entire pitch (depending on who you spoke with) was that it would minimize, or even remove, the need to manage underlying infrastructure. Developers thought they would be able to write code and just run it without having to do any configuration.
Unfortunately, that did not pan out, but a lot of good still came out of it.
In this blog post, you’ll learn what serverless is, what GCP’s version of serverless is, and how to configure it.
What Is Serverless
Here’s a high-level explanation of Serverless:
You write code that you want to execute based on a certain event, the code is hooked up to a trigger for that event, and when the event occurs, the code runs.
This sounds great in theory, and a lot of engineers loved the idea that they could write code and just run it, but it turns out that Serverless is better suited for event-driven workloads. Event-driven workloads can mean many things, but in this case, it's code that gets kicked off by a particular event that occurs in your environment. It could also be something as simple as a script that runs.
For example, let's say you have to query particular workloads every few seconds. Those workloads could be retrieving real-time data, and you have to ensure that the frontend is constantly updated with the new data. Cloud Run can run a Job that executes your code or container to completion, and in this case, running to completion means fetching the data.
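To make that scenario concrete, here's a minimal sketch of what it could look like as a Cloud Run Job using the gcloud CLI. The job name, image, and region are hypothetical placeholders, not values from this post:

```
# Create a Cloud Run Job that runs a container to completion to fetch data.
# The job name, image, and region below are placeholder values.
gcloud run jobs create fetch-data \
  --image us-docker.pkg.dev/my-project/my-repo/data-fetcher:latest \
  --region us-central1

# Kick off the job on demand. It could also be triggered on a schedule
# with Cloud Scheduler.
gcloud run jobs execute fetch-data --region us-central1
```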
The goal with Serverless is to run code and/or containers based on a particular event that takes place.
What Is GCP Cloud Run
There are many Serverless providers, but some of them are taking an extra step forward and acting more like an orchestrator than a simple event-triggered workload runner.
GCP is taking this leap.
Cloud Run gives you the ability to run both Jobs (code or a container that executes until completion) and Services, which are long-running workloads. The interesting thing about Cloud Run Services is that, from a containerization perspective, they can almost act like a serverless orchestrator, comparable to other platforms like EKS Fargate profiles or even GKE Autopilot.
💡 GKE Autopilot is “Serverless Kubernetes”. You no longer have to manage Worker Nodes if you use GKE Autopilot. It's a great implementation, but there are some limitations with GKE Autopilot because the entirety of the infrastructure is abstracted away from you.
Where Cloud Run shines is when you want to manage multiple containers separately (there's no sidecar container support) to run application stacks or jobs. It's great if your application stack is fully decoupled, but it may cause a headache if you need a few parts of your application stack in one place. If you're already running containers, or you've created a container image and confirmed that your application stack works in a containerized environment, you should have no trouble.
Configuring GCP Cloud Run With Code
Now that you know a bit about serverless, let’s learn how to configure GCP Cloud Run with a custom Container Image.
- In the GCP console, search for Cloud Run.
- Within the Cloud Run panel, click the blue + CREATE SERVICE button.
- The service will require a container image. You can use an example container image or the container image below.
💡 I wrote the adminturneddevops/golangwebapi container image below. It's a simple web-based Go (golang) API that's great for testing.
adminturneddevops/golangwebapi
- Configure the region, authentication and CPU allocation for the container image.
- Set autoscaling and ingress capabilities.
💡 This is where a tool like Cloud Run shines. It combines serverless with the ability to optimize resources for the containers you're running.
- Under the Containers, Volumes, Networking, and Security section, you can customize your container with ports, the container name, volumes, networking, security, and any other available customization option, including resource optimization for CPU and memory within the containers.
- When finished, click the blue CREATE button.
💡 A cool feature when using cloud-based platforms like GCP is the ability to use the integrated third-party tools and services that are available within the cloud. For example, within Cloud Run, you can connect to a SQL server running on GCP.
- After a few moments, you should see the container running.
GCP Cloud Run on The Terminal
Aside from running Cloud Run in the portal, you can also use the gcloud CLI, which is the terminal/programmatic method of interacting with GCP.
To deploy a container in Cloud Run via gcloud, you can use the run command.
Below is an example.
gcloud run deploy gowebapi --image adminturneddevops/golangwebapi
After you run the command above, you’ll have a few prompts to answer.
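If you'd rather not answer the prompts interactively, the settings from the console walkthrough (region, authentication, resources, autoscaling, and ingress) can also be passed as flags. Below is a sketch with example values; the region, port, and resource numbers are assumptions you'd adjust for your own environment:

```
# Example values only; adjust the region, port, and resource settings
# to match your environment and what your container expects.
gcloud run deploy gowebapi \
  --image adminturneddevops/golangwebapi \
  --region us-central1 \
  --allow-unauthenticated \
  --port 8080 \
  --cpu 1 \
  --memory 512Mi \
  --min-instances 0 \
  --max-instances 3 \
  --ingress all
```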
You should now see the container getting deployed.
After the deployment is complete, you’ll see the container show up in the Cloud Run portal.
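You can also verify the deployment from the terminal. For example (assuming the same service name and whichever region you deployed to):

```
# List the Cloud Run services in the current project
gcloud run services list

# Show the details (URL, revisions, traffic) for the service deployed above
gcloud run services describe gowebapi --region us-central1
```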
Top comments
Thanks for your content
Thanks for the content. Serverless is more relevant now than ever in my personal opinion. It has matured beyond simple FaaS frameworks and unlocks a lot of potential for developers, process engineers, data analysts, and now even data scientists to focus on their work and not on infrastructure.
I honestly think we as a society may begin to outgrow the term "serverless" and see something different. This is largely because people often believe that Serverless = FaaS but that's just not true anymore.
Yeah, I agree with you. It took me a bit to wrap my head around the "new age serverless" stuff as well because I've been thinking about Serverless as only something that can run event-driven applications... but that's not the case anymore.