GCP Cloud Resume Challenge πŸš€

A Cloud Guru has started a challenge for aspiring Cloud Engineers/Architects. This challenge is ideal as a next step for learners who have completed the GCP fundamentals videos on Coursera and done several hands-on labs on Qwiklabs. It simulates a real-world project where you are given a set of tasks to deliver before a deadline.

By completing this challenge you will be able to build a serverless application on GCP with an API backend and GitOps-based CI/CD. Throughout this project we prioritize GCP-native features, such as using Cloud Build triggers instead of GitHub Actions, because we want to be familiar with all the nitty-gritty bits of the GCP ecosystem.

"Serverless" refers to building & deploying your web applications without managing any infrastructure. Examples of Google's serverless offerings are Cloud Run, Cloud Function and App Engine.

Some advantages of serverless applications are that they are fully managed and support autoscaling out of the box. All we need to focus on is building our application. However, with this level of deployment abstraction we are dependent on the cloud provider and risk vendor lock-in. By using GCP we are able to minimize this risk, as it adopts an "open cloud" philosophy with a no-lock-in approach: if you develop a workflow on Google Cloud Platform (GCP), it should be easily portable to other environments.

Run-Time Architecture

(Run-time architecture diagram)

Static files are served from Cloud Storage and are publicly available. Publicly readable objects are cached in the Cloud Storage network by default, which already improves performance. For this challenge we opt to use Cloud CDN for caching instead, as it supports custom domains over HTTPS and cache invalidation. Here is a detailed comparison between the two caching methods.

Cloud CDN requires Cloud Load Balancing (CLB) to serve content, and most of its configuration is done through CLB. We can select a multi-region bucket, which automatically replicates objects across multiple Google Cloud regions. This improves the availability of our content and the failure tolerance of our application.
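
For illustration, a minimal Terraform sketch of this part of the setup could look like the following; the bucket and resource names are placeholders, not the ones used in my repos.

```hcl
# Multi-region bucket holding the static site (names are hypothetical).
resource "google_storage_bucket" "frontend" {
  name     = "my-resume-frontend-bucket"
  location = "US" # multi-region: objects replicated across several regions
  website {
    main_page_suffix = "index.html"
  }
}

# Make the objects publicly readable so the site can be served anonymously.
resource "google_storage_bucket_iam_member" "public_read" {
  bucket = google_storage_bucket.frontend.name
  role   = "roles/storage.objectViewer"
  member = "allUsers"
}

# Backend bucket the load balancer points at, with Cloud CDN enabled.
resource "google_compute_backend_bucket" "frontend_cdn" {
  name        = "frontend-backend-bucket"
  bucket_name = google_storage_bucket.frontend.name
  enable_cdn  = true
}
```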

Next we reserve a static IP address for our CLB and, at the same time, point our domain to Google's nameservers via a zone in Cloud DNS. Now we have a functional data flow from our frontend bucket to our user's browser.
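
The reserved address and DNS zone can be sketched in Terraform roughly like this (the domain and names below are placeholders):

```hcl
# Global static IP for the HTTPS load balancer.
resource "google_compute_global_address" "lb_ip" {
  name = "resume-lb-ip"
}

# Managed zone for the custom domain (replace example.com with your own).
resource "google_dns_managed_zone" "resume" {
  name     = "resume-zone"
  dns_name = "example.com."
}

# A record pointing the domain at the load balancer's reserved address.
resource "google_dns_record_set" "root_a" {
  name         = google_dns_managed_zone.resume.dns_name
  managed_zone = google_dns_managed_zone.resume.name
  type         = "A"
  ttl          = 300
  rrdatas      = [google_compute_global_address.lb_ip.address]
}
```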

Within our static website we embed a JavaScript snippet that tracks the total number of visitors. We developed our backend API using Flask and Gunicorn to retrieve and update the visitor count in Cloud Firestore using a data-sharding technique. We configure Firestore to only allow authenticated users, which in this case is our backend API. We also enable CORS to only allow origins from our domain; this provides another layer of security to prevent bots from abusing our API. We then containerize our backend as a Docker image and deploy it as a serverless application on Cloud Run.

Whenever someone visits our website, the JavaScript makes a call to our backend API, which immediately spins up our backend application through Cloud Run.

These on-demand instances bring the running cost of web applications down to a minimum, as we are charged only for the time used to run the code. We now have a readily scalable, high-performance, secure and sustainable website on GCP.

Dev-Time Architecture

(Dev-time architecture diagram)

For the DevOps part of this challenge, we followed a GitOps-style approach. We have 3 different repositories (Frontend, Backend, Infra) on GitHub, which we then mirror to Cloud Source Repositories. When we push to the frontend repository, it uploads the latest static files to our bucket and invalidates the Cloud CDN cache. When we push to our backend repository, it triggers a Cloud Build, builds a container image, pushes it to Artifact Registry, redeploys Cloud Run using the latest image from Artifact Registry and invalidates the Cloud CDN cache.

Cloud Build is GCP's serverless CI/CD platform. Any change in the repository results in a new instance of Cloud Build being spun up, which builds according to the config YAML file that we have provided in our repository.
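
In this project the triggers are created manually (see the usage steps in the Extra section), but for reference a trigger can also be declared in Terraform, roughly like this (the repository, branch and file names are assumptions):

```hcl
# Push-to-branch trigger on the mirrored backend repository.
resource "google_cloudbuild_trigger" "backend" {
  name     = "backend-push-trigger"
  filename = "cloudbuild.yaml" # build steps live in the repo itself

  trigger_template {
    repo_name   = "backend" # Cloud Source Repositories mirror of the GitHub repo
    branch_name = "^main$"
  }
}
```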

We make use of Infrastructure as Code (Terraform) to continuously deploy our serverless application with the latest image of our backend as soon as we push an update to the repository.
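
A simplified sketch of what the Cloud Run piece of that Terraform can look like is shown below; the service name, region and variable name are assumptions, with the image tag passed in by the build so that every push redeploys the latest image.

```hcl
variable "backend_image" {
  description = "Fully qualified image, e.g. REGION-docker.pkg.dev/PROJECT/REPO/backend:TAG"
  type        = string
}

resource "google_cloud_run_service" "backend" {
  name     = "resume-backend"
  location = "asia-southeast1" # pick your region

  template {
    spec {
      containers {
        image = var.backend_image
      }
    }
  }
}

# Allow unauthenticated calls so the visitor-counter script can reach the API.
resource "google_cloud_run_service_iam_member" "invoker" {
  service  = google_cloud_run_service.backend.name
  location = google_cloud_run_service.backend.location
  role     = "roles/run.invoker"
  member   = "allUsers"
}
```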

Conclusion πŸŽ‰

I noticed it takes several seconds for the total visitor count to be displayed on the website; maybe we can explore refactoring our backend API as a Cloud Function instead of deploying it on Cloud Run.

I can even try to finish up the "Download PDF" button with another backend service that converts my resume to a PDF.

For this project I have chosen to use Terraform to continuously deploy our backend image to Cloud Run because I had done some hands-on practice before embarking on this project; alternatively, this can also be done using Google Cloud Deployment Manager.

Next, I am excited to reattempt this challenge using AWS and Azure. I feel that anyone who has completed this challenge has proven they have the fundamental skills to work on that particular cloud provider's platform.

πŸ‘‰ Github:
Frontend
Backend
Infra
Completed Challenge

Extra πŸŽ‰

(Infra repository architecture diagram)

I made an attempt to create the IaC using Terraform for this challenge, which is stored in the infra repository. Ideally it should be able to provision all the necessary resources/roles required to run both the frontend and backend; however, I am facing some problems automating some of the resources (Load Balancer, SSL certificate, etc.).

Currently the IaC in the infra repository will provision the following on a new GCP project (the remote backend and Artifact Registry parts are sketched after the list):

  • Set up Terraform to use a remote GCS backend
  • Create a bucket to store the Terraform state
  • Create an Artifact Registry repository
  • Create Firestore and App Engine
  • Set up Firestore rules
  • Create and populate the Firestore database
  • Enable all necessary APIs
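
For reference, the remote state backend and the Artifact Registry repository in that list could be expressed roughly like this (the bucket name, repository name and region are placeholders):

```hcl
terraform {
  backend "gcs" {
    bucket = "my-terraform-state-bucket" # created separately, holds the state
    prefix = "cloud-resume"
  }
}

# Docker repository for the backend container images.
resource "google_artifact_registry_repository" "backend_images" {
  repository_id = "backend"
  location      = "asia-southeast1"
  format        = "DOCKER"
}
```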

πŸ‘‰ Usage after creating a new GCP project and enabling billing.

  • Enable the Cloud Build API and the Cloud Source Repositories API
  • Connect all 3 GitHub repositories to CSR
  • Create a Cloud Build trigger for each repository
  • Set the Cloud Build service account to Owner (for learning convenience; otherwise the principle of least privilege should be enforced)
  • Trigger the Infra repo first, followed by Frontend and Backend

The following tasks/resources are not included in the IaC because it takes some time for them to be up and running (a rough Terraform sketch of what they could look like follows the list):

  • Reserve a static IP address, create a zone with the custom domain in Cloud DNS
  • Create a Google-managed SSL certificate
  • Create the Load Balancer and connect a backend service to our Cloud Run service and a backend bucket to our static files bucket
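
If these were automated, the Google-managed certificate and the HTTPS frontend of the load balancer might look roughly like the sketch below; it reuses the hypothetical backend bucket and static address from the earlier sketches, and the domain is a placeholder. The managed certificate in particular can take a while to become active after creation.

```hcl
# Google-managed SSL certificate for the custom domain.
resource "google_compute_managed_ssl_certificate" "resume" {
  name = "resume-cert"
  managed {
    domains = ["example.com"]
  }
}

# Route everything to the CDN-backed bucket serving the static site.
resource "google_compute_url_map" "resume" {
  name            = "resume-url-map"
  default_service = google_compute_backend_bucket.frontend_cdn.id
}

resource "google_compute_target_https_proxy" "resume" {
  name             = "resume-https-proxy"
  url_map          = google_compute_url_map.resume.id
  ssl_certificates = [google_compute_managed_ssl_certificate.resume.id]
}

# Expose the proxy on the reserved static IP over HTTPS.
resource "google_compute_global_forwarding_rule" "https" {
  name       = "resume-https-forwarding-rule"
  ip_address = google_compute_global_address.lb_ip.address
  port_range = "443"
  target     = google_compute_target_https_proxy.resume.id
}
```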

Top comments (1)

Evan Dolatowski

This is great, I had a great time completing the Cloud Resume Challenge using Azure! Thank you for sharing your experience