Introduction
As part of my goal to expand my knowledge of cloud technologies, I undertook the Cloud Resume Challenge. This project involved building a fully functional web-based resume hosted in the cloud and built on serverless technologies. The challenge offered a valuable opportunity to explore, learn, and experiment with serverless tooling, allowing me to focus on key areas such as Infrastructure-as-Code (IaC), CI/CD, and cloud deployment.
Prerequisites
Embarking on my cloud journey, I aimed to acquire a foundational certification from one of the major cloud providers. I chose Azure due to my experience with their Entra ID offering. With the AZ-900 certification under my belt, I was ready to dive headfirst into this challenge. The project was divided into six bite-sized chunks, with each chunk containing various challenges for me to conquer. This post will detail my experiences with the project and the insights I gained from completing each stage.
Front End
The first goal was to convert my resume into an HTML/CSS website and host it on Azure Blob Storage with a custom domain configured through Azure DNS. The website would then use Azure CDN for improved reliability.
For this portion, I used an HTML resume template I found online that fit my needs, since I don't have much experience with either HTML or CSS and I wanted to focus on the cloud aspects of this project. Additionally, I created a JavaScript file for the front end that makes the page interactive by calling my self-built API.
One step I would have liked to implement, given more time with this project, is end-to-end testing with Cypress.
API
This next chunk challenged me quite a bit, as I'd never written any sort of API that interacts with a storage service. One of the goals was to create a Python function app in Azure that talks to a Cosmos DB table, with deployment set up via GitHub Actions. At first, I did not know how to configure my function app to handle POST, GET, or other request types.
Through research, I learned that I could pass the allowed methods to the route decorator. Here's how my functions looked afterward:
import logging
import azure.functions as func

app = func.FunctionApp()

# GET request
@app.route(route="readDB", auth_level=func.AuthLevel.ANONYMOUS, methods=['GET'])
def readDB(req: func.HttpRequest) -> func.HttpResponse:
    logging.info("Received GET request")
    # ... read the visitor count from Cosmos DB and return it

# POST request
@app.route(route="updateDB", auth_level=func.AuthLevel.ANONYMOUS, methods=['POST'])
def updateDB(req: func.HttpRequest) -> func.HttpResponse:
    logging.info("Received POST request")
    # ... update the visitor count in Cosmos DB
Once this was completed, I also created tests using Playwright. The tests ensured both GET and POST requests returned a 200 response status, the GET request returned an integer, and the POST request updated the visitor counter.
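Here's a minimal sketch of those tests, assuming Playwright's Python API and a hypothetical function app URL (increment-by-one is also an assumption):

from playwright.sync_api import sync_playwright

BASE_URL = "https://my-resume-api.azurewebsites.net/api"  # hypothetical URL

def test_visitor_counter():
    with sync_playwright() as p:
        ctx = p.request.new_context()
        # GET should return 200 and an integer visitor count
        read = ctx.get(f"{BASE_URL}/readDB")
        assert read.status == 200
        count = int(read.text())
        # POST should return 200 and update the counter
        update = ctx.post(f"{BASE_URL}/updateDB")
        assert update.status == 200
        # A fresh GET should reflect the update
        assert int(ctx.get(f"{BASE_URL}/readDB").text()) == count + 1
        ctx.dispose()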
Infrastructure-as-Code
My tool of choice for IaC is Terraform because of its cloud-agnostic capabilities. For testing, I deployed my infrastructure with Terraform locally before moving to Terraform Cloud with GitHub Actions.
If I were to do the project again, I would host my Terraform state in Azure Blob Storage for ease of use. Creating a destroy plan proved challenging: I couldn't use GitHub Actions to send a destroy plan to my Terraform Cloud workspace, so I added an Azure CLI step to my GitHub Actions workflow to delete the resource group instead.
CI/CD
For my CI/CD pipeline, I chose GitHub Actions, as I was already using GitHub to store all my source code. I built my YAML files based on this resource from HashiCorp. For this step, I hosted my front end and back end in separate repositories. For my front end, I created a simple pipeline to deploy the Terraform infrastructure and then upload my front-end code to blob storage via the Azure CLI.
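To give a rough idea of what that upload step does (my pipeline used the Azure CLI, but an equivalent sketch in Python with the azure-storage-blob SDK, assuming a hypothetical connection string and build directory, looks like this):

import os
from azure.storage.blob import ContainerClient

# "$web" is the container Azure uses for static website hosting;
# the connection string here is a hypothetical app setting
container = ContainerClient.from_connection_string(
    os.environ["AZURE_STORAGE_CONNECTION_STRING"], container_name="$web"
)

SITE_DIR = "site"  # hypothetical local build directory
for root, _, files in os.walk(SITE_DIR):
    for name in files:
        path = os.path.join(root, name)
        blob_name = os.path.relpath(path, SITE_DIR).replace(os.sep, "/")
        with open(path, "rb") as data:
            container.upload_blob(blob_name, data, overwrite=True)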
For my back end, I created three YAML files. The first was triggered by a pull request from a branch other than main. It deployed my test infrastructure, deployed the function app via the Azure CLI, and tested the API with Playwright.
Additionally, I implemented branch protection rules requiring all checks to pass before merging into the main branch. The next YAML file was triggered when a pull request was closed; it deleted my test infrastructure to avoid incurring charges from a lingering test environment. The final YAML file was triggered when a pull request was merged: it deployed my production infrastructure and the Azure function app, then ran tests to verify the API and front end worked together.
I ran into many errors here, but through countless trial-and-error runs, I completed this portion successfully.
Wrapping Up
Working on this project was a fantastic learning experience. Not only did I get to dive deep into Azure and cloud technologies, but I also got hands-on experience with tools like Terraform, CI/CD pipelines, and end-to-end testing. These are all valuable skills in today's tech world, and ones I find fascinating.
This challenge was just the beginning for me, and I'm excited to continue building on what I've learned through this project. Next up, I'm actively preparing for the Azure Administrator Associate certification, focusing on honing my skills in deploying and managing Azure resources.
https://github.com/panduhzz/CRCPanduhz_terraform
https://github.com/panduhzz/CRC_panduhz_tf_backend
https://www.panduhz.com/$web/index.html