William Lewis

The Cloud Resume Challenge: My DevOps Journey from Building Technology to the Cloud

As a Design Technologist for the built environment who has specialized in Building Information Modeling (BIM) for a decade, I’ve developed a passion for improving workflows and streamlining infrastructure. Automation is key to scaling rapidly and maintaining data consistency across a multitude of ongoing architectural design projects. This is often achieved by deploying generative or quality assurance scripts into the toolbar of Autodesk Revit:

*Image: 3D model automation tools deployed to Autodesk Revit*

Eager to extend the DevOps experience I'd gained in building-design software, I wanted to showcase the technical skills needed to deploy code through a typical SDLC. I was in search of a personal project that would incorporate a host of cloud services while being complex enough to demonstrate my job readiness for cloud positions.

Enter the “Cloud Resume Challenge”, a 16-step outline created by Forrest Brazeal for deploying a static website on your cloud provider of choice, with a focus on integrating and maintaining backend services using key DevOps automation skills. Having recently become an AWS Certified Solutions Architect - Associate, I was excited to expand my hands-on experience in AWS.

The challenge is meant to champion self-learning as it provides only high-level guidance - not instructions - on how to research and implement core topics such as DNS, APIs, Testing, Infrastructure-as-Code, and CI/CD:

*Image: Personal website architecture in AWS*

All code for my static resume website, william-lewis.com, is viewable in my GitHub repository.


Deploying the Frontend: S3, CloudFront, Certificate Manager, Route 53

After preparing a simple version of my resume in HTML/CSS, I stored these frontend files in an S3 bucket. To enforce an HTTPS connection to the site, I created a CloudFront distribution linked to the bucket using Origin Access Control (OAC), with a TLS certificate attached via AWS Certificate Manager (ACM). Lastly, I pointed my custom domain name to the distribution endpoint using Route 53, creating a DNS hosted zone with a CNAME record for certificate validation plus alias A and AAAA records for the domain itself. My site was now accessible on the public web, over HTTPS, from a custom domain.
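
With OAC, the bucket itself stays private: CloudFront is granted read access through a bucket policy scoped to the one distribution. Here's a minimal sketch of that policy, applied with boto3 (the bucket name, account ID, and distribution ID below are placeholders, not my real values):

```python
import json
import boto3

# Hypothetical identifiers for illustration only
BUCKET = "my-resume-site-bucket"
ACCOUNT_ID = "123456789012"
DISTRIBUTION_ID = "E2EXAMPLE123"

# Grant the CloudFront service principal read access, scoped to this
# one distribution via the AWS:SourceArn condition
policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "AllowCloudFrontServicePrincipal",
            "Effect": "Allow",
            "Principal": {"Service": "cloudfront.amazonaws.com"},
            "Action": "s3:GetObject",
            "Resource": f"arn:aws:s3:::{BUCKET}/*",
            "Condition": {
                "StringEquals": {
                    "AWS:SourceArn": f"arn:aws:cloudfront::{ACCOUNT_ID}:distribution/{DISTRIBUTION_ID}"
                }
            },
        }
    ],
}

boto3.client("s3").put_bucket_policy(Bucket=BUCKET, Policy=json.dumps(policy))
```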

Setting Up a Backend: API Gateway, Lambda, DynamoDB, JavaScript

Next, I needed to create the backend components to support a counter of visitors to my personal site. The count is stored in a DynamoDB NoSQL table and accessed by a Lambda function written in Python 3. The function sits behind a REST API created with API Gateway, which invokes the Lambda function when called and, thanks to the “Lambda proxy” integration, forwards the function's response directly back to the caller. Each time the page loads, a short JavaScript snippet uses the Fetch API to call my counter API's endpoint and renders the response in the footer of the page. My site could now fetch and display the latest visitor count, while the Lambda function, as the only component touching the database, handled the incrementing.
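
The handler itself can be quite small. Here's a minimal sketch of the idea (table, key, and attribute names are illustrative, not necessarily what's in my repo): an atomic ADD update so concurrent visits don't race, returned in the response shape that a Lambda proxy integration expects:

```python
import json
import boto3

# Illustrative names; the real table and key names may differ
table = boto3.resource("dynamodb").Table("visitor-count")

def lambda_handler(event, context):
    # Atomic counter: ADD avoids a read-modify-write race between visitors,
    # and creates the item on the first visit if it doesn't exist yet
    result = table.update_item(
        Key={"id": "resume-site"},
        UpdateExpression="ADD visits :inc",
        ExpressionAttributeValues={":inc": 1},
        ReturnValues="UPDATED_NEW",
    )
    count = int(result["Attributes"]["visits"])

    # Lambda proxy integration: API Gateway forwards this shape verbatim
    return {
        "statusCode": 200,
        "headers": {
            "Content-Type": "application/json",
            # CORS header, since the browser calls this from another origin
            "Access-Control-Allow-Origin": "*",
        },
        "body": json.dumps({"count": count}),
    }
```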

Automating the Deployment: Terraform, GitHub, GitHub Actions, & pytest

To pull the pieces together and automate future maintenance of the site, all related code is stored in a public GitHub repository for version control. Portions of the site infrastructure are provisioned with Terraform, which is configured to store its state file in a remote backend: a dedicated S3 bucket. This remote backend is what allows Terraform to apply changes whether it is run from my local PC or from a Linux VM runner in GitHub Actions.

I wrote two CI/CD pipelines using GitHub Actions to update the front and back ends of my site respectively. Upon pushing a commit that edits my HTML page (e.g. resume updates), GitHub Actions uses the OpenID Connect (OIDC) standard to obtain short-lived credentials for my AWS account, then syncs the S3 bucket and invalidates the CloudFront cache to ensure immediate access to the latest files.
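
Conceptually, the deploy job boils down to two AWS calls. The real pipeline uses GitHub Actions steps, but a rough boto3 equivalent of what happens after authentication (with placeholder names) looks like this:

```python
import time
import boto3

# Placeholder identifiers for illustration
BUCKET = "my-resume-site-bucket"
DISTRIBUTION_ID = "E2EXAMPLE123"

# 1. Upload the changed frontend file(s) to S3
s3 = boto3.client("s3")
s3.upload_file("index.html", BUCKET, "index.html",
               ExtraArgs={"ContentType": "text/html"})

# 2. Invalidate the CloudFront cache so edge locations fetch the new files
cloudfront = boto3.client("cloudfront")
cloudfront.create_invalidation(
    DistributionId=DISTRIBUTION_ID,
    InvalidationBatch={
        "Paths": {"Quantity": 1, "Items": ["/*"]},
        "CallerReference": str(time.time()),  # must be unique per request
    },
)
```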

In the backend pipeline, GitHub Actions similarly triggers a workflow from a filtered event and begins by authenticating with AWS. It then uses the pytest framework to run unit tests against the latest Lambda code, importing my function locally and using the Moto library to exercise its behavior against a mock database (avoiding calls to live, billable resources). If those tests pass, Terraform builds a .zip artifact of the Python code and updates the live Lambda function in AWS. As a final step in the automated workflow, a separate pytest integration test pings the live API endpoint to confirm that it is still available and returning the expected response.
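
To give a flavor of the two kinds of test, here's a hedged sketch (assuming moto 5's `mock_aws` decorator; the module, table, and endpoint names are placeholders, not my actual ones):

```python
import boto3
import requests
from moto import mock_aws  # moto >= 5; older releases expose mock_dynamodb instead

@mock_aws
def test_handler_increments_count(monkeypatch):
    # Fake region so boto3 never reaches real AWS
    monkeypatch.setenv("AWS_DEFAULT_REGION", "us-east-1")

    # Stand up an in-memory table matching what the handler expects
    dynamodb = boto3.resource("dynamodb", region_name="us-east-1")
    dynamodb.create_table(
        TableName="visitor-count",
        KeySchema=[{"AttributeName": "id", "KeyType": "HASH"}],
        AttributeDefinitions=[{"AttributeName": "id", "AttributeType": "S"}],
        BillingMode="PAY_PER_REQUEST",
    )

    # Import inside the mock so the module-level boto3 resource is mocked too
    import lambda_function  # placeholder module name
    response = lambda_function.lambda_handler({}, None)

    assert response["statusCode"] == 200
    assert '"count": 1' in response["body"]

def test_live_api_returns_count():
    # Post-deployment integration test against the real, live endpoint
    resp = requests.get("https://api.example.com/count")  # placeholder URL
    assert resp.status_code == 200
    assert "count" in resp.json()
```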


Reflections on the Challenge

  • The most challenging aspect was the surprisingly nonlinear nature of the steps. I would often think I had resolved a problem, only to realize further along that I could have taken a simpler approach or needed to backtrack to modify additional configurations.

  • Multiple steps required more patience than I had anticipated. Instead of homing in on a solution right away, I often had to pause the task at hand and invest time improving my understanding of fundamental concepts (e.g. DNS, RESTful APIs, Test-Driven Development). I’m a big believer in “just-in-time learning”, but part of learning on the go means recognizing when you need to zoom out before zooming back in to solve a problem.

  • I am thankful to have completed the challenge, as it exposed several knowledge gaps and forced me to overcome them. It was frustrating to chip away at a problem without seeing how all the components would fit together, but that difficulty proved most valuable, as it solidified what I learned.

  • Lastly, I would encourage Design Technology professionals to begin learning how to write infrastructure-as-code and CI/CD pipelines ASAP! These practices have been slow to permeate the AEC industry compared to generative design, but they could be a game changer for anyone with programming skills looking to manage digital design infrastructure at scale.

If you liked this post, check out my GitHub profile and connect with me on LinkedIn!
