Alex Eversmeyer

Design Narrative: Cloud Resume Challenge

This article was originally posted on my self-hosted blog on April 13, 2021.

Getting started

Back in October of last year (2020), I studied for and earned the AWS Solutions Architect Associate certification. The next day, I thought, "Now what?" I did a little poking around online and discovered Forrest Brazeal's Cloud Résumé Challenge. By then, I had missed out on the code review, but I decided that the challenge would be a great exercise anyway, both to gain experience and expose my knowledge gaps. I was right about both!


Coding the frontend of the resume from scratch, given my long hiatus from web development at that point, seemed like an enormous task, and I felt my time would be better spent on the architecture-related steps. I found a freely-available template and modified it to suit my taste and experience. The template is written in HTML and CSS and proved easy to work with. It also includes a snippet of JavaScript to make the dynamic visitor counter function.

Once my draft of the resume was completed, I uploaded it manually to an S3 bucket. I then set up a CloudFront distribution for fast content delivery with a certificate for HTTPS security, registered a domain in Route53, and pointed the resume's subdomain at the CloudFront distribution. So far, so good.
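In hindsight, that manual upload step can also be scripted. Here's a minimal sketch of how such an upload might look with boto3; the bucket name, directory layout, and helper names are illustrative rather than taken from my actual setup:

```python
import mimetypes
import pathlib


def iter_site_files(site_dir):
    """Yield (local_path, s3_key, content_type) for every file under site_dir.

    Guessing the Content-Type matters: without it, S3 serves HTML with a
    generic binary type and browsers download the page instead of rendering it.
    """
    root = pathlib.Path(site_dir)
    for path in sorted(root.rglob("*")):
        if path.is_file():
            content_type = mimetypes.guess_type(path.name)[0] or "binary/octet-stream"
            yield path, path.relative_to(root).as_posix(), content_type


def upload_site(bucket_name, site_dir="site"):
    """Upload every file in site_dir to the bucket, preserving relative keys."""
    import boto3  # requires AWS credentials at runtime

    s3 = boto3.client("s3")
    for path, key, content_type in iter_site_files(site_dir):
        s3.upload_file(str(path), bucket_name, key,
                       ExtraArgs={"ContentType": content_type})
```

Splitting the file discovery out of the upload keeps the AWS-touching part small, which pays off once automation enters the picture.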

A brand new concept for me was the continuous integration/continuous deployment (CI/CD) workflow. Since my site's version control was handled by Git and GitHub, it was a matter of finding a GitHub Actions workflow that uploaded my code to S3 and invalidated the CloudFront distribution every time I pushed an update, so that the latest copy of my site would always be visible. I set up some Secrets for my AWS resource names and credentials, added the workflow to my next commit, and watched as the whole thing failed! The issue was with the bucket name I had stored; once it was entered properly, the workflow succeeded and the frontend was basically complete (other than content editing).
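For reference, such a frontend workflow can be sketched roughly like this; the secret names, action versions, and region below are illustrative, not copied from my repository:

```yaml
name: Deploy frontend
on:
  push:
    branches: [main]

jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - name: Configure AWS credentials
        uses: aws-actions/configure-aws-credentials@v2
        with:
          aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
          aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          aws-region: us-east-1
      - name: Sync site to S3
        run: aws s3 sync . s3://${{ secrets.AWS_S3_BUCKET }} --delete
      - name: Invalidate CloudFront cache
        run: >
          aws cloudfront create-invalidation
          --distribution-id ${{ secrets.CLOUDFRONT_DISTRIBUTION_ID }}
          --paths "/*"
```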


The backend consists of an AWS Lambda function, invoked by the site's JavaScript through Amazon API Gateway, that communicates with a DynamoDB table. At this point, I hadn't written much code at all, nor had I worked with any of these services in any appreciable way, so the struggle became real very quickly. With the assistance of a friend with a lot of coding experience, I did attempt to read the Lambda and boto3 documentation and fudge my way through some code. It soon became apparent that I couldn't handle the code on my own, however, and I had to start reviewing other people's code to figure out a solution.

I eventually came up with something that I thought might work, so I set up a DynamoDB table, an API Gateway, and the Lambda code. However, I was unable to get everything to talk to each other, largely due to misconfiguration of the API (I think). At this point, I was feeling pretty frustrated with my lack of knowledge and inability to figure these challenges out despite hours of internet searches and reading. I didn't want to outright 'cheat' by simply copying someone's repository, but I was at a loss for what to do, and took a day or two off to let my brain reset.

Refreshed, I decided to look into another requirement of the project: using the Serverless Application Model to provision my backend resources as code. I found a marvelous blog post outlining a similar project, downloaded the AWS SAM CLI, and gave it a whirl. Lo and behold, it was like magic: Amazon did all the work of configuring resources to work with each other, and I had the ability to test my code. It took a few more hours (and some mind-melting frustration) to get my Lambda code to atomically increment a value in my database, but some methodical experimentation - a skill I improved later while formally learning Python over the winter - led to working code.
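For anyone curious, an atomic increment in DynamoDB boils down to an ADD update expression. A minimal sketch of such a Lambda handler — the table, key, and attribute names here are illustrative, not my actual ones:

```python
import json


def increment_visits(table, site_key="resume-site"):
    """Atomically increment the visitor count and return the new value.

    `table` is a boto3 DynamoDB Table resource; the ADD expression lets
    DynamoDB do the increment server-side, so concurrent visitors can't
    clobber each other's counts.
    """
    response = table.update_item(
        Key={"site": site_key},
        UpdateExpression="ADD visits :one",
        ExpressionAttributeValues={":one": 1},
        ReturnValues="UPDATED_NEW",
    )
    # DynamoDB returns numbers as Decimal; int() normalizes for JSON.
    return int(response["Attributes"]["visits"])


def lambda_handler(event, context):
    import boto3  # resolved at runtime inside AWS Lambda

    table = boto3.resource("dynamodb").Table("visitor-counter")
    count = increment_visits(table)
    return {
        "statusCode": 200,
        "headers": {"Access-Control-Allow-Origin": "*"},
        "body": json.dumps({"count": count}),
    }
```

Keeping the increment in its own function, with the table passed in, also makes it straightforward to exercise with a stand-in table object when unit testing.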

Another GitHub Actions workflow, this time triggering the SAM CLI to build and deploy my code, was set up in the backend repository, and I had now fulfilled (almost) all of the project requirements. One thing I did not get to was unit-testing my Python code. Learning unit-testing is on my shortlist of future projects but hasn't happened yet.
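A sketch of what that backend workflow might look like (again, the secret names and action versions are illustrative):

```yaml
name: Deploy backend
on:
  push:
    branches: [main]

jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - uses: aws-actions/setup-sam@v2
      - uses: aws-actions/configure-aws-credentials@v2
        with:
          aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
          aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          aws-region: us-east-1
      - run: sam build
      - run: sam deploy --no-confirm-changeset --no-fail-on-empty-changeset
```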

Wrapping up

I have finally reached a point where I'm ready to tidy up the content of my resume, as part of a bigger push to prepare for the impending job search. This blog post was the other as-yet-incomplete part of the project, but now both it and my resume are live and ready for prime time! I'd like to think this challenge would go quite differently now than it did six months ago, since I have much more Python experience and more time playing with AWS resources. Nevertheless, it was incredibly instructive at the time and helped set my path forward as I continue learning and progressing toward a new career.

Click here to see my resume!

Frontend GitHub repository

Backend GitHub repository
