This article was originally posted on my self-hosted blog on April 13, 2021.
Back in October of last year (2020), I studied for and earned the AWS Solutions Architect Associate certification. The next day, I thought, "Now what?" I did a little poking around online and discovered Forrest Brazeal's Cloud Résumé Challenge. By then, I had missed out on the code review, but I decided that the challenge would be a great exercise anyway, both to gain experience and expose my knowledge gaps. I was right about both!
Once my draft of the resume was complete, I uploaded it manually to an S3 bucket. I then set up a CloudFront distribution for fast content delivery, attached a certificate for HTTPS, registered a domain in Route 53, and pointed the resume's subdomain at the CloudFront distribution. So far, so good.
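The DNS piece of that wiring can be sketched in a few lines. This is only an illustration, not my actual setup: the function below builds the Route 53 change set that aliases a subdomain to a CloudFront distribution (you'd pass the result to boto3's `route53.change_resource_record_sets()`, not shown here), and the domain names are placeholders.

```python
def alias_change_batch(subdomain, cloudfront_domain):
    """Build a Route 53 UPSERT that aliases `subdomain` to CloudFront.

    Z2FDTNDATAQYW2 is the fixed hosted-zone ID AWS assigns to every
    CloudFront alias target, regardless of account.
    """
    return {
        "Changes": [
            {
                "Action": "UPSERT",
                "ResourceRecordSet": {
                    "Name": subdomain,
                    "Type": "A",  # alias records are plain A records in Route 53
                    "AliasTarget": {
                        "HostedZoneId": "Z2FDTNDATAQYW2",
                        "DNSName": cloudfront_domain,
                        "EvaluateTargetHealth": False,
                    },
                },
            }
        ]
    }
```

The alias record (rather than a CNAME) is what lets the apex or a subdomain resolve straight to CloudFront without an extra lookup.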
A brand-new concept for me was the continuous integration/continuous deployment (CI/CD) workflow. Since my site's version control was already handled by Git and GitHub, it was a matter of finding a GitHub Actions workflow that uploaded my code to S3 and invalidated the CloudFront distribution on every push, so that the latest copy of my site would always be visible. I set up some repository Secrets for my AWS resource names and credentials, added the workflow to my next commit, and watched the whole thing fail! The issue was with the bucket name I had stored; once it was entered properly, the workflow succeeded and the frontend was essentially complete (aside from content editing).
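Stripped of the YAML around it, the deploy step boils down to two AWS calls: sync the files to S3, then invalidate the distribution. Here's a rough sketch of that logic in Python with boto3, not the workflow I actually used; the bucket and distribution names are placeholders, and the clients are injectable parameters purely so the function can be exercised without AWS credentials.

```python
import mimetypes
import pathlib
import time

def deploy_site(site_dir, bucket, distribution_id, s3=None, cloudfront=None):
    """Upload every file under site_dir to S3, then invalidate CloudFront.

    s3/cloudfront default to real boto3 clients; they are parameters so
    the function can be tested with stubs.
    """
    if s3 is None or cloudfront is None:
        import boto3  # deferred: only needed for a real deployment
        s3 = s3 or boto3.client("s3")
        cloudfront = cloudfront or boto3.client("cloudfront")

    root = pathlib.Path(site_dir)
    uploaded = []
    for path in sorted(root.rglob("*")):
        if path.is_file():
            key = path.relative_to(root).as_posix()
            # Set Content-Type so browsers render HTML instead of downloading it
            ctype = mimetypes.guess_type(key)[0] or "binary/octet-stream"
            s3.upload_file(str(path), bucket, key, ExtraArgs={"ContentType": ctype})
            uploaded.append(key)

    # Invalidate everything so viewers see the new copy immediately
    cloudfront.create_invalidation(
        DistributionId=distribution_id,
        InvalidationBatch={
            "Paths": {"Quantity": 1, "Items": ["/*"]},
            "CallerReference": str(time.time()),  # must be unique per request
        },
    )
    return uploaded
```

In the real workflow, the equivalent of `bucket` and `distribution_id` lived in those GitHub Secrets, which is exactly where my typo hid.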
The backend, a visitor counter for the site, was another story. I eventually came up with a design I thought might work, so I set up a DynamoDB table, an API Gateway, and the Lambda code. However, I couldn't get everything to talk to each other, largely (I think) due to misconfiguration of the API. At this point, I was feeling pretty frustrated with my knowledge gaps and my inability to solve these problems despite hours of internet searches and reading. I didn't want to outright 'cheat' by simply copying someone's repository, but I was at a loss for what to do, so I took a day or two off to let my brain reset.
Refreshed, I decided to look into another requirement of the project: using the Serverless Application Model to provision my backend resources as code. I found a marvelous blog post outlining a similar project, installed the AWS SAM CLI, and gave it a whirl. Lo and behold, it was like magic: Amazon did all the work of configuring resources to work with each other, and I had the ability to test my code. It took a few more hours (and some mind-melting frustration) to get my Lambda code to atomically increment a value in my database, but methodical experimentation, a skill I sharpened later while formally learning Python over the winter, led to working code.
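The key to the atomic increment is DynamoDB's `ADD` update expression, which does the arithmetic server-side. A minimal sketch of such a handler looks something like this; the table and key names are hypothetical, not my actual resource names, and the `table` parameter is there only so the function can be exercised without AWS.

```python
import json

def lambda_handler(event, context, table=None):
    """Atomically bump the visit count and return it as JSON.

    `table` is injectable for testing; inside Lambda it would default
    to the real DynamoDB table.
    """
    if table is None:
        import boto3  # deferred so the handler can run outside AWS in tests
        table = boto3.resource("dynamodb").Table("visitor-counter")

    resp = table.update_item(
        Key={"id": "resume"},
        # ADD is atomic on the server, so concurrent invocations
        # can't overwrite each other's increments
        UpdateExpression="ADD visits :one",
        ExpressionAttributeValues={":one": 1},
        ReturnValues="UPDATED_NEW",
    )
    count = int(resp["Attributes"]["visits"])  # DynamoDB returns a Decimal
    return {
        "statusCode": 200,
        "headers": {"Access-Control-Allow-Origin": "*"},  # CORS for the resume page
        "body": json.dumps({"visits": count}),
    }
```

The alternative I kept stumbling toward at first, read the value, add one in Python, write it back, has a race condition under concurrent requests, which is exactly what the single `update_item` call avoids.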
I set up another GitHub Actions workflow in the backend repository, this time triggering the SAM CLI to build and deploy my code, and with that I had fulfilled (almost) all of the project requirements. The one thing I did not get to was unit-testing my Python code; learning unit testing is on my shortlist of future projects but hasn't happened yet.
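For the curious, a first test doesn't need much: just the standard library's unittest and a small function to pin down. The helper below is a hypothetical stand-in for one piece of a counter Lambda (shaping the API response), not code from my repository.

```python
import json
import unittest

def make_response(count):
    """Hypothetical helper: shape the API Gateway response for a visit count."""
    return {
        "statusCode": 200,
        "headers": {"Access-Control-Allow-Origin": "*"},
        "body": json.dumps({"visits": count}),
    }

class MakeResponseTest(unittest.TestCase):
    def test_status_code(self):
        self.assertEqual(make_response(1)["statusCode"], 200)

    def test_body_round_trips(self):
        # The body must be a JSON string, so decode it before comparing
        body = json.loads(make_response(7)["body"])
        self.assertEqual(body, {"visits": 7})

# run with: python -m unittest <module name>
```

Pulling pure logic like this out of the handler is what makes it testable without touching AWS at all, which I suspect is half the lesson.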
I've finally reached a point where I'm ready to tidy up the content of my resume, as part of a bigger push to prepare for the impending job search. This blog post was the other as-yet-incomplete part of the project, but now both it and my resume are live and ready for prime time! I'd like to think this challenge would go quite differently now than it did six months ago, since I have much more Python experience and more time playing with AWS resources. Nevertheless, it was incredibly instructive at the time and helped set my path forward as I continue learning and progressing toward a new career.
Click here to see my resume!