Jack Goggin

My Cloud Resume Challenge

Context

For a quick background on myself: I have a BSc in Digital & Technology Solutions and roughly two years' experience as an apprentice Software Engineer, working on full-stack development with a touch of automated testing and containerization.

So whilst I had the foundations for this challenge with HTML/CSS/JavaScript, databases, servers, Python and more, I didn't have any experience working with cloud resources.

I took it upon myself to get the AWS Cloud Practitioner certification and take on Forrest Brazeal's Cloud Resume Challenge. Within a couple of months I had built a resume website on the AWS cloud, building and configuring services along the way.

In a nutshell, the project involved: creating the front-end and hosting it in S3; domain registration and DNS configuration; AWS SAM and CloudFormation for building the serverless backend stack, comprised of AWS API Gateway + Lambda (Python) + DynamoDB; and finally, continuous integration and deployment using GitHub Actions. My codebase was stored on GitHub and I incorporated unit tests into my CI/CD pipeline.

In the interest of not reinventing the wheel - here's what the finished project's architecture looked like:

[Architecture diagram]

ADVICE

A lifesaver on this project, and one not mentioned in the guides, is to leverage CloudWatch logs for Lambda troubleshooting. Paired with the Postman application, this approach streamlined function testing.
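
For example, anything the Lambda handler writes with Python's logging module (or even a plain print) ends up in the function's CloudWatch log group. A minimal sketch, with purely illustrative names and return values:

```python
import json
import logging

logger = logging.getLogger()
logger.setLevel(logging.INFO)

def lambda_handler(event, context):
    # Everything logged here appears in the function's CloudWatch log group,
    # so you can see exactly what the function received and returned.
    logger.info("Received event: %s", json.dumps(event))
    try:
        # ... real work would go here (e.g. updating the visitor counter) ...
        result = {"count": 42}  # placeholder value for illustration
        logger.info("Returning result: %s", result)
        return {"statusCode": 200, "body": json.dumps(result)}
    except Exception:
        logger.exception("Unhandled error while processing request")
        raise
```

Hitting the API Gateway endpoint from Postman and then reading these log lines in CloudWatch makes it much quicker to see where a request is going wrong.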

Front-end - creating the website

I started by creating the base webpage itself, finding an online HTML and CSS template - the Jackson template by Aigars Silkalns - which I customized to my needs.

I then created an S3 bucket to store the files that make up my front-end (HTML/CSS/JS etc.) in the AWS cloud, and secured this store with HTTPS using AWS CloudFront.

AWS CloudFront is a service that uses distributed caches to strategically place copies of your content worldwide. This reduces latency by delivering content from the nearest cache. It also enhances scalability by offloading the origin server, saves bandwidth and ensures a more reliable user experience with high availability. This optimizes load balancing and contributes to an overall responsive, scalable and reliable content delivery system, in addition to the perk of adding HTTPS for extra security.

I then purchased a domain name (jackgoggin.com) via Amazon Route 53, a DNS service used to register and manage domain names. I then used AWS Certificate Manager to retrieve the appropriate SSL/TLS certificates to encrypt data in transit, ensuring security and verifying the authenticity of my site. This setup ensures secure traffic routing from jackgoggin.com to my CloudFront distribution, which directs to my S3 bucket where the webpage content is retrieved and delivered.


Useful resources:

Back-end

This involved creating a counter which tracked how many times my webpage was visited.

The steps included:

  1. Create a database with DynamoDB (a NoSQL database), holding a count value that tracks the visitor count over time.
  2. Create an API Gateway endpoint - essentially a REST GET endpoint that, when hit, calls a Python function created with AWS Lambda, which updates the counter in the database (see the sketch after this list).
  3. Ensure the Lambda function has been assigned an appropriate IAM role with permissions to access and edit the DynamoDB table. This role ensures secure and controlled interactions between services, so nothing gets changed in error.
  4. Enable CORS (Cross-Origin Resource Sharing) to relax the rules on our endpoint. Enabling CORS is like opening the gates for secure communication between different websites (or web applications): it allows a web page from one domain to request and receive resources, such as data or images, from another domain.
  5. Create a JavaScript function that sends a GET request to our GET endpoint within AWS API Gateway, to update the visitor counter each time a user visits the site.
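
As a rough illustration of steps 2-4, the Lambda function could look something like the sketch below. The table name 'VisitorCounter' and the key/attribute names are placeholders, not necessarily what I used:

```python
import json
import boto3

# Placeholder table and attribute names; adjust to match your own DynamoDB schema.
dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("VisitorCounter")

def lambda_handler(event, context):
    # Atomically increment the single counter item (ADD creates it on first use).
    response = table.update_item(
        Key={"id": "visitors"},
        UpdateExpression="ADD visit_count :inc",
        ExpressionAttributeValues={":inc": 1},
        ReturnValues="UPDATED_NEW",
    )
    count = int(response["Attributes"]["visit_count"])

    return {
        "statusCode": 200,
        # CORS header so the browser accepts the response from a different origin.
        "headers": {"Access-Control-Allow-Origin": "*"},
        "body": json.dumps({"count": count}),
    }
```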


Useful resources:

Testing of Lambda functions

At a high level, you can unit test your Lambda function by simply mocking the DynamoDB database with the Python library Moto. This means you create a dummy database within your code and feed it to your Lambda function, which uses it as the database and gives you back an expected JSON response that you can test against. Easy!
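
As a sketch of what that can look like (assuming a handler module like the one above, here called visitor_counter, and moto 5's mock_aws decorator - older moto releases expose mock_dynamodb instead):

```python
import json
import os

import boto3
from moto import mock_aws  # moto >= 5; older releases use mock_dynamodb


@mock_aws
def test_visitor_counter_increments():
    # Give boto3 a region when running locally with no AWS config.
    os.environ["AWS_DEFAULT_REGION"] = "eu-west-2"

    # Create the dummy table the handler expects, inside the mocked AWS environment.
    dynamodb = boto3.resource("dynamodb")
    dynamodb.create_table(
        TableName="VisitorCounter",
        KeySchema=[{"AttributeName": "id", "KeyType": "HASH"}],
        AttributeDefinitions=[{"AttributeName": "id", "AttributeType": "S"}],
        BillingMode="PAY_PER_REQUEST",
    )

    # Import here so the handler's boto3 resource is created against the mocked service.
    # "visitor_counter" is a placeholder module name for the Lambda code.
    from visitor_counter import lambda_handler

    response = lambda_handler({}, None)

    assert response["statusCode"] == 200
    assert json.loads(response["body"])["count"] == 1
```

Run it with pytest and no real AWS resources (or credentials) are touched.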

Automated tests play a key role in the world of software development, using scripts and tools to exercise changes to our codebase and verify their outcomes. This ensures we don't commit any game-breaking issues into production! It frees up valuable time for developers, allowing them to concentrate on refining features and tackling intricate scenarios. This not only speeds up the development lifecycle but also ensures that our code consistently meets high-quality standards.

Source control and CI/CD

GitHub Actions is a continuous integration and continuous delivery (CI/CD) platform that allows you to automate your build, test, and deployment pipeline. You can create workflows that build and test every pull request to your repository, or deploy merged pull requests to production. - GitHub docs

This can be leveraged so that when I make a change to my webpage's core codebase within my IDE and push it to GitHub, it will automatically update my AWS services as needed. This saves me from having to, for example:

  • Upload file changes to S3 myself
  • Change my Lambda functions manually
  • Refresh new services each time to make sure updates are implemented

To implement this, we simply set up a '.github/workflows' directory in our repo and create an INSERT_NAME.yml file that defines a series of jobs for GitHub to run when a push is made to the repository.

Before doing this, you need to set up an IAM user with access keys that grant GitHub the appropriate permissions to start editing our AWS resources!

Within this workflow YAML file we are simply going to set up this order of actions (a trimmed-down sketch follows the list):

  1. Test our code.
  2. If step 1 was successful, build and manage our infrastructure (the SAM file still needs to be implemented for this step), e.g. upload any infrastructure changes or changes to our Lambda function.
  3. Sync our front-end files with the S3 bucket, e.g. upload any changes to our website. (Also make sure to invalidate the CloudFront cache so any changes are properly served.)
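
A workflow along those lines might look like the sketch below. The file paths, bucket name, region and secret names are placeholders, and the access keys from the IAM setup above need to be added as repository secrets first:

```yaml
# .github/workflows/deploy.yml -- illustrative sketch only
name: Test and deploy

on:
  push:
    branches: [main]

jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.12"
      - run: pip install -r requirements.txt   # project dependencies, incl. pytest and moto
      - run: pytest                             # step 1: run the unit tests

  deploy:
    needs: test            # steps 2 and 3 only run if the tests pass
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: aws-actions/setup-sam@v2
      - uses: aws-actions/configure-aws-credentials@v4
        with:
          aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
          aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          aws-region: eu-west-2
      # Step 2: build and deploy the SAM-defined infrastructure
      # (assumes a samconfig.toml committed from an earlier 'sam deploy --guided')
      - run: sam build --use-container
      - run: sam deploy --no-confirm-changeset --no-fail-on-empty-changeset
      # Step 3: sync the front-end files and invalidate the CloudFront cache
      - run: aws s3 sync ./frontend s3://my-resume-bucket --delete
      - run: aws cloudfront create-invalidation --distribution-id ${{ secrets.CLOUDFRONT_DISTRIBUTION_ID }} --paths "/*"
```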

Useful resources:

Infrastructure as Code


This was the hardest part of the project, for me at least, but also the most rewarding in terms of learning and project value. By implementing IaC, instead of clicking around in the AWS console, I can define everything I need in one file and execute a single command to spin it all up or update it. This obviously improves efficiency, but also consistency, agility, reusability, tracking and security.

To achieve this, I used AWS SAM. To use it you will need the AWS CLI installed, and Docker is another helpful tool - I recommend following Chris' blog linked in the resources below. Then all you need to do is create a 'template.yaml' file at the root of your project, define your resources in that file, and deploy it with 'sam build --use-container && sam deploy --guided' (once you have set up SAM with the AWS CLI). The SAM file should define all the things we have already discussed, like your S3 bucket, bucket policy, CloudFront distribution, Route 53 DNS records, certificates, DynamoDB table, Lambda function and API Gateway.
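
As a hedged illustration, an excerpt of such a template for the back-end pieces might look like this (logical names, the table name and the code path are placeholders; a full template for this project would also cover the S3 bucket, bucket policy, CloudFront distribution, certificate and Route 53 records):

```yaml
# template.yaml -- back-end excerpt only; names and paths are illustrative
AWSTemplateFormatVersion: "2010-09-09"
Transform: AWS::Serverless-2016-10-31

Resources:
  VisitorCounterTable:
    Type: AWS::Serverless::SimpleTable
    Properties:
      TableName: VisitorCounter
      PrimaryKey:
        Name: id
        Type: String

  VisitorCounterFunction:
    Type: AWS::Serverless::Function
    Properties:
      Handler: visitor_counter.lambda_handler
      Runtime: python3.12
      CodeUri: backend/
      Policies:
        - DynamoDBCrudPolicy:          # scoped IAM permissions for the table
            TableName: !Ref VisitorCounterTable
      Events:
        GetCount:
          Type: Api                    # creates the API Gateway GET endpoint
          Properties:
            Path: /count
            Method: get
```

Running 'sam build --use-container && sam deploy --guided' then creates or updates all of these resources as a single CloudFormation stack.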

Useful resources:

Summary

This project was worth its weight in gold, allowing me to cement the skills I had learned from the Cloud Practitioner examination through real-life implementation. I highly recommend it to anyone who wants to learn cloud development!

Final product.
