ghavasi

The Cloud Resume Challenge

What is this challenge?

The Cloud Resume Challenge is a multi-step resume project that helps you build and demonstrate the skills fundamental to pursuing a career as an AWS Cloud Engineer. The challenge was published by Forrest Brazeal.

The final result can be seen here, and the code is available in this GitHub repo.

Topology diagram

How are the sprints structured?

I joined the August sprint. Sprints take place over four weeks.
Each week, we build a mini-project that’s valuable in its own right.


The tasks and the progression


1. Certification

The AWS Cloud Practitioner certificate is a requirement to complete this challenge.
To earn the certificate, I used only free study materials and even got a 50% discount on the exam fee (normally $100).

2. HTML

The CV needs to be an HTML document styled with CSS. It doesn't need to be an original masterpiece; any HTML CV template is acceptable. You can find mine here for inspiration.

3. CSS

The same goes for CSS. You can write your own, or use Tailwind or something similar.

4. Static Website

The HTML/CSS website should be deployed to an AWS S3 bucket. I did this via a CloudFormation template. The code is below:

 MyWebsite:
    Type: AWS::S3::Bucket
    Properties:
      AccessControl: PublicRead
      WebsiteConfiguration:
        IndexDocument: index.html
      BucketName: cloud-cv-website

5. HTTPS

To serve the site over HTTPS instead of HTTP, a CloudFront distribution is needed. I set it up via CloudFormation the following way:

 MyDistribution:
    Type: "AWS::CloudFront::Distribution"
    Properties:
      DistributionConfig:
        # ViewerCertificate:
        #     AcmCertificateArn: !Ref MyCertificate
        #     SslSupportMethod: sni-only
        # Aliases:
        #   www.gabor-havasi.me
        #   gabor-havasi.me
        DefaultCacheBehavior:
          ViewerProtocolPolicy: allow-all
          TargetOriginId: cloud-cv-website.s3-website-eu-west-1.amazonaws.com
          DefaultTTL: 0
          MinTTL: 0
          MaxTTL: 0
          ForwardedValues:
            QueryString: false
        Origins:
          - DomainName: cloud-cv-website.s3-website-eu-west-1.amazonaws.com
            Id: cloud-cv-website.s3-website-eu-west-1.amazonaws.com
            CustomOriginConfig:
              OriginProtocolPolicy: match-viewer
        Enabled: "true"
        DefaultRootObject: index.html

6. DNS

I used AWS's Route 53 to point a custom domain name (bought on namecheap.com) at the CloudFront distribution set up in the previous step. The following code adds the A records:

 myRoute53:
    Type: "AWS::Route53::RecordSetGroup"
    Properties:
      HostedZoneId: Z0787651UFL9JUUWJ3WQ
      RecordSets:
        - Name: gabor-havasi.me
          Type: A
          AliasTarget:
            HostedZoneId: Z2FDTNDATAQYW2
            DNSName: !GetAtt MyDistribution.DomainName

  MyRoute53www:
    Type: "AWS::Route53::RecordSetGroup"
    Properties:
      HostedZoneId: Z0787651UFL9JUUWJ3WQ
      RecordSets:
        - Name: www.gabor-havasi.me
          Type: A
          AliasTarget:
            HostedZoneId: Z2FDTNDATAQYW2
            DNSName: !GetAtt MyDistribution.DomainName

7. Javascript

The CV webpage should include a visitor counter that displays how many people have accessed the site. This script should be written in JavaScript.
For the counter to work, we need to add a few extra lines to the HTML page and write a small JavaScript snippet.

The script itself:

const counter = document.querySelector(".counter-number");
async function updateCounter() {
  let response = await fetch(
    "https://ig4fgtqyx2.execute-api.eu-west-1.amazonaws.com/Prod"
  );
  let data = await response.json();
  counter.innerHTML = `You are the ${data}. visitor to my Cloud Resume Challenge site`;
}

updateCounter();

The HTML addition:

<div class="counter-number">Couldn't read the counter</div>

8. Database

The visitor counter will need to retrieve and update its count in a database somewhere. I have used DynamoDB for this task as it was the most convenient solution. The code from the CloudFormation template to create the DynamoDB table:

DynamoDBTable:
    Type: AWS::DynamoDB::Table
    Properties:
      TableName: gabor-havasi-cv
      BillingMode: PAY_PER_REQUEST
      AttributeDefinitions:
        - AttributeName: "ID"
          AttributeType: "S"
      KeySchema:
        - AttributeName: "ID"
          KeyType: "HASH"

9. API

We shouldn't communicate directly with DynamoDB from our JavaScript code. Instead, a REST API should be created that accepts requests from the web app and communicates with the database.
I created a single GET method and assigned the Lambda to it. I could have created separate GET and POST methods to read the counter and update it, backed by two separate Lambda functions, but I chose the monolithic approach as it was simpler in this case.
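For comparison, the split design mentioned above could look roughly like this: one read-only handler behind GET and one write handler behind POST. This is only a sketch, not the deployed code; the table object is passed in as a parameter (an assumption made here so the logic can be exercised without AWS), whereas a real Lambda would create it from boto3 at module level.

```python
# Hypothetical split design: separate read and write handlers,
# each of which would be wired to its own Lambda and API method.

def get_count(table):
    """GET: return the current count without modifying it."""
    item = table.get_item(Key={'ID': '1'})['Item']
    return {'statusCode': 200, 'body': item['counter']}

def increment_count(table):
    """POST: increment the count and return the new value."""
    item = table.get_item(Key={'ID': '1'})['Item']
    new_count = str(int(item['counter']) + 1)
    table.put_item(Item={'ID': '1', 'counter': new_count})
    return {'statusCode': 200, 'body': new_count}
```

The monolithic version in the next step simply does both in one handler, which keeps the API surface and the deployment simpler.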

10. Python

The Lambda script that retrieves the counter from the database and updates the number should be written in Python. I wrote it in Node.js at first but later rewrote it in Python. The final script is below:

import boto3

dynamodb = boto3.resource('dynamodb')
table = dynamodb.Table('gabor-havasi-cv')

def lambda_handler(event, context):
    # Read the current count (the key name matches the table's
    # partition key, 'ID', defined in the CloudFormation template)
    response = table.get_item(Key={'ID': '1'})
    record_count = str(int(response['Item']['counter']) + 1)

    # Write the incremented count back
    table.put_item(Item={'ID': '1', 'counter': record_count})

    return {
        'statusCode': 200,
        'headers': {
            'Access-Control-Allow-Headers': 'Content-Type',
            'Access-Control-Allow-Origin': '*',
            'Access-Control-Allow-Methods': 'OPTIONS,POST,GET'
        },
        'body': record_count
    }
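A side note: the read-then-write pattern above has a small race window if two visitors load the page at the same moment. DynamoDB also supports atomic counters via update_item, which performs the increment server-side in a single call. A hedged sketch, not the deployed code, assuming the counter is stored as a DynamoDB number rather than a string, and with the table passed in as a parameter for illustration:

```python
def increment_counter_atomic(table):
    """Increment the counter atomically with a single update_item call.

    'counter' is a DynamoDB reserved word, so it is aliased via
    ExpressionAttributeNames.
    """
    response = table.update_item(
        Key={'ID': '1'},
        UpdateExpression='ADD #c :inc',
        ExpressionAttributeNames={'#c': 'counter'},
        ExpressionAttributeValues={':inc': 1},
        ReturnValues='UPDATED_NEW',
    )
    return str(response['Attributes']['counter'])
```

For a personal CV site the traffic is low enough that the simpler get/put version is fine in practice.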

And to deploy it via CloudFormation:

Lambdafunction:
    Type: AWS::Serverless::Function
    Properties:
      Policies:
        - DynamoDBCrudPolicy:
            TableName: gabor-havasi-cv
      CodeUri: get-count/
      Handler: app.lambda_handler
      Runtime: python3.9
      Architectures:
        - x86_64
      Events:
        getCount:
          Type: Api
          Properties:
            Path: /
            Method: get

11. Tests

I added a test, written in Node.js, to test the API. This test is wired into GitHub Actions in step 14. The test calls the Lambda twice, then checks that the return value from the second call is one more than the return value from the first.

import { expect } from "@jest/globals";
import fetch from "node-fetch";

async function testCounter() {
  let response = await fetch(
    "https://ig4fgtqyx2.execute-api.eu-west-1.amazonaws.com/Prod"
  );
  let data = await response.json();
  return data;
}

describe("Testing the API", () => {
  it("API test", async () => {
    const firstResponse = await testCounter();
    console.log(firstResponse);
    const secondResponse = await testCounter();
    console.log(secondResponse);

    const expected = 1;
    expect(secondResponse - firstResponse).toEqual(expected);
  });
});

12. Infrastructure as Code

I have used the AWS Serverless Application Model (SAM) template and deployed the different services using the AWS SAM CLI. The code snippets can be found under each step where applicable.

13. Source Control

I have used Git as source control and hosted the code on GitHub from the beginning of this project.

14. CI/CD (Back end)

In this step, we need to set up GitHub Actions such that when we push an update to our Serverless Application Model template or Lambda code, the tests get run. If the tests pass, the SAM application should get packaged and deployed to AWS.
IMPORTANT: do not commit AWS credentials to source control! The access key and the secret key need to be stored as GitHub Actions secrets.
For this, I used the following workflow template snippets:

To run the test:

  test:
    runs-on: ubuntu-latest

    strategy:
      matrix:
        node-version: [16.x]
        # See supported Node.js release schedule at https://nodejs.org/en/about/releases/

    steps:
      - uses: actions/checkout@v3
      - name: Use Node.js ${{ matrix.node-version }}
        uses: actions/setup-node@v3
        with:
          node-version: ${{ matrix.node-version }}
          cache: "npm"
      - run: npm install node-fetch
      - run: npm run build --if-present
      - run: npm run test

To deploy to AWS if the test passes:

  deploy-to-AWS:
    needs: test
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - uses: actions/setup-python@v3
      - uses: aws-actions/setup-sam@v2
      - uses: aws-actions/configure-aws-credentials@v1
        with:
          aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
          aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          aws-region: eu-west-1
      - run: sam build
      - run: sam deploy --no-confirm-changeset --no-fail-on-empty-changeset

15. CI/CD (Front end)

The goal here is a GitHub Actions workflow such that when you push new website code, the S3 bucket automatically gets updated. The CloudFront cache didn't cause any issues here, as I set the TTL to 0 in step 5.
I used jakejarvis/s3-sync-action to sync to the S3 bucket.
The GitHub Actions template snippet to achieve this:

deploy-site:
    runs-on: ubuntu-latest
    steps:
    - uses: actions/checkout@v2
    - uses: jakejarvis/s3-sync-action@master
      with:
        args: --delete
      env:
        AWS_S3_BUCKET: cloud-cv-website
        AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
        AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
        SOURCE_DIR: CV

16. Blog post

The final step is to write a blog post about the experience and what I learned during the project. You are reading that post right now.

16 + 1. What is next?

I am planning to redo this challenge using Terraform.
