As part of my journey through learning AWS and getting certified, I came across the Cloud Resume Challenge, floated online (LinkedIn/Reddit) by Forrest Brazeal (Cloud Architect and AWS Serverless Hero). Details of this challenge can be found here.
I know I am too late to join this challenge (the last date set by Forrest Brazeal to get a code review was 31-July-2020) due to my preparation for the AWS certifications, but I still wanted to take it up and give it a try. I joined the Discord channel run by Forrest to see how others have completed this challenge and to take some help from that wonderful community. I would like to thank Chris Nagy, as I referred to his blog and GitHub repos whenever I needed help.
Being completely new to the AWS world, I achieved the AWS Certified Cloud Practitioner certification in Feb 2019. I then went on to achieve AWS Certified Solutions Architect Associate (March 2020), AWS Certified SysOps Administrator (July 2020), and AWS Certified Developer Associate (Oct 2020). Currently, I am preparing for AWS Certified Solutions Architect Professional and would like to ace it as soon as possible.
Based on the knowledge I acquired while preparing for the AWS certifications, hosting a static website on S3 and using CloudFront for content distribution was easy. I purchased a domain name using AWS Route 53 and configured it to use the CloudFront distribution. I also made use of AWS Certificate Manager (ACM) to procure an SSL certificate for the site.
Backend infrastructure and logic were needed to update and retrieve the visitor count from a database table. This involved AWS resources like API Gateway, Lambda, and DynamoDB.
DynamoDB: DynamoDB is AWS's fast and flexible NoSQL database service for any scale. I created a simple DynamoDB table with one item to store and update the visitor count. The atomic counter feature of DynamoDB comes in handy here. An atomic counter is a numeric attribute that is incremented unconditionally, without interfering with other write requests. With an atomic counter, the updates are not idempotent; in other words, the numeric value increments each time you call the UpdateItem operation. This operation is implemented in the Lambda function (more details below).
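To illustrate, the atomic increment boils down to a single update_item call with an ADD expression. This is a minimal sketch, not my exact code; the table name ("VisitorCount"), the key ("id": "site"), and the attribute name ("visits") are assumptions I made for the example:

```python
def increment_visitor_count(table=None):
    """Atomically add 1 to the 'visits' attribute and return the new value.

    The `table` parameter allows injecting a stub for unit testing;
    in production it defaults to the real DynamoDB table.
    """
    if table is None:
        import boto3  # deferred import: only needed when talking to AWS
        table = boto3.resource("dynamodb").Table("VisitorCount")  # assumed name
    response = table.update_item(
        Key={"id": "site"},                      # the table's single item
        UpdateExpression="ADD visits :inc",      # ADD = unconditional atomic add
        ExpressionAttributeValues={":inc": 1},
        ReturnValues="UPDATED_NEW",              # return the incremented value
    )
    # DynamoDB returns numbers as Decimal, so convert before use
    return int(response["Attributes"]["visits"])
```

Because ADD is applied server-side, concurrent callers never overwrite each other's increments; each call simply bumps the stored number by one.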
Lambda: AWS Lambda is a compute service that lets you run code without provisioning or managing servers. I created a Python-based Lambda function that queries the DynamoDB table and updates the visitor count item. As mentioned earlier, I utilised the update_item operation on the DynamoDB table to increment the numeric value. Since I don't have Python scripting experience, I referred to the Lambda function code in the GitHub repos of Chris Nagy and Bansi Mendapara.
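A handler along these lines (a sketch under my own naming assumptions, not the exact code from those repos) increments the counter and returns it in an API Gateway proxy response, with a CORS header so the resume page's JavaScript can read it:

```python
import json


def lambda_handler(event, context, table=None):
    # The extra `table` parameter exists only so a stub can be injected in
    # unit tests; API Gateway invokes the handler with (event, context).
    if table is None:
        import boto3
        table = boto3.resource("dynamodb").Table("VisitorCount")  # assumed name
    response = table.update_item(
        Key={"id": "site"},                      # assumed single-item key
        UpdateExpression="ADD visits :inc",      # atomic counter increment
        ExpressionAttributeValues={":inc": 1},
        ReturnValues="UPDATED_NEW",
    )
    count = int(response["Attributes"]["visits"])  # Decimal -> int
    return {
        "statusCode": 200,
        # CORS header so the browser allows the cross-origin fetch
        "headers": {"Access-Control-Allow-Origin": "*"},
        "body": json.dumps({"count": count}),
    }
```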
I initially created the back-end components (DynamoDB table, Lambda function, and API Gateway) separately using the AWS Console and configured them to work together to update and provide the visitor count for the front-end HTML. But the requirement was to define these back-end resources as Infrastructure as Code (IaC) in an AWS SAM (Serverless Application Model) template and deploy them using SAM commands. I did have a basic understanding of SAM from preparing for the AWS Certified Developer certification, and again, this blog post from Chris Nagy helped me better understand the use of SAM for building and deploying serverless applications. As a next improvement, I want to create and deploy even the front-end resources (S3 bucket, CloudFront distribution, Route 53 record configuration) as a SAM template (IaC).
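A minimal SAM template for a back end like this could look something like the following sketch. The resource names, table name, handler path, and runtime version here are illustrative assumptions, not my actual template:

```yaml
AWSTemplateFormatVersion: '2010-09-09'
Transform: AWS::Serverless-2016-10-31

Resources:
  VisitorCountTable:
    Type: AWS::Serverless::SimpleTable   # creates a DynamoDB table
    Properties:
      PrimaryKey:
        Name: id
        Type: String

  VisitorCountFunction:
    Type: AWS::Serverless::Function
    Properties:
      Handler: app.lambda_handler        # assumed module/function name
      Runtime: python3.9
      Policies:
        - DynamoDBCrudPolicy:            # SAM policy template for table access
            TableName: !Ref VisitorCountTable
      Events:
        GetCount:
          Type: Api                      # implicitly creates an API Gateway API
          Properties:
            Path: /count
            Method: get
```

A nice property of SAM is how much it abstracts away: the Api event source alone generates the API Gateway REST API, resource, method, and Lambda permission that I had originally wired up by hand in the console.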
Here is the designer view of the CloudFormation template built and deployed by SAM:
Another requirement was to store the front-end and back-end code in GitHub repositories and make use of GitHub Actions to achieve continuous integration and deployment (CI/CD). I had never used GitHub Actions before, so it was new learning for me. GitHub Actions simplifies and automates many steps in creating, updating, and deploying the resources. For the front-end CI/CD, I used GitHub Actions to configure AWS credentials, deploy the changes to the S3 bucket (which stores the HTML/CSS/JS/image content), and then invalidate the CloudFront distribution. GitHub secrets were used to securely store environment variables like the AWS access keys.
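The front-end workflow could be sketched roughly as below. This is an assumed layout, not my actual file; the secret names, site directory, and region are placeholders:

```yaml
# .github/workflows/front-end.yml (illustrative sketch)
name: Deploy front end
on:
  push:
    branches: [main]

jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - uses: aws-actions/configure-aws-credentials@v1
        with:
          aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
          aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          aws-region: us-east-1
      - name: Sync site content to S3
        run: aws s3 sync ./site s3://${{ secrets.S3_BUCKET }} --delete
      - name: Invalidate CloudFront cache
        run: >
          aws cloudfront create-invalidation
          --distribution-id ${{ secrets.CF_DISTRIBUTION_ID }}
          --paths "/*"
```

The invalidation step matters: without it, CloudFront keeps serving cached copies of the old pages until the TTL expires.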
A separate GitHub repo was created to store the back-end code, which includes the Lambda function, the SAM template, and the Python tests. A corresponding GitHub Actions workflow configures AWS credentials, runs the Python tests, and then runs the SAM build and deploy commands. Again, GitHub secrets were used to securely store environment variables like the AWS access keys.
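The back-end pipeline follows the same pattern, with a test gate before deployment. Again a sketch under assumed file names and secret names:

```yaml
# .github/workflows/back-end.yml (illustrative sketch)
name: Deploy back end
on:
  push:
    branches: [main]

jobs:
  test-build-deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - uses: actions/setup-python@v2
        with:
          python-version: '3.9'
      - name: Run Python tests
        run: |
          pip install pytest boto3
          python -m pytest tests/
      - uses: aws-actions/configure-aws-credentials@v1
        with:
          aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
          aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          aws-region: us-east-1
      - name: SAM build and deploy
        run: |
          sam build
          sam deploy --no-confirm-changeset --no-fail-on-empty-changeset
```

Running the tests as an earlier step means a failing test stops the job before sam deploy ever runs, which is the CI/CD safety net the challenge is after.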
Here is the final (not really!!) version of my one-page resume: https://www.chandraym.com/