
Joseph Hazen

Hosting My Resume in the Cloud with AWS Serverless: A Network Engineer's Perspective


I started learning about cloud and using AWS intermittently back in 2019 while completing some coursework for university. I never did a deep dive into cloud at the time, but after recently passing the AWS SAA-C03, I have been thinking of switching over from networking.

I have been working as a network engineer for the past four years (primarily with Cisco), and before that I did physical cabling work (fiber optic, copper, waveguide, etc.). I found the transition from cabling to networking very smooth, as it was easy for me to visualize the physical topology of the networks I was designing. When I decided to take the plunge into cloud last year, however, I had my doubts about how far my knowledge of traditional networking would get me while studying for the AWS Solutions Architect Associate.

While studying for the exam, I found that not only is there a rich and complex suite of features in AWS VPC, but there are also many options for connecting to the cloud from an external network using hybrid deployments. I now think that people with traditional networking skills are in a unique position when transitioning to cloud services, as general networking is one of the things that newcomers with no experience struggle with the most.

After passing the AWS SAA, I was ready to dive into some projects to get more hands-on experience beyond labs, and I thought, what better way than to create a personal website hosting my resume, built entirely with AWS services?

On to the Project...

For the concept, I followed along with Forrest Brazeal's Cloud Resume Challenge book, which breaks the project into the chunks listed below. I set up the development pipeline first, however, which is why they are out of order.

Chunk 4: CI/CD

I chose to set up Terraform and GitHub Actions first, as I already had some experience with Terraform and wanted to frontload some of the learning process by implementing infrastructure-as-code from the start. My goal at this point was to securely sync my GitHub repository, containing all of the code I would write for the website, to the S3 bucket that CloudFront would pull its cache from, and to define all of my infrastructure in Terraform. I first set up SSO using IAM Identity Center and created an SSO profile with the AWS CLI. The most challenging part, however, was learning how to properly set permissions on the S3 bucket my GitHub Actions workflow pushes code to, and making the bucket policy as restrictive as it could possibly be without breaking the pipeline.
CI/CD Pipeline
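As a rough sketch, a deploy workflow along these lines keeps the bucket in sync with the repository. This is illustrative only: the bucket name, region, and role ARN are placeholders, and it assumes the OIDC role-assumption pattern rather than stored access keys; my actual workflow differs in the details.

```yaml
# Hypothetical sketch of a deploy workflow; names and ARNs are placeholders.
name: deploy-site
on:
  push:
    branches: [main]
permissions:
  id-token: write   # lets the job assume an AWS role via OIDC
  contents: read
jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: aws-actions/configure-aws-credentials@v4
        with:
          role-to-assume: arn:aws:iam::123456789012:role/site-deploy
          aws-region: us-east-1
      # Mirror the site directory into the bucket CloudFront serves from.
      - run: aws s3 sync ./site s3://my-resume-bucket --delete
```

Scoping the deploy role (and the bucket policy) to exactly this sync operation is what keeps the pipeline's permissions as tight as possible.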

Chunk 1: Front End (HTML, CSS, CloudFront)

This chunk is without a doubt the easiest, so I won't spend too much time on it. I am not a designer, so I kept the page simple and minimal, but there is a plethora of free HTML templates out there on the web to use. I also created a CloudFront distribution here, using the SSL certificate I had generated earlier along with the Route 53 hosted zone for my domain.
My Resume

Chunk 2: Back End (Lambda, DynamoDB, API Gateway)

Now for the difficult part. This project would not be much of a showcase of AWS services without some kind of back end integration, so we must implement a visitor counter to display on the resume. We want it to be serverless as well, which is why we are using Lambda, DynamoDB, and API Gateway. I created a very simple table in DynamoDB, but struggled when it came time to update and read data from it, so I had to spend a lot of time reading the documentation on hash/range keys as well as attributes and data types. I also had to read up on the boto3 library for Python, which integrates AWS services, specifically the syntax for reading and updating data in my DynamoDB table. Once I got my code working in the Lambda test console, I created my API Gateway with the invoke URL to be used in the next chunk.
Returned Value from Lambda
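As a sketch, a counter handler along these lines uses `update_item` with an `ADD` expression to increment atomically. The table name `visitor-count`, key `id`, and attribute `visits` are illustrative placeholders, not necessarily my actual schema.

```python
import json

def build_response(count):
    # Shape that a Lambda proxy integration expects back from the function.
    return {
        "statusCode": 200,
        "body": json.dumps({"count": count}),
    }

def handler(event, context):
    # boto3 ships with the Lambda Python runtime, so it is imported here
    # rather than at module level for the sake of this sketch.
    import boto3

    table = boto3.resource("dynamodb").Table("visitor-count")
    # ADD increments atomically, creating the attribute on the first call.
    result = table.update_item(
        Key={"id": "resume"},
        UpdateExpression="ADD visits :inc",
        ExpressionAttributeValues={":inc": 1},
        ReturnValues="UPDATED_NEW",
    )
    return build_response(int(result["Attributes"]["visits"]))
```

`ReturnValues="UPDATED_NEW"` hands back the freshly incremented value so the function can return it without a second read.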

Chunk 3: Integration (JavaScript, CORS)

After spending so much time trying to get the application to work, I felt at this point like I had completed the hardest section of the project, and that all I had to do was write a couple of lines of JavaScript to call the API, proudly display the counter on my resume, and I would be finished. How wrong I was. After creating my .js file in the directory and adding the script to my HTML file, I looked in the console and saw the dreaded CORS error. I probably spent most of my day trying to get this to work: reading up on CORS, watching YouTube videos, messing around with the API Gateway CORS configuration, all to no avail.

It was at this point that I took a break and didn't come back to the project until the next day, after some reflection. I tried to visualize every step of my application much like a network topology that isn't communicating. I had looked in CloudWatch Logs and seen that the table was incrementing and that Lambda was returning the value with a 200 status code. My API Gateway was configured properly for CORS, so why was the header not being applied? Then I remembered that I had configured my API Gateway as a proxy for Lambda, which is when things started to click.

I read up on AWS SAM in the documentation and started to think of it in terms of how LAN networks function. Routing works by encapsulation: the router wraps packets in headers to get them to the right destination. In the AWS serverless application model, the compute function takes on a somewhat similar role. While it makes no path decisions, the Lambda function must add the CORS headers itself, because, much like a layer 2 device, a proxy-integrated API Gateway simply passes along whatever the application puts in the response. Once I had the Lambda function apply the CORS headers, everything functioned perfectly.
Functioning Counter in Browser
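The fix itself is small: with a proxy integration, whatever dict the function returns is what the browser ultimately sees, so the CORS headers belong in the function's response. A sketch of the corrected response builder (the permissive `*` origin is just for illustration; locking it to the site's domain is tighter):

```python
import json

def build_response(count):
    # With a Lambda proxy integration, API Gateway passes this dict
    # through verbatim; it does not add CORS headers for you, so the
    # function must include them in its own response.
    return {
        "statusCode": 200,
        "headers": {
            # "*" is illustrative; restricting this to the site's
            # domain is a tighter policy.
            "Access-Control-Allow-Origin": "*",
            "Access-Control-Allow-Methods": "GET,OPTIONS",
        },
        "body": json.dumps({"count": count}),
    }
```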

Chunk 5: Blog

Now that the project is finished, and we have reached the end of the blog post as well, here are a few reflections and things I've learned along the way:

  • Learning Terraform made things a lot easier in the long run, and I am glad I did it first.
  • I learned a lot more about programming in this project than I ever did just reading documentation or doing labs. Having a set end goal forces you to learn exactly what you need, and beats vague goals like "learn Python" or "learn JavaScript".

Finally, I would like to say that this project was really fun, and I am excited for my next one.
