
Kyle Galbraith

Posted on • Originally published at blog.kylegalbraith.com

Deploying Your Static Websites to AWS in Style Using GitHub Actions

GitHub Actions is gaining popularity for its simplicity and because a ton of repositories already live on GitHub. With the general availability of Actions, it's now easy to incorporate your CI/CD practices into your repository. Before, we had to reach for third-party tools like Travis CI, CircleCI, or other CI/CD providers. With GitHub Actions, we can keep all our CI/CD processes next to our code with one simple YAML file.

In this post, we are going to quickly explore how we can continuously deploy our static websites to AWS S3 using GitHub Actions. This post assumes that you are hosting a static website out of AWS S3 and that you have a Git repository for that site living at GitHub. If you don't currently have a static website hosted on AWS S3, consider checking out this post to get you started or check out my Learn AWS By Using It course for a deeper dive.
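If you're starting from scratch on the bucket side, it can also be set up from the AWS CLI. This is only a sketch of the hosting prerequisite, not part of the workflow itself; the bucket name is a placeholder, and you would additionally need a bucket policy allowing public reads for the site to actually be served:

```shell
# Create the bucket and enable static website hosting (bucket name is a placeholder).
aws s3 mb s3://your-website-bucket
aws s3 website s3://your-website-bucket \
  --index-document index.html \
  --error-document error.html
```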

If you have those two prerequisites taken care of, let's dive into setting up a CI/CD workflow for our GitHub repository.

AWS Setup Steps

Before we can create our workflow we need to set up a user in our AWS account that will be allowed to deploy our static website to our S3 bucket. To do that, we first need to login to our AWS account and navigate to the IAM console.

IAM Console

Once there, we need to create a new user that will have programmatic access to our AWS account. We want to restrict this user to S3 access only, so we will select the AmazonS3FullAccess permission policy. See the GIF below for a step-by-step guide.

Create user
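AmazonS3FullAccess is the quickest managed policy to attach, but note that it grants this user access to every bucket in the account. If you'd rather scope the user down to just the website bucket, an inline policy along these lines would cover what the deploy needs (the bucket name is a placeholder):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:ListBucket"],
      "Resource": "arn:aws:s3:::your-website-bucket"
    },
    {
      "Effect": "Allow",
      "Action": ["s3:PutObject", "s3:GetObject", "s3:DeleteObject"],
      "Resource": "arn:aws:s3:::your-website-bucket/*"
    }
  ]
}
```

ListBucket on the bucket plus Put/Get/DeleteObject on its objects is enough for the `aws s3 sync --delete` call we'll run later.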

On the last page after creating our new user, we see the programmatic access keys for that user: the access_key and secret_access_key. Copy those somewhere safe, as we are going to add them to GitHub next.
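If you prefer the terminal to the console, the same user can be created with the AWS CLI. The user name here is just an example:

```shell
# Create the deploy user, attach the managed S3 policy, and generate its keys.
aws iam create-user --user-name static-site-deployer
aws iam attach-user-policy \
  --user-name static-site-deployer \
  --policy-arn arn:aws:iam::aws:policy/AmazonS3FullAccess
aws iam create-access-key --user-name static-site-deployer
```

The `create-access-key` call prints the AccessKeyId and SecretAccessKey once; this is the only time the secret is shown, so copy it immediately.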

Setting up our GitHub Action Secrets

To deploy to our AWS S3 bucket from our GitHub Action we first need to configure two new secrets in our repository. These secrets are for our AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY.

Navigate to the Settings section of your GitHub repository and locate the Secrets section on the left-hand side. Once there we are going to add a new secret for AWS_ACCESS_KEY_ID and paste in the access_key we got in our IAM step. Then we are going to add another secret for AWS_SECRET_ACCESS_KEY and paste in our secret_access_key. In the end, we should have two new secrets in our GitHub repository.
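If you have the GitHub CLI installed, the same two secrets can be added without leaving the terminal; `gh secret set` prompts for the value, so it never lands in your shell history:

```shell
gh secret set AWS_ACCESS_KEY_ID
gh secret set AWS_SECRET_ACCESS_KEY
```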

GitHub Secrets

We have our GitHub Secrets configured and our IAM user has access to upload content to our S3 bucket. Now we can configure our Actions to continuously deploy to our bucket on Git pushes.

Setting up Continuous Deployment via GitHub Actions

GitHub Actions uses the concept of a workflow to determine which jobs to run, and which steps to run within those jobs. To set this up, we are first going to create a new directory in our repository that GitHub Actions watches to know which steps to execute.

From the root of your repository run the following commands:

$ mkdir -p .github/workflows/
$ touch .github/workflows/main.yml

Inside of our main.yml file we are going to add the following:

name: CI
on: [push]
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
    - uses: actions/checkout@v1
    - name: Configure AWS Credentials
      uses: aws-actions/configure-aws-credentials@v1
      with:
        aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
        aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
        aws-region: us-west-2
    - name: Build static site
      run: yarn install && yarn build
    - name: Deploy static site to S3 bucket
      run: aws s3 sync ./dist/ s3://<your-website-bucket> --delete

Here we see four steps defined in our build job. The first checks out the repository. The second is an AWS-provided action that takes the secrets we configured and uses them to set up credentials for the AWS CLI. The third builds the site; depending on how your static website gets built, adjust this command accordingly. Our final step runs aws s3 sync via the AWS CLI to sync our dist folder to our S3 bucket. We use the --delete flag in the CLI call to delete any files that are in the S3 bucket but not in our dist folder.
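To see what the --delete flag decides without touching AWS, here is a small local sketch: two directories stand in for our dist folder and the bucket, and comm lists the keys that sync --delete would remove (the file names are made up for the demo):

```shell
# Model the local build output and the current bucket contents.
mkdir -p dist bucket
touch dist/index.html dist/app.js
touch bucket/index.html bucket/old-page.html

# Lines present only in "bucket" are the keys --delete would remove.
comm -13 <(ls dist | sort) <(ls bucket | sort)
# prints: old-page.html
```

The real sync also compares sizes and timestamps to decide which files to upload; this sketch only models the deletion side.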

Now if we commit this new workflow file we should then be able to see in the Actions section of our GitHub repository that the job runs to completion.

GitHub Actions Output

💥🙏 We now have continuous deployment configured for our static website: the repository lives on GitHub, and every push deploys to our S3 bucket.

Conclusion

There are dozens of different ways to continuously deploy our applications and websites nowadays. We have shown just one of them in this post: GitHub Actions deploying to AWS S3. It's one pairing of tools among many out there like it.

My hope is that in this post you have seen how easy it can be to set up a new GitHub Actions workflow. That makes a simple continuous deployment pipeline like this one a breeze. On top of that, if you already host your static site on AWS, it's one small step to push your site to S3 from a GitHub Action.

Want to check out my other projects?

I am a huge fan of the DEV community. If you have any questions or want to chat about different ideas relating to refactoring, reach out on Twitter or drop a comment below.

Outside of blogging, I created a Learn AWS By Using It course. In the course, we focus on learning Amazon Web Services by actually using it to host, secure, and deliver static websites. It's a simple problem, with many solutions, but it's perfect for ramping up your understanding of AWS. I recently added two new bonus chapters to the course that focus on Infrastructure as Code and Continuous Deployment.

I also curate my own weekly newsletter. The Learn By Doing newsletter is packed full of awesome cloud, coding, and DevOps articles each week. Sign up to get it in your inbox.
