Charbel El Kahwaji

Deployment Automation Of A Static Website ⚙️

Static websites used to be heavily disregarded (and often still are), especially once Web 2.0 took over the tech world. When we talk about static websites, we usually mean plain HTML, CSS, and JS at the core.
Fortunately, with the rise of serverless concepts, a static website is more than enough to run a business securely and efficiently. Read more about it on my other blog.

In this blog, I will be talking about how to automate the deployment of such websites and which AWS services helped me achieve it.

To boost development productivity and keep the architecture simple, we built our website at Zero&One with a library that compiles reusable chunks of code down to plain HTML, CSS, and JS. Sergey is a neat static site generator that let me extract redundant markup into a single component file and include that file from any other page.

So why didn't we use React.js or Next.js? As I said, I want to keep the architecture very simple and don't need any extra functionality for a static website, which rules out Next.js. React is out from the start because I want solid SEO for this project, and a client-side rendered SPA makes that harder.

[Screenshot: the project structure in VS Code, with the _imports folder]

In this image, I have an _imports folder where I write my components; I can then pull them into any page with the <sergey-import /> tag, like so.

[Screenshot: a page in VS Code using the <sergey-import /> tag]
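
To make this concrete, here is a minimal sketch of what a component and a page using it might look like; the file names and markup are illustrative, not taken from the actual project:

<!-- _imports/header.html: a reusable chunk written once -->
<header>
  <nav>
    <a href="/">Home</a>
    <a href="/about.html">About</a>
  </nav>
</header>

<!-- index.html: pulls the component in by its file name -->
<body>
  <sergey-import src="header" />
  <main>
    <h1>Welcome</h1>
  </main>
</body>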

I then run a build command that compiles my files into a public folder, where every import is replaced with the actual compiled HTML. You can also spin up a local server to preview your changes with the start command. This public folder holds everything my website needs: images, scripts, CSS, etc. I could take this folder and manually upload it to an S3 bucket configured for static web hosting every time I make a change... OR, we can automate the entire procedure!
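
For reference, the start and build commands simply map to Sergey's CLI in package.json. A minimal sketch, assuming a standard Sergey setup (the version pin is illustrative):

{
  "scripts": {
    "start": "sergey --watch",
    "build": "sergey"
  },
  "devDependencies": {
    "sergey": "^1.0.0"
  }
}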

  1. In your AWS console, go to CodePipeline and create a new pipeline.

[Screenshot: CodePipeline pipeline settings (step 1)]

Name your pipeline, choose a service role if you already have one, or just let CodePipeline create one for you. Then choose where you want the pipeline artifacts to be stored.

  2. Click Next and configure your source stage, i.e. the event source that will trigger the pipeline. In my case, I used GitHub and its webhooks. Connect GitHub, and your repositories and branches will show up. Pick the repo and branch you want to watch for push events. Now, each time you commit and push to that branch, the pipeline will run automatically.

[Screenshot: GitHub selected as the source provider]

  3. Click Next. Now comes the fun part, stick with me. In this build stage, we are going to run Sergey so it spits out the public folder. Note that node_modules and the generated public folder should be added to your .gitignore, as in the snippet below.
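
A minimal .gitignore covering both looks like this:

node_modules/
public/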

Our build provider is going to be CodeBuild. Choose the region that suits you, then click on Create project. A new window will open in CodeBuild. Name and describe your project, choose Managed image, and pick Amazon Linux 2.

[Screenshot: CodeBuild environment settings]

Pick whatever options suit you; you can explore the additional configuration available for the container.

In the Buildspec section, choose "Insert build commands" and paste the following code. (You can always go with the buildspec file option instead if you prefer.)

version: 0.2

phases:
  build:
    commands:
      - npm install
      - npm run build
artifacts:
  files:
    - 'public/**/*'

What happens here is that npm install reads the package.json in your project and installs Sergey, then the build command runs and generates the public folder. Once it is generated, CodeBuild creates an artifact by recursively picking up everything inside the public folder. Remember that node_modules and the public folder are not in our repo: only our uncompiled HTML files and package.json are.

  4. Deploy your project. Choose Amazon S3 as your deploy provider and select your region, then pick the bucket that is configured to host a static website. By the time the pipeline reaches this stage, it will have handed over the artifact generated in the build stage for deployment.

Make sure to check "Extract file before deploy" so the artifact gets unzipped; otherwise, the artifact lands in the bucket as a single zipped object and your files won't be served by S3.
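
If your bucket is not set up for static website hosting yet, one quick way to enable it is from the AWS CLI; the bucket name below is a placeholder:

# Enable static website hosting on the target bucket (placeholder bucket name)
aws s3 website s3://your-bucket-name/ \
  --index-document index.html \
  --error-document error.html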

[Screenshot: the S3 deploy stage configuration]

Review your configuration and create your pipeline.

If you've made it this far and followed the steps correctly, this is the architecture you should end up with.

[Diagram: the static website deployment automation architecture]

You can configure CloudFront and Route 53 on top of this if you want a CDN and a custom domain for your project. And that's it! Congratulations on automating the deployment of your static website! :)
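
One caveat if you do put CloudFront in front of the bucket: cached pages will not refresh on their own after a deploy, so you would typically trigger a cache invalidation as an extra step. A sketch with the AWS CLI (the distribution ID is a placeholder):

# Invalidate all cached paths after a deploy (placeholder distribution ID)
aws cloudfront create-invalidation \
  --distribution-id E1234567890ABC \
  --paths "/*"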
