GitHub is a great way to collaborate on code. However, I use GitLab CI/CD to deploy applications to AWS. GitLab conveniently supports importing external Git repositories. This post demonstrates how to deploy a static web application from a GitHub repository to an AWS S3 bucket using GitLab CI/CD.
Setup Project
- To connect to an external repository, select Menu > Projects > Create new project.
- Select Run CI/CD for external repository, then select GitHub or Repo by URL.
- Add a personal access token to authenticate with GitHub.
- Add a project name, a project URL or slug, and a description, then click Create project.
Configure AWS variables
We need to set variables for the AWS deployment. They can be set in .gitlab-ci.yml or in the GitLab console under Settings > CI/CD > Variables. To deploy to an S3 bucket, add the following variables. The values come from the S3 console when the bucket is configured as a static website.
- AWS_ACCESS_KEY_ID
- AWS_SECRET_ACCESS_KEY
- AWS_REGION
- S3_BUCKET_NAME_PROD
- S3_BUCKET_NAME_DEV
- CDN_DISTRIBUTION_ID
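If you prefer to keep non-secret settings in version control, the same variables can also be declared in .gitlab-ci.yml itself. A minimal sketch (the region and bucket names below are placeholders; secrets such as the access keys should stay in the GitLab console as masked variables):

```yaml
# Sketch: non-secret values can live in .gitlab-ci.yml.
# Keep AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY in
# Settings > CI/CD > Variables (masked) instead.
variables:
  AWS_REGION: us-east-1              # placeholder region
  S3_BUCKET_NAME_PROD: my-site-prod  # placeholder bucket name
  S3_BUCKET_NAME_DEV: my-site-dev    # placeholder bucket name
```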
Configure Gitlab CI/CD
To configure GitLab, add a file called .gitlab-ci.yml in the root of the repository. The .gitlab-ci.yml file defines:
- scripts to build the application,
- configuration files and templates,
- dependencies and caches,
- commands to run in sequence or in parallel,
- where to deploy the application,
- whether scripts run automatically or are triggered manually.
The example below has three stages: a build stage, a review stage, and a deploy stage.
Whether you are deploying a React app or using a static site generator such as Hugo, the build stage creates the artifacts to be deployed. This example is based on an app built with Next.js, and the script uses yarn to build and export it.
The next stage is the review stage, which creates an ephemeral deployment used to evaluate the app before it goes to production. You can use S3 bucket policies to control access during the review stage.
The deploy stage deploys the application to a publicly available S3 bucket configured as a static website.
stages:
  - build
  - review
  - deploy

variables:
  # Common
  AWS_ACCESS_KEY_ID: $AWS_ACCESS_KEY_ID
  AWS_SECRET_ACCESS_KEY: $AWS_SECRET_ACCESS_KEY
  AWS_REGION: $AWS_REGION
  S3_BUCKET_NAME_PROD: $S3_BUCKET_NAME_PROD
  S3_BUCKET_NAME_DEV: $S3_BUCKET_NAME_DEV
  CDN_DISTRIBUTION_ID: $CDN_DISTRIBUTION_ID

cache:
  key: $CI_COMMIT_REF_SLUG
  paths:
    - node_modules/

Build:
  stage: build
  image: node:11
  script:
    - yarn install
    - yarn build
    - yarn export
  artifacts:
    paths:
      - build/
    expire_in: 1 day

Review:
  stage: review
  when: manual
  before_script:
    - apk add --no-cache curl jq python py-pip
    - pip install awscli
  script:
    - aws s3 cp build/ s3://$S3_BUCKET_NAME_DEV/ --recursive --include "*"
    - aws cloudfront create-invalidation --distribution-id $CDN_DISTRIBUTION_ID --paths "/*"

Deploy:
  stage: deploy
  when: manual
  before_script:
    - apk add --no-cache curl jq python py-pip
    - pip install awscli
  script:
    - aws s3 cp build/ s3://$S3_BUCKET_NAME_PROD/ --recursive --include "*"
    - aws cloudfront create-invalidation --distribution-id $CDN_DISTRIBUTION_ID --paths "/*"
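The review-stage access control mentioned earlier can be handled with an S3 bucket policy. A minimal sketch, assuming a review bucket named my-site-dev and reads allowed only from one IP range (both the bucket name and the CIDR block are placeholders):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowReviewFromTrustedIPs",
      "Effect": "Allow",
      "Principal": "*",
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::my-site-dev/*",
      "Condition": {
        "IpAddress": { "aws:SourceIp": "203.0.113.0/24" }
      }
    }
  ]
}
```

Saved as policy.json, it can be applied with aws s3api put-bucket-policy --bucket my-site-dev --policy file://policy.json.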
Summary
Even though your project uses GitHub for managing code contributions, you can use GitLab or another CI/CD solution to manage builds and deployments.