Summary
This blog describes the steps required to add a continuous integration and continuous delivery (CI/CD) pipeline to an existing bucket in Amazon Simple Storage Service (Amazon S3) on the Amazon Web Services (AWS) Cloud.
This blog uses GitHub as the source provider. The pipeline starts whenever new commits are pushed, and the changes are then reflected in the S3 bucket.
It covers everything from creating an AWS account to automatically deploying a static website from GitHub to S3 using AWS CodePipeline.
About the Tools
AWS CodePipeline – A continuous delivery service you can use to model, visualize, and automate the steps required to release your software. You can quickly model and configure the different stages of a software release process, and CodePipeline automates those steps so your software changes are released continuously.
Amazon S3 – A highly scalable object storage service. It can be used for a wide range of storage solutions, including websites, mobile applications, backups, and data lakes.
Prerequisites and limitations
Prerequisites
- An active AWS account
- Knowledge of Amazon S3 and AWS CodePipeline
- A static website
- A GitHub repository
Limitations
- This process is recommended for displaying read-only content. It isn't recommended for collecting or transferring sensitive information, because S3 website endpoints support only HTTP, not HTTPS.
- Websites built using PHP, JSP, or ASP.NET are not supported, because Amazon S3 doesn't support server-side scripts.
Architecture: GitHub repository (source) → AWS CodePipeline → Amazon S3 bucket (static website hosting)
Step 1: AWS Account Creation
Create an AWS account by signing up at aws.amazon.com
Step 2: Account Sign Up
After signing up successfully, log in to the AWS Management Console.
Step 3: S3 Bucket Creation
Go to the Amazon S3 console and create a new bucket for hosting your content.
Enter a unique, DNS-compliant name. It must be globally unique because the bucket namespace is shared by all AWS accounts.
Choose either the default AWS Region or select a specific Region where your bucket will be based.
Set permissions by allowing public access to your bucket: choose Permissions, and then choose Edit. Turn off Block all public access; by default, this check box is selected for security purposes. Choose Save, review the information, and then choose Create bucket. This closes the dialog and creates the bucket.
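If you prefer to script this step, here is a minimal boto3 sketch. The bucket name and Region are placeholders; substitute your own values.

```python
import boto3

bucket = "example.com"  # placeholder: your globally unique bucket name
region = "us-east-1"    # placeholder: your chosen Region

s3 = boto3.client("s3", region_name=region)

# us-east-1 is the default Region and must not be passed as a LocationConstraint
if region == "us-east-1":
    s3.create_bucket(Bucket=bucket)
else:
    s3.create_bucket(
        Bucket=bucket,
        CreateBucketConfiguration={"LocationConstraint": region},
    )

# Turn off "Block all public access" so a public bucket policy can take effect
s3.put_public_access_block(
    Bucket=bucket,
    PublicAccessBlockConfiguration={
        "BlockPublicAcls": False,
        "IgnorePublicAcls": False,
        "BlockPublicPolicy": False,
        "RestrictPublicBuckets": False,
    },
)
```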
Step 4: Configuration of S3 Bucket for Static Website Hosting
The next step is configuring the bucket for static website hosting.
In the S3 console, choose the bucket you just created.
On the Properties tab, under Static website hosting, choose Edit.
Select Enable for Static website hosting.
Select Host a static website as the Hosting type.
Specify file names and extensions for the home page and error page (for example, index.html and error.html). Make sure that the root folder contains these files and that they serve as landing pages.
Click Save Changes.
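The same configuration can be applied programmatically; a sketch, assuming the bucket from the previous step:

```python
import boto3

s3 = boto3.client("s3")

# Enable static website hosting with the home and error documents
s3.put_bucket_website(
    Bucket="example.com",  # placeholder: your bucket name
    WebsiteConfiguration={
        "IndexDocument": {"Suffix": "index.html"},
        "ErrorDocument": {"Key": "error.html"},
    },
)
```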
Step 5: Adding the Bucket Policy
Create a bucket policy so that other AWS applications can access and perform actions on your bucket. On the Permissions tab, choose Bucket policy.
In the bucket policy editor, paste the policy provided here:
Make sure to replace "arn:aws:s3:::example.com" with your bucket ARN. Keep the /* at the end of the ARN, inside the quotation marks.
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "PublicReadGetObject",
      "Effect": "Allow",
      "Principal": "*",
      "Action": [
        "s3:GetObject"
      ],
      "Resource": [
        "arn:aws:s3:::example.com/*"
      ]
    }
  ]
}
Click Save to apply the policy.
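Equivalently, the policy can be attached with boto3; a sketch in which the bucket name is again a placeholder:

```python
import json

import boto3

bucket = "example.com"  # placeholder: your bucket name

policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Sid": "PublicReadGetObject",
        "Effect": "Allow",
        "Principal": "*",
        "Action": ["s3:GetObject"],
        "Resource": [f"arn:aws:s3:::{bucket}/*"],  # note the trailing /*
    }],
}

# Attach the public-read policy to the website bucket
boto3.client("s3").put_bucket_policy(Bucket=bucket, Policy=json.dumps(policy))
```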
Step 6: Creating a GitHub Repository
Create a repository and push your initial files (for example, your HTML files) to it.
Remember which branch you are using; for simplicity, I will use the main branch.
Step 7: Creating the Pipeline
Sign in to the AWS Management Console and open the AWS CodePipeline console. Choose Create pipeline.
Give the pipeline a useful name, such as website-deploy-s3
Select New service role; the console suggests a name such as AWSCodePipelineServiceRole-us-east-1-website-deploy-s3
Artifact store: Choose the Default location option
Click Next to proceed to the next step
Step 8: Adding the Source Stage
Select GitHub (Version 2) as the source provider.
Click Connect to GitHub to proceed.
Enter a name for the connection, then click Connect.
Select an existing GitHub app if available, or click Install a new app and connect it to GitHub.
After a successful connection, select the repository name and the branch (I am using main), keep the other options at their defaults, and click Next.
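If you script the connection instead of clicking through, here is a boto3 sketch. The connection name is a placeholder, and the one-time GitHub authorization handshake still has to be completed in the console.

```python
import boto3

client = boto3.client("codestar-connections", region_name="us-east-1")

resp = client.create_connection(
    ProviderType="GitHub",
    ConnectionName="my-github-connection",  # placeholder name
)

# The connection starts in PENDING status; finish the GitHub app
# authorization in the console before the pipeline can use it.
print(resp["ConnectionArn"])
```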
Step 9: Build Phase
Skip the build stage. You can use AWS CodeBuild to compile TypeScript or any other project that needs a build step before deploying; we skip it here because the repository contains only static website content.
Step 10: Deploy Stage
Deploy provider: select Amazon S3.
Bucket: select the bucket configured for the static website.
Extract file before deploy: you must select this option because CodePipeline compresses the source artifact into a ZIP file.
No additional configurations are needed. Hit the Next button.
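For reference, the whole two-stage pipeline can also be defined with boto3. This is a sketch under the assumption that the service role and GitHub connection already exist; the ARNs, repository ID, and bucket names below are placeholders.

```python
import boto3

codepipeline = boto3.client("codepipeline", region_name="us-east-1")

codepipeline.create_pipeline(pipeline={
    "name": "website-deploy-s3",
    # placeholder: the service role created for the pipeline
    "roleArn": "arn:aws:iam::123456789012:role/AWSCodePipelineServiceRole-us-east-1-website-deploy-s3",
    # placeholder: the default artifact bucket chosen by the console
    "artifactStore": {"type": "S3", "location": "codepipeline-us-east-1-artifacts"},
    "stages": [
        {
            "name": "Source",
            "actions": [{
                "name": "Source",
                "actionTypeId": {
                    "category": "Source",
                    "owner": "AWS",
                    "provider": "CodeStarSourceConnection",  # GitHub (Version 2)
                    "version": "1",
                },
                "configuration": {
                    # placeholders: your connection ARN and GitHub repo
                    "ConnectionArn": "arn:aws:codestar-connections:us-east-1:123456789012:connection/abc123",
                    "FullRepositoryId": "your-user/your-repo",
                    "BranchName": "main",
                },
                "outputArtifacts": [{"name": "SourceOutput"}],
            }],
        },
        {
            "name": "Deploy",
            "actions": [{
                "name": "Deploy",
                "actionTypeId": {
                    "category": "Deploy",
                    "owner": "AWS",
                    "provider": "S3",
                    "version": "1",
                },
                "configuration": {
                    "BucketName": "example.com",  # placeholder: your website bucket
                    "Extract": "true",            # unzip the artifact before deploying
                },
                "inputArtifacts": [{"name": "SourceOutput"}],
            }],
        },
    ],
})
```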
At the Review step, you can go back and change the configuration if you made any mistakes. Hit the Create pipeline button.
If your pipeline was created successfully, you will see green check marks on both the Source and Deploy stages.
Time to test if your pipeline works.
Push a file to GitHub; CodePipeline should pick up the change and deploy the additions or updates to the S3 bucket automatically.
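You can watch the run in the console, or poll it with boto3 (the pipeline name matches the one used above):

```python
import boto3

# Print the latest execution status of each stage
state = boto3.client("codepipeline").get_pipeline_state(name="website-deploy-s3")
for stage in state["stageStates"]:
    print(stage["stageName"], stage.get("latestExecution", {}).get("status"))
```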
This pipeline costs only $1 per month, and you are charged only in months when a deployment actually runs.
Cleanup: To avoid ongoing charges, delete the pipeline you just created.
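The same cleanup via boto3:

```python
import boto3

# Deleting the pipeline stops the monthly charge; the S3 bucket and the
# GitHub connection are separate resources and can be removed as well.
boto3.client("codepipeline").delete_pipeline(name="website-deploy-s3")
```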