A while ago I wrote about how you can host your own single page application on S3. But how do you get your application into the S3 bucket? There are a couple of options here: you could upload it by hand, but we both know that is not a real solution. No, we want to automate this process! In this blog post I will show you how you can automate it using AWS CodePipeline.
The Pipeline
AWS CodePipeline uses different stages; I often use Source, Build and Deploy stages. In some cases I split the Deploy stage into a Development, Testing, Acceptance and Production deployment (also known as DTAP). If you want to know more about how you can set this up, you can read my building applications with pipelines blog. But in the end it is up to you and what makes sense for your use case.
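To make the overall shape concrete, a trimmed sketch of the Stages property of such a pipeline could look like the snippet below. The action details are omitted here and filled in later in this post; the stage and action names, and the use of CodeCommit and CodeBuild as providers, are assumptions for the sake of the example.

Stages:
  - Name: Source
    Actions:
      - Name: Checkout
        ActionTypeId:
          Category: Source
          Owner: AWS
          Provider: CodeCommit   # assumption, any source provider works
          Version: "1"
        # Source configuration omitted
  - Name: Build
    Actions:
      - Name: BuildApplication
        ActionTypeId:
          Category: Build
          Owner: AWS
          Provider: CodeBuild
          Version: "1"
        # Build configuration omitted
  - Name: Deploy
    Actions:
      # The deploy actions covered below (change set execution and S3 upload) go here
      - Name: ExecuteChangeSet
        # ...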
When you deploy your infrastructure using CloudFormation, you can make use of the stack outputs within CodePipeline. Another option is to use a naming convention. I like to use the outputs because it removes the need to define a name upfront, making it more robust when you re-use snippets or deploy your infrastructure more than once.
Outputs:
  ApplicationBucketName:
    Value: !Ref ApplicationBucket
The next thing you need to define is a namespace on the action that deploys your infrastructure.
- Name: ExecuteChangeSet
  Region: eu-west-1
  RunOrder: 2
  RoleArn: !Sub arn:aws:iam::${DevelopmentAccountId}:role/cross-account-role
  Namespace: DevelopmentVariables
  ActionTypeId:
    Category: Deploy
    Owner: AWS
    Provider: CloudFormation
    Version: "1"
  Configuration:
    ActionMode: CHANGE_SET_EXECUTE
    RoleArn: !Sub arn:aws:iam::${DevelopmentAccountId}:role/cloudformation-execution-role
    StackName: !Sub ${ProjectName}-development
    ChangeSetName: !Sub ${ProjectName}-development-ChangeSet
CodePipeline automatically exposes the stack outputs under the given namespace. In this example that is DevelopmentVariables, so the ApplicationBucketName output is available as #{DevelopmentVariables.ApplicationBucketName}.
Deploy to S3
AWS provides an S3 deploy action that you can use to deploy an artifact from your pipeline to S3. You can create this artifact in a CodeBuild project, or you can use the source artifact directly.
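If you build the artifact in a CodeBuild project, the project's output artifact is what the S3 deploy action will receive. As a minimal sketch, a buildspec for an npm-based single page application could look like this (the build commands and the dist output folder are assumptions, not part of the original setup):

version: 0.2

phases:
  install:
    runtime-versions:
      nodejs: 18
  build:
    commands:
      # Install dependencies and create a production build of the application
      - npm ci
      - npm run build

artifacts:
  # The contents of dist/ become the pipeline artifact handed to the S3 deploy action
  base-directory: dist
  files:
    - "**/*"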
I am using a cross-account deployment strategy. For this reason I need to allow my cross-account-role to upload to the S3 bucket. I am using a bucket policy for this:
- Sid: AllowPipelineToUpload
  Effect: Allow
  Action: s3:PutObject
  Principal:
    AWS: !Sub arn:aws:iam::${AWS::AccountId}:role/cross-account-role
  Resource: !Sub ${ApplicationBucket.Arn}/*
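For completeness, this statement sits in the PolicyDocument of an AWS::S3::BucketPolicy resource in the same template as the bucket; a minimal sketch, where the logical resource name is my own choice:

ApplicationBucketPolicy:
  Type: AWS::S3::BucketPolicy
  Properties:
    Bucket: !Ref ApplicationBucket
    PolicyDocument:
      Version: "2012-10-17"
      Statement:
        - Sid: AllowPipelineToUpload
          Effect: Allow
          Action: s3:PutObject
          Principal:
            AWS: !Sub arn:aws:iam::${AWS::AccountId}:role/cross-account-role
          Resource: !Sub ${ApplicationBucket.Arn}/*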
Note that the role and the bucket live in the same account, while the pipeline lives in my build/deployment account. So in the pipeline we need to configure the upload to S3:
- Name: Ireland-Uploadapplication
  Region: eu-west-1
  RunOrder: 3
  RoleArn: !Sub arn:aws:iam::${DevelopmentAccountId}:role/cross-account-role
  InputArtifacts:
    - Name: application
  ActionTypeId:
    Category: Deploy
    Owner: AWS
    Provider: S3
    Version: "1"
  Configuration:
    BucketName: "#{DevelopmentVariables.ApplicationBucketName}"
    Extract: true
    CacheControl: max-age=0, no-cache, no-store, must-revalidate
In this example I use my artifact called application and extract its contents into the S3 bucket. The action assumes the role that we specify as RoleArn to perform the upload. I also set the CacheControl header so that CloudFront knows that it needs to serve the new content.
Conclusion
It is easy to use the S3 deploy action to upload your content to an S3 bucket. It removes the need for a CodeBuild project just to upload the content, which reduces cost and complexity because there is one less CodeBuild project to maintain.