andre aliaman

A Practical Approach to Deploying Next.js with S3

Recently I had a chance to deploy a React app built with Next.js. If you are new to Next.js, you can refer to this link to find out what it is.

With Next.js, you can generate all assets such as CSS, JS, etc. automatically, then upload them to S3 (if you are new to AWS S3, refer to this link) and serve them via CloudFront (if you are new to AWS CloudFront, refer to this link). With S3 + CloudFront, we can serve our assets entirely from AWS resources instead of from our own service resources.
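
For context, here is that flow done by hand. This is a minimal sketch: it assumes your package.json defines the same build and export scripts used in the Dockerfile later in this post, and it reuses the bucket placeholder name-of-yours3bucket.

# build and statically export the Next.js app
npm run build
npm run export
# upload the exported assets to S3, where CloudFront will pick them up
aws s3 sync ./out s3://name-of-yours3bucket/test --acl public-read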

I started researching online and found a snippet on GitHub that helped a lot with this project. Here is the link to the snippet.

So, the things we need to prepare before starting this project are: proper IAM access (if you are new to AWS IAM, refer to this link) for S3 and CloudFront, and a certificate for CloudFront/S3, since we're going to keep our assets inside those services.
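
Before going further, it's worth checking the IAM side from the command line. This is a hedged sketch: the domain name is a placeholder, and remember that a certificate used by CloudFront must live in us-east-1.

# confirm which IAM identity the CLI is using
aws sts get-caller-identity
# confirm that identity can reach the bucket
aws s3 ls s3://name-of-yours3bucket
# request a certificate for CloudFront (must be in us-east-1)
aws acm request-certificate --domain-name assets.example.com --validation-method DNS --region us-east-1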

On my first try, I followed the article blindly and found that every time I deployed, the assets generated by Next.js had a different hash from the assets the running application tried to access. The cause: we generated the assets once in our CI/CD pipeline and then built the application again inside the Dockerfile, so the assets were generated twice in one build process, each time with a different build hash.
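
You can see this for yourself: unless you pin generateBuildId in next.config.js, Next.js writes a fresh build ID on every build, so building twice produces two different sets of hashed assets.

# each build writes a new build ID
npm run build && cat .next/BUILD_ID
npm run build && cat .next/BUILD_ID   # a different value from the first run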

Once I realized the problem, I updated the Dockerfile to accommodate our needs: upload the assets right after we build the application, inside the same stage.

After a bit of playing around, this is what my Dockerfile looks like:

FROM node:10.16.0 as build
# the commit hash is passed in at build time and names the build folder in S3
ARG BITBUCKET_COMMIT
ENV AWS_ACCESS_KEY_ID=your-aws-access-key-id
ENV AWS_SECRET_ACCESS_KEY=your-aws-secret-access-key
# make the awscli we install below reachable on the PATH inside node:10
ENV PATH=/root/.local/bin:$PATH
RUN apt-get update && apt-get install -y python3-pip && pip3 install awscli --upgrade
WORKDIR /app
ENV PATH /app/node_modules/.bin:$PATH
COPY ./package.json /app
# To handle the 'could not get uid/gid' npm error
RUN npm config set unsafe-perm true
RUN npm install --silent
RUN npm install react-scripts@3.0.1 -g --silent
COPY . /app
RUN npm run build
RUN npm run export
# reshuffle the export output so multiple builds can live side by side
RUN mv out/_next .
RUN mv out _out
RUN mkdir -p out/builds/$BITBUCKET_COMMIT
RUN mv _out/* out/builds/$BITBUCKET_COMMIT
RUN rm -rf _out
RUN mv _next out/
RUN aws s3 cp ./out/_next s3://name-of-yours3bucket/test/_next --cache-control public --acl public-read --recursive
RUN aws s3 cp ./static s3://name-of-yours3bucket/test/static --cache-control public --acl public-read --recursive
RUN aws s3 cp ./out/builds s3://name-of-yours3bucket/test/builds --acl public-read --recursive
RUN aws s3 sync s3://name-of-yours3bucket/test/builds/$BITBUCKET_COMMIT s3://name-of-yours3bucket/test/current --delete --acl public-read

FROM node:10.16.0
COPY --from=build /app /app
WORKDIR /app
EXPOSE 3000
CMD npm run start
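
To build and run the image, pass the commit hash in as a build argument (the image name here is just an example, and the AWS key placeholders above still need real values):

# the commit hash becomes the build folder name in S3
docker build --build-arg BITBUCKET_COMMIT=$BITBUCKET_COMMIT -t my-next-app .
docker run -p 3000:3000 my-next-app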

There are a couple of new things here. Inside the node:10.16.0 image, we can still install the AWS CLI in much the same way as on plain Ubuntu (in another article, I found almost the same process, but with Node 9).

Another thing we can do for multiple environments is keep the whole process in the same bucket, but under a different folder per environment.
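
As a sketch, using the same bucket placeholder as above and example environment names, each environment simply gets its own top-level folder:

# staging
aws s3 cp ./out/_next s3://name-of-yours3bucket/staging/_next --cache-control public --acl public-read --recursive
# production
aws s3 cp ./out/_next s3://name-of-yours3bucket/production/_next --cache-control public --acl public-read --recursive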
