In this article we are going to build a MERN app in a Lerna monorepo setup, and then create a Docker image of the complete application for a containerized deployment.
I'm using a Windows system for this guide.
Here is a breakdown of the process:
- Lerna repo setup
- Setup workspaces (frontend and backend)
- Creating a demo feature (show a users list in a table)
- Application containerization
- Run the containerized app in a Docker environment
Before starting this guide, I'm assuming that you have some prior knowledge of MERN applications (ReactJS as frontend and ExpressJS as backend) and of containerization concepts and tools like Docker.
Prerequisites
- NodeJS installed
- Docker desktop installed and working
If not, you can get these from here:
Install Docker Desktop
Install NodeJS
If you want to learn more about these concepts in detail, please let me know in the comments section and I'll make separate guides on them.
Let's get started...
1. Lerna repo setup
First, create a folder/directory with the name of your choice, and inside that folder run the npx lerna init command in the terminal.
This command will create a basic monorepo setup for a JavaScript application.
The folder structure should look like this.
your-project-name
| node_modules
| lerna.json
| .gitignore
| package.json
| package-lock.json
Now replace the contents of the newly created lerna.json file with the code below:
{
"packages": ["packages/*"],
"version": "0.0.0",
"npmClient": "npm",
"useWorkspaces": true
}
And replace the package.json contents with this:
{
"name": "root",
"private": true,
"workspaces": [
"packages/*"
],
"dependencies": {},
"devDependencies": {
"lerna": "^8.1.8"
}
}
2. Setup workspaces
As this is going to be a MERN application, we will use two workspaces here: frontend and backend.
This will be the further division of the application:
packages
- backend
- frontend
i. Backend setup
Inside your root folder, run the mkdir -p packages/backend command to create the packages folder with a backend folder inside it.
Now navigate to the newly created backend folder: cd packages/backend
Run npm init -y
to create a minimal JS application.
Add some dependencies to the backend application by running npm install express cors node-fetch and npm install --save-dev nodemon.
Create a file named index.js
and update the package.json
to run the script.
{
"name": "backend",
"version": "1.0.0",
"main": "index.js",
"scripts": {
"start": "node index.js",
"dev": "nodemon index.js"
},
"dependencies": {
"cors": "^2.8.5",
"express": "^4.18.2",
"node-fetch": "^2.7.0"
},
"devDependencies": {
"nodemon": "^3.1.1"
}
}
ii. Frontend setup
Now run mkdir -p ../frontend to create the new workspace folder, then navigate to it with cd ../frontend.
Now create a React + Vite app in the current folder using the npm create vite@latest . command.
Choose React from the template list.
Next, choose the variant of your choice from the list.
I'm choosing TypeScript + SWC here.
Go back to root folder and run npm install concurrently --save-dev
to run both frontend and backend applications simultaneously.
Now your root package.json
should look like this:
{
"name": "root",
"private": true,
"workspaces": [
"packages/*"
],
"devDependencies": {
"concurrently": "^8.2.2",
"lerna": "^8.1.8"
}
}
Now, create a basic API for fetching the users list in the backend application by putting the code below into the index.js file of the backend package.
I'm using the free JSONPlaceholder API here for demo purposes; you can modify your backend API based on your requirements.
const express = require("express");
const cors = require("cors");
const app = express();
const PORT = process.env.PORT || 5000;
app.use(cors());
app.get("/users", async (req, res) => {
  try {
    // Load node-fetch with a dynamic import (avoids the CommonJS/ESM mismatch)
    const { default: fetch } = await import("node-fetch");
    const response = await fetch("https://jsonplaceholder.typicode.com/users");
    const data = await response.json();
    res.send(data);
  } catch (error) {
    console.log("Error: ", error);
    res.status(500).send(error);
  }
});
app.listen(PORT, () => console.log("Server running on port " + PORT));
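As a side note, the dynamic import is only needed for node-fetch; a sketch of an alternative (my own refactor — the `getUsers` helper is not part of the tutorial's code) isolates the upstream call so it can be unit-tested with a fake fetch, and on Node 18+ the built-in global fetch removes the node-fetch dependency entirely:

```javascript
// Hypothetical refactor: isolate the upstream call in a helper so it can be
// tested with a fake fetch. Assumes Node 18+, where fetch is global.
async function getUsers(fetchImpl = globalThis.fetch) {
  const response = await fetchImpl("https://jsonplaceholder.typicode.com/users");
  if (!response.ok) throw new Error("Upstream error: " + response.status);
  return response.json();
}

// The route handler then shrinks to:
// app.get("/users", async (_req, res) => {
//   try {
//     res.send(await getUsers());
//   } catch (err) {
//     res.status(500).send(String(err));
//   }
// });
```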
3. Creating demo feature
Let's start with our demo feature of showing the users list in a table. We'll build it in the frontend application, on the default rendered page (the App.tsx file).
Here I'm using react-smart-table-component, a fully TypeScript-supported, dynamic table component with built-in infinite scroll and pagination features. You can try it out too.
npm i react-smart-table-component
You can read more about this package here
import { useCallback, useEffect, useState } from "react";
import ReactSmartTableComponent from "react-smart-table-component";
import "./App.css";
interface User {
id: number;
name: string;
username: string;
email: string;
address: Address;
phone: string;
website: string;
company: Company;
}
interface Address {
street: string;
suite: string;
city: string;
zipcode: string;
}
interface Company {
name: string;
catchPhrase: string;
bs: string;
}
function App() {
const [users, setUsers] = useState<User[]>([]);
const [loading, setLoading] = useState(false);
const getUsers = useCallback(async () => {
try {
setLoading(true);
const response = await fetch("http://localhost:5000/users").then((res) =>
res.json()
);
setUsers(response);
setLoading(false);
} catch (error) {
console.log("Error fetching users", error);
setUsers([]);
setLoading(false);
}
}, []);
useEffect(() => {
getUsers();
}, [getUsers]);
return (
<>
<ReactSmartTableComponent
items={users}
search
searchableFields={["name", "email", "phone", "website"]}
searchBoxPlaceholder="Search users"
className="table"
loading={loading}
headings={[
{
fieldName: "name",
title: "Name",
},
{
fieldName: "email",
title: "Email",
},
{
fieldName: "phone",
title: "Phone",
},
{
fieldName: "website",
title: "Website",
},
{
fieldName: "company",
title: "Company",
},
{
fieldName: "address",
title: "City",
},
]}
scopedFields={
{
company: (item) => <td>{item.company.name}</td>,
address: (item) => <td>{item.address.street}</td>,
}
}
/>
</>
);
}
export default App;
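One thing worth noting: the fetch response arrives untyped, so casting it straight into `User[]` is an act of faith. A small runtime guard (a sketch of my own, trimmed to three fields — `isUser` is not part of the tutorial's code) can validate items before they reach the table:

```typescript
// Hypothetical runtime guard mirroring part of the User interface above —
// useful before trusting the shape of an API response.
interface ApiUser {
  id: number;
  name: string;
  email: string;
}

function isUser(value: unknown): value is ApiUser {
  if (typeof value !== "object" || value === null) return false;
  const v = value as Record<string, unknown>;
  return (
    typeof v.id === "number" &&
    typeof v.name === "string" &&
    typeof v.email === "string"
  );
}

console.log(isUser({ id: 1, name: "Leanne Graham", email: "Sincere@april.biz" })); // true
console.log(isUser({ id: "1" })); // false
```

In getUsers you could then call something like `setUsers(response.filter(isUser))` so a malformed payload never crashes the render.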
To run both the frontend and backend applications, add the run & build scripts to the root package.json file.
{
"name": "root",
"private": true,
"workspaces": [
"packages/*"
],
"scripts": {
"dev": "concurrently \"npm run dev --workspace=backend\" \"npm run dev --workspace=frontend\"",
"build:frontend": "npm run build --workspace=frontend",
"start": "npm start --workspace=backend"
},
"devDependencies": {
"concurrently": "^8.2.2",
"lerna": "^8.1.8"
}
}
By running npm run dev, you can run both the frontend and the backend at the same time in the development environment.
4. Application containerization
My plan is to serve the frontend application through the backend server. To achieve that, I have to add a catch-all endpoint to the backend application, and either put the frontend app's build files into the backend folder's public folder, or point directly at the relative path ../frontend/dist/.
I'm using the first method here.
Now the backend index.js file should look like this:
const express = require("express");
const cors = require("cors");
const path = require("path");
const app = express();
const PORT = process.env.PORT || 5000;
app.use(express.static(path.join(__dirname, "public")));
app.use(cors());
app.get("/users", async (req, res) => {
  try {
    // Load node-fetch with a dynamic import (avoids the CommonJS/ESM mismatch)
    const { default: fetch } = await import("node-fetch");
    const response = await fetch("https://jsonplaceholder.typicode.com/users");
    const data = await response.json();
    res.send(data);
  } catch (error) {
    console.log("Error: ", error);
    res.status(500).send(error);
  }
});
app.get("*", (_, res) => {
res.sendFile(path.join(__dirname, "public", "index.html"));
});
app.listen(PORT, () => console.log("Server running on port " + PORT));
Here, you might be thinking about the overhead of copying the build files from packages/frontend/dist to packages/backend/public, but this will not be an issue once we wrap the application in a Docker container.
So, let's start containerizing our application. For this, create 2 files:
i. Dockerfile
ii. .dockerignore
Make sure to keep the file names exactly the same, otherwise they won't work.
Put node_modules inside the .dockerignore file and put the code below into your Dockerfile.
# Use an official Node.js runtime as a parent image
FROM node:20-alpine
# Set the working directory
WORKDIR /app
# Copy package.json and package-lock.json
COPY package*.json ./
# Copy lerna.json
COPY lerna.json ./
# Copy the rest of the application code
COPY . .
# Install dependencies
RUN npm install
RUN npm run build:frontend
# Copy the frontend build into the backend's public folder.
# cp runs inside the image, against the files the RUN step above just built;
# a Dockerfile COPY would instead look for dist/ in the host build context.
RUN cp -r packages/frontend/dist packages/backend/public
# Expose the backend port
EXPOSE 5000
# Define the command to run the application
CMD ["npm", "start"]
Here, in the Dockerfile, I'm using the node:20-alpine image as the runtime environment for our application.
/app is going to be the working directory of our application inside the Docker container.
Copying packages/frontend/dist into packages/backend/public during the image build places the frontend build where the backend serves it from.
EXPOSE 5000 declares that our application listens on port 5000 (the actual port mapping to the outside world happens at docker run time).
CMD ["npm", "start"]
will serve our backend application.
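Back in the .dockerignore: node_modules is the one essential entry, so host installs never leak into the image. A slightly fuller version I'd suggest (the entries beyond node_modules are my own additions) could look like this:

```text
node_modules
**/node_modules
.git
```

Excluding these keeps the build context small, which makes the COPY . . step noticeably faster.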
Now, our application is ready to be containerized. To create the docker image run
docker build -t <your-application-name> .
Make sure your Docker Desktop is running at this point.
After successfully building the Docker image, you can see your application image in Docker Desktop, in the Images section, under the name you provided while building it.
5. Run the containerized app in docker environment
We have packaged the complete application as a Docker image; now, to deploy it, we need to run the image within a container.
To do this, run docker run -dp 5000:5000 <your-application-name>
This command runs the application within a Docker container. You can check it in Docker Desktop.
-dp
d stands for detached and p for publish.
These flags run our application in detached mode and map the container's port 5000 to port 5000 on our local system.
Now, you can test your application running on http://localhost:5000.
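If you'd rather not remember the run flags, a minimal docker-compose.yml at the repo root does the same job (this is my own sketch, not part of the tutorial's repository):

```yaml
# Hypothetical docker-compose.yml — equivalent to the build/run commands above.
services:
  app:
    build: .              # builds from the Dockerfile in this directory
    ports:
      - "5000:5000"       # host:container, same mapping as -p 5000:5000
```

Then docker compose up --build -d builds the image and starts the container in one step.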
There might be UI differences, as I added some CSS for the table; you can add your own styles to the table component, which is fully customizable.
Additionally:
You can push your Docker image to a registry such as Docker Hub or AWS ECR (Elastic Container Registry), and deploy it through CI/CD pipelines to a deployment service such as AWS ECS (Elastic Container Service).
Github Repository of the complete project:
Click Here
So, that's all about this guide.
I hope you have enjoyed reading this article and learned something from it. If so, please comment and share this article, and let me know what else I can share with you.
Thanks,
Raghvendra Awasthi