This post assumes that the reader understands the Node-RED flow programming tool and wishes to build a custom Docker image for it. It also assumes familiarity with Python programming.
Since Node-RED provides a visual way of connecting many components through flows, we might be interested in using it within our own personal stack. If you haven't tried it out, the Node-RED documentation for Docker provides some nice ways to get started.
Since the flows are rendered in a web browser, we might want to test the reachability of our custom Node-RED container once it is integrated with other containers within a stack deployment. This guide provides a concise way of joining the dots between creating your custom Node-RED container and testing its reachability within a docker-compose stack using pytest-docker-compose.
Please refer to the GitHub repository for the code structure and file contents.
Diving straight into the deep end, we want to create a custom Node-RED container with just the following files: a package.json, a Dockerfile, a settings.js, and a flows.json.
NOTE: for the sake of this post, we stick to the basics; please feel free to add more according to your needs.
Standard practice with Node-RED lets the user add dedicated flows and dependencies via the npm install command line. However, as described in the Node-RED-Docker repository wiki, you can also add these flows/dependencies via a dedicated package.json file. Let's stick to this method and avoid adding things via the CLI within the container. Create a package.json file and add the following:
"description": "A Slim Node-RED Docker Image running on Alpine Container",
"start": "node $NODE_OPTIONS node_modules/node-red/red.js $FLOWS --userDir=/data"
Moving on to creating a Dockerfile: we will use a multi-stage Docker build to keep our image a bit light, based on Alpine containers and the official minimal image from Node-RED.
We will use TWO stages in our build process:
- A build stage based on the nodered/node-red minimal image to install the flows/dependencies and prepare the node_modules.
- A prod stage that uses the Alpine-based base image, into which we copy the node_modules from the build stage.
The Dockerfile looks like the following:

```dockerfile
FROM alpine:3.13 AS base

RUN apk add --no-cache \
        npm && \
    mkdir -p /usr/src/node-red /data && \
    adduser -h /usr/src/node-red -D -H node-red -u 1000 && \
    chown -R node-red:node-red /data

FROM nodered/node-red:2.2.2-minimal AS build

COPY package.json .

RUN npm install --unsafe-perm --no-update-notifier

FROM base AS prod

COPY --from=build --chown=node-red:node-red /data/ /data/
COPY --chown=node-red:node-red settings.js /data/settings.js
COPY --chown=node-red:node-red flows.json /data/flows.json
COPY --from=build --chown=node-red:node-red /usr/src/node-red/ /usr/src/node-red/

# Run from the Node-RED install directory as the unprivileged user
WORKDIR /usr/src/node-red
USER node-red

CMD ["npm", "start"]
```
We start with alpine:3.13 as the base image and prepare it by installing npm, which is needed to run Node-RED. Following security best practices, we add a dedicated node-red user and group as opposed to running everything as root. This avoids privilege escalation, as well as misuse, within the container.
The build stage installs all the NPM packages from our package.json, and we then copy the resulting node_modules to our prod image. In the end, the container starts the flow via npm start.
You can build this image locally using:

```sh
docker build -t node-red-slim:latest .
```
We can check if the flow is working by building the Docker image, running it, and then trying:

```sh
curl -XGET http://localhost:1880/health
```

Don't forget to publish port 1880 when running the Docker image, e.g.:

```sh
docker run -p 1880:1880 node-red-slim:latest
```
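A one-off curl can race the container's startup and fail even when everything is fine. A small polling helper makes the check robust; here is a hedged sketch of my own (the wait_for_health name and its timeout/interval defaults are assumptions, not part of the post's repository):

```python
# Poll a health endpoint until it answers HTTP 200 or a timeout expires.
# wait_for_health is illustrative, not taken from the post's repository.
import time
import urllib.error
import urllib.request


def wait_for_health(url: str, timeout: float = 30.0, interval: float = 1.0) -> bool:
    """Return True once `url` answers HTTP 200, False if `timeout` elapses first."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            with urllib.request.urlopen(url) as resp:
                if resp.status == 200:
                    return True
        except (urllib.error.URLError, ConnectionError):
            pass  # container not ready yet; retry after a short pause
        time.sleep(interval)
    return False
```

For example, `wait_for_health("http://localhost:1880/health")` after starting the container with `docker run -p 1880:1880 node-red-slim:latest`.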
If that works, good job! Now it is time to write an integration test with Python.
One cool thing you can try out is pytest-docker-compose, which provides a nice suite for running tests against docker-compose. I won't dive too deep into it, but I relied on this Dev.To post by Iuliia Volkova for writing a simple test that does a health check on our /health API once the container is up and running.
pytest-docker-compose is a plugin for the pytest testing suite. The structure of your test directory should be as follows:
```
tests/integration/
├── conftest.py
├── test-docker-compose.yml
├── test_fixture.py
└── scripts/
    ├── 00-build-test-env.sh
    └── 01-run-tests.sh
```
Refer to the tests/integration directory for the file contents.
conftest.py provides the fixture wiring that points the plugin at the test-docker-compose.yml used for the test stack. test_fixture.py contains the test that pings the /health API from our flow once the container is up and running.
In our case, if the /health API returns an HTTP 200 OK response, the test passes. You can add more tests depending on your requirements.
We will leverage simple Bash scripts in the scripts directory to set up the Python virtual environment, bring the test-docker-compose stack up, and run the pytest suite.
You can think of the steps we just went through as a CI/CD pipeline. The steps are:
- Build our custom Node-Red Docker Image
- Test integration using pytest-docker-compose
- Push the image to Docker Hub, if test passes
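The steps above can be sketched as a small driver script. This is a hedged outline of my own, not the repository's actual workflow (the pipeline function, default tag, and paths are assumptions, and the push step still requires a prior docker login):

```python
# Illustrative CI driver mirroring the three pipeline steps; the real pipeline
# in the repository is a GitHub workflow (deploy.yml), not this script.
import subprocess


def run(cmd: list[str]) -> None:
    """Run a command and raise if it exits non-zero, aborting the pipeline."""
    subprocess.run(cmd, check=True)


def pipeline(tag: str = "node-red-slim:latest") -> None:
    # 1. Build the custom Node-RED image.
    run(["docker", "build", "-t", tag, "."])
    # 2. Run the integration tests against the compose stack.
    run(["pytest", "--docker-compose=tests/integration/test-docker-compose.yml",
         "tests/integration"])
    # 3. Push only if the tests above did not raise.
    run(["docker", "push", tag])
```

Because run() raises on any non-zero exit, a failing integration test stops the pipeline before the image is ever pushed.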
For the repository, I have set up a GitHub workflow that pushes the Docker image only when I push a Git tag. For this workflow, you will need to set up your registry credentials in your GitHub Secrets. For a sample of the YAML file, refer to the deploy.yml in the repository.
That's how you do a custom Node-RED image with Docker, docker-compose, and integration testing! I am not aware of a way to test the individual flows generated in Node-RED; this test instead focuses on integrating Node-RED with possible stacks where different databases are part of the deployment. If you have comments, criticism, or feedback, please reach out to me and I will be happy to learn and help!