Amalia Hajarani

Lokomotif: Use Case (3: Implementing Node.js)

Prerequisites:

  1. Node.js version 16.20.2 (you can check it as shown below)
  2. MongoDB URI for database access
  3. Zookeeper and Kafka. I'm following this post on how to run them on my Windows 10 machine.
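
You can confirm the Node.js version from a command prompt:

    node --version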

And now, let's go into the cooking steps:

Initialize Node application

  1. Create a directory dedicated to this project. Mine is named back-end-web-app
  2. Open a command prompt from that directory
  3. Run this command:

    npm init
    
  4. Input the package name. Mine is the same as the directory name, back-end-web-app

  5. Keep the version at 1.0.0 by hitting enter

  6. Input a description as you wish, then hit enter

  7. I keep index.js as the entry point, so I just hit enter

  8. I have no test command, so I just hit enter

  9. I have no dedicated git repository, so again, I just hit enter

  10. I have no keywords, so I just hit enter

  11. Type your name as the author, then hit enter

  12. I have no preferred license, so I just hit enter

  13. The command prompt will show you a summary of your project; type yes if it looks correct, then hit enter

  14. You will see package.json in your directory.

Install npm packages

  1. Installing dotenv. This package will be used to read variables from the .env file

    npm install dotenv
    
  2. Installing express

    npm install express
    
  3. Installing kafkajs

    npm install kafkajs
    
  4. Installing mongoose

    npm install mongoose
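
After these installs, package.json gains a dependencies section listing the four packages. One extra change is needed: the model and consumer code in the next sections use ES module import/export syntax, so package.json also needs "type": "module" (without it, node will refuse to run the import statements). Trimmed to the relevant fields, the file ends up roughly like this (the "..." stand for whatever versions npm installed):

{
  "name": "back-end-web-app",
  "version": "1.0.0",
  "main": "index.js",
  "type": "module",
  "dependencies": {
    "dotenv": "...",
    "express": "...",
    "kafkajs": "...",
    "mongoose": "..."
  }
}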
    

Creating .env

This is what my .env looks like. Don't forget to change the MONGODB_URI to your own URI, and make sure that the port you will use is not being used by another process.

PORT = 3006
HOST = localhost:3006
MONGODB_URI = mongodb+srv://<username>:<password>@cluster0.cluster.mongodb.net/
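
These values are read at runtime with dotenv. Just to show the mechanism, here is a tiny sketch of loading them (the actual wiring happens in index.js later):

import dotenv from "dotenv";

dotenv.config(); // reads .env from the project root into process.env

console.log(process.env.PORT);        // 3006
console.log(process.env.MONGODB_URI); // your connection string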

Creating project directory structure

Below is the project directory structure that I use:

*src
    - models
    - utils
*.env
*index.js
*package.json
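
Filled in with the files created in the next sections, the structure ends up like this (lokomotifSchema.js matches the import path used by the consumer; consumer.js is simply the name I use for the Kafka consumer file):

*src
    - models
        - lokomotifSchema.js
    - utils
        - consumer.js
*.env
*index.js
*package.json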

Creating models

To ease the process, I chose to use mongoose. Mongoose takes care of most of the work of communicating with MongoDB. The model that I create in this service is actually the same as the one I made in the previous service. This file is src/models/lokomotifSchema.js, matching the import path used by the consumer below.

import mongoose from "mongoose";

const lokomotifSchema = new mongoose.Schema({
    kodeLoko: String,
    namaLoko: String,
    dimensiLoko: String,
    status: String,
    createdDate: String,
});

export const Lokomotif = mongoose.model('Lokomotif', lokomotifSchema);
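
Note that the model can only save documents once mongoose has an open connection to MongoDB. The connection itself is a single call using the MONGODB_URI from .env; a minimal sketch (the full index.js wiring is sketched later in this post):

import dotenv from "dotenv";
import mongoose from "mongoose";

dotenv.config();

// open the connection once at startup; mongoose buffers model calls until it is ready
await mongoose.connect(process.env.MONGODB_URI);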

Make the service act as a Kafka consumer

I'm actually not sure where to put this Kafka configuration, so I put it inside the utils directory (the consumer.js file in the structure above).

import { Kafka } from "kafkajs";
import { Lokomotif } from "../models/lokomotifSchema.js";

const kafka = new Kafka({
    clientId: 'lokomotif-data',
    brokers: ['127.0.0.1:9092'] // the local Kafka broker started earlier
})

const consumer = kafka.consumer({ groupId: 'loko-group' });

export const run = async () => {

    await consumer.connect();
    await consumer.subscribe({ topic: 'lokomotifdata', fromBeginning: true }); // this has to be the same topic we created in the previous service

    await consumer.run({
        eachMessage: async ({ topic, partition, message }) => {
            const parsedMessage = JSON.parse(message.value.toString());

            const data = new Lokomotif({
                kodeLoko: parsedMessage.kodeLoko,
                namaLoko: parsedMessage.namaLoko,
                dimensiLoko: parsedMessage.dimensiLoko,
                status: parsedMessage.status,
                createdDate: parsedMessage.createdDate
            })

            try {
                await data.save();
                console.log("THIS IS DATA", data);
            } catch (error) {
                console.log(error);
            }
        }
    })

}

As you may see, I don't have to do much to save the messages I get from Kafka to the database, since most of the work is taken care of by mongoose.
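
The last piece before running everything is index.js. Here is a minimal sketch of what it can look like, assuming the consumer file is src/utils/consumer.js as above: it loads .env, connects mongoose, starts the Kafka consumer, and starts express on the configured PORT (no routes yet, those can come later):

import dotenv from "dotenv";
import express from "express";
import mongoose from "mongoose";
import { run } from "./src/utils/consumer.js";

dotenv.config(); // load PORT, HOST, and MONGODB_URI from .env

const app = express();
app.use(express.json());

const start = async () => {
    // connect to MongoDB first so the consumer can save documents
    await mongoose.connect(process.env.MONGODB_URI);
    console.log("Connected to MongoDB");

    // start consuming messages from the 'lokomotifdata' topic
    await run();

    app.listen(process.env.PORT, () => {
        console.log(`Server is running at ${process.env.HOST}`);
    });
};

start().catch((error) => console.log(error));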

Running the application

  1. Make sure Zookeeper and Kafka are running.
  2. Make sure the Spring Boot: Scheduler Info service is running.
  3. Open a command prompt from the project directory and run this command:

    node index.js
    
  4. If the project runs correctly, you will see a log like the one below in your Node terminal (the command prompt where node is running), and the data will be saved to your MongoDB.
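
The log comes from the console.log in the consumer and looks roughly like this (the field values will be whatever the scheduler service produced):

THIS IS DATA {
  kodeLoko: '...',
  namaLoko: '...',
  dimensiLoko: '...',
  status: '...',
  createdDate: '...',
  _id: new ObjectId('...'),
  __v: 0
}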

I think that's all for this service. The working code can be found at my GitHub repository.
