Mayur Thosar

How to connect to Apache Kafka using Node.js

In the previous article we studied Apache Kafka. In this article, we will implement a Kafka producer and consumer with Node.js.

Node.js is a popular server-side JavaScript runtime that is well suited to building scalable, high-performance applications. It has a wide range of modules and libraries that make it easy to connect to Apache Kafka.

Installing Dependencies

To connect to Kafka using Node.js, we need to install the following dependencies:

kafkajs: A Kafka client library for Node.js.
dotenv: A module for loading environment variables from a .env file.

To install these dependencies, run the following command in your terminal:

```shell
npm install kafkajs dotenv
```
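For reference, the environment variables used throughout this article could live in a `.env` file like the following. The variable names match the code below, but the values are placeholders — adjust them to your own broker, topic, and IDs:

```env
# Example .env — the values here are placeholders
KAFKA_CLIENT_ID=my-node-app
KAFKA_BROKER_URL=localhost:9092
KAFKA_TOPIC=my-topic
KAFKA_CONSUMER_GROUP_ID=my-consumer-group
```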

Creating a Kafka Producer

To create a Kafka producer using Node.js, we first need to create a client instance using the kafkajs library. We also need to define the Kafka broker and the topic that we want to produce messages to.

```javascript
const { Kafka } = require('kafkajs');
require('dotenv').config();

const kafka = new Kafka({
  clientId: process.env.KAFKA_CLIENT_ID,
  brokers: [process.env.KAFKA_BROKER_URL],
});

const producer = kafka.producer();
const topic = process.env.KAFKA_TOPIC;
```

In the code above, we're creating a Kafka client instance using the Kafka constructor from the kafkajs library. We're also specifying the client ID and broker URL using environment variables loaded from a .env file using the dotenv library.
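Note that `brokers` accepts an array, so you can list several brokers for fault tolerance. If you prefer to keep them in a single environment variable, one common approach is to store a comma-separated list and split it. A small sketch — the `KAFKA_BROKER_URLS` variable name here is an assumption, not part of the example above:

```javascript
// Hypothetical helper: parse a comma-separated broker list from one env var,
// e.g. KAFKA_BROKER_URLS="broker1:9092,broker2:9092"
const parseBrokers = (raw) =>
  raw.split(',').map((b) => b.trim()).filter(Boolean);
```

With kafkajs, this would be used as `brokers: parseBrokers(process.env.KAFKA_BROKER_URLS)` when constructing the client.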

Next, we're creating a Kafka producer instance using the producer method of the Kafka client. We're also defining the Kafka topic that we want to produce messages to using another environment variable.

To send messages to Kafka, we can use the send method of the Kafka producer. We need to provide an array of messages to the send method, where each message contains a key and value.

```javascript
const sendMessage = async (key, value) => {
  try {
    await producer.connect();
    await producer.send({
      topic,
      messages: [{ key, value }],
    });
    await producer.disconnect();
    console.log('Message sent successfully');
  } catch (error) {
    console.error(`Error sending message: ${error}`);
  }
};

sendMessage('key', 'Hello Kafka!');
```

In the code above, we're defining a function sendMessage that takes a key and value as parameters. Inside the function, we're first connecting to the Kafka broker using the connect method of the producer instance.

Next, we're using the send method of the producer to send a message to the Kafka topic. We're passing an array of messages to the send method, where each message contains a key and value.

After sending the message, we're disconnecting from the Kafka broker using the disconnect method of the producer instance. Finally, we're logging a message to the console to indicate that the message was sent successfully.

To test the producer, we can call the sendMessage function with a key and value. This will send a message to the Kafka topic that we defined earlier.
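One caveat: `sendMessage` connects and disconnects the producer on every call, which adds noticeable overhead if you send messages frequently. A common alternative is to connect once and reuse the producer. The sketch below demonstrates the pattern; the wrapper class is a hypothetical helper (not part of kafkajs) and works with any object exposing kafkajs-style `connect`/`send`/`disconnect` methods:

```javascript
// Hypothetical wrapper: connect lazily on the first send, then reuse the connection.
class ReusableProducer {
  constructor(producer, topic) {
    this.producer = producer; // any object with connect/send/disconnect
    this.topic = topic;
    this.connected = false;
  }

  async send(key, value) {
    if (!this.connected) {
      await this.producer.connect(); // only happens on the first call
      this.connected = true;
    }
    await this.producer.send({ topic: this.topic, messages: [{ key, value }] });
  }

  async close() {
    if (this.connected) {
      await this.producer.disconnect();
      this.connected = false;
    }
  }
}
```

With a real kafkajs producer, you would create `const p = new ReusableProducer(kafka.producer(), topic);`, call `p.send(...)` as often as needed, and call `p.close()` once on shutdown.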

Creating a Kafka Consumer

To consume messages from a Kafka topic using Node.js, we need to create a Kafka consumer instance. We also need to define the Kafka broker, topic, and consumer group that we want to consume messages from.

```javascript
const { Kafka } = require('kafkajs');
require('dotenv').config();

const kafka = new Kafka({
  clientId: process.env.KAFKA_CLIENT_ID,
  brokers: [process.env.KAFKA_BROKER_URL],
});

const consumer = kafka.consumer({
  groupId: process.env.KAFKA_CONSUMER_GROUP_ID,
});

const topic = process.env.KAFKA_TOPIC;
```

In the code above, we're creating a Kafka client instance exactly as we did for the producer, with the client ID and broker URL loaded from the .env file.

Next, we're creating a Kafka consumer instance using the consumer method of the Kafka client. We're also defining the Kafka topic and consumer group that we want to consume messages from using environment variables.

To start consuming messages from Kafka, we can use the subscribe method of the consumer instance to subscribe to the Kafka topic. We also need to define a callback function that will be called for each message that is consumed from the topic.

```javascript
const consumeMessage = async () => {
  try {
    await consumer.connect();
    await consumer.subscribe({ topic });
    await consumer.run({
      eachMessage: async ({ message }) => {
        console.log(`Received message: ${message.value}`);
      },
    });
  } catch (error) {
    console.error(`Error consuming message: ${error}`);
  }
};

consumeMessage();
```

In the code above, we're defining a function consumeMessage that connects to the Kafka broker, subscribes to the Kafka topic, and starts consuming messages from the topic.

We're using the subscribe method of the consumer instance to subscribe to the Kafka topic that we defined earlier. We're also defining a callback function using the run method of the consumer instance that will be called for each message that is consumed from the topic.

Inside the callback function, we're simply logging the value of the consumed message to the console.
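One detail worth knowing: kafkajs delivers `message.key` and `message.value` as Node.js `Buffer` objects (or `null`). Logging with a template literal works because `Buffer#toString()` defaults to UTF-8, but when your producers send JSON it is safer to decode explicitly. A minimal sketch — the `decodeValue` helper is hypothetical:

```javascript
// Hypothetical helper: decode a Kafka message value Buffer into a JS object,
// falling back to the raw string if it is not valid JSON.
const decodeValue = (buf) => {
  if (buf === null) return null; // tombstone messages have a null value
  const text = buf.toString('utf8');
  try {
    return JSON.parse(text);
  } catch {
    return text;
  }
};
```

Inside `eachMessage`, you would then call `decodeValue(message.value)` before acting on the payload.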

To test the consumer, we can call the consumeMessage function. This will start the consumer and begin consuming messages from the Kafka topic that we defined earlier.

Conclusion

In this article, we've seen how Apache Kafka can be used as an event-driven approach for microservice communication. We've also seen how to create a Kafka producer and consumer using Node.js.

By using Kafka for microservice communication, we can achieve a loosely coupled architecture that is resilient, scalable, and fault-tolerant. Kafka provides a reliable and scalable message queue that can handle high volumes of data and support multiple producers and consumers.

Node.js is an ideal choice for building Kafka producers and consumers, thanks to its event-driven and non-blocking architecture. With the kafkajs library, it's easy to integrate Kafka with Node.js and build highly performant and scalable microservices.

We hope that this article has provided a useful introduction to using Kafka with Node.js for microservice communication. If you're interested in learning more about Kafka, we encourage you to check out the official Kafka documentation and explore some of the many resources available online.
