DEV Community

tkssharma
Running Kafka Locally like a Pro

Running Kafka Locally: A Step-by-Step Guide

I am writing this because there is no proper guide on the web for bootstrapping Kafka locally, and I spent a few hours figuring out an issue. Frustrating, right?


Introduction

Kafka is a powerful distributed streaming platform that is widely used for real-time data processing and messaging. To get started with Kafka without the need for a cloud-based environment, you can run it locally on your development machine. This guide will walk you through the process of setting up a local Kafka cluster using Docker Compose.

Prerequisites

  • Docker installed on your system

Steps:

  1. Create a Docker Compose File: Create a file named docker-compose.yml with the following content:
version: '3'
services:
  zookeeper:
    image: wurstmeister/zookeeper:latest
    ports:
      - "2181:2181"
  kafka:
    image: wurstmeister/kafka:2.11-1.1.1
    ports:
      - "9092:9092"
    links:
      - zookeeper
    environment:
      KAFKA_ADVERTISED_HOST_NAME: ${HOST_IP}
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
      KAFKA_AUTO_CREATE_TOPICS_ENABLE: 'true'
      KAFKA_DELETE_TOPIC_ENABLE: 'true'
      KAFKA_CREATE_TOPICS: "topic-test:1:1"
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock

KAFKA_ADVERTISED_HOST_NAME: ${HOST_IP}

This is a very important line when setting up Kafka locally: it tells the broker to advertise your host machine's IP, so that clients on your local system can connect to the Kafka container. So get your IP quickly! We can pass HOST_IP as an environment variable when booting up Docker Compose.

  2. Start the Kafka Cluster: Run the following command in your terminal:
export HOST_IP=$(ifconfig | grep -E "([0-9]{1,3}\.){3}[0-9]{1,3}" | grep -v 127.0.0.1 | awk '{ print $2 }' | cut -f2 -d: | head -n1)
docker-compose up


This will start the Zookeeper and Kafka containers in the foreground (add the -d flag to docker-compose up to run them in the background).

Using Kafka

Once Kafka is running, you can produce and consume messages using a Kafka client library. Here's an example using the kafkajs library:

const { Kafka } = require('kafkajs')
const ip = require('ip')

const host = process.env.HOST_IP || ip.address()
console.log(host)

const kafka = new Kafka({
  clientId: 'my-app',
  brokers: [`${host}:9092`],
})

const producer = kafka.producer()
const consumer = kafka.consumer({ groupId: 'test-group' })

const run = async () => {
  // Producing
  await producer.connect()
  await producer.send({
    topic: 'topic-test',
    messages: [
      { value: 'Hello KafkaJS user!' },
    ],
  })

  // Consuming
  await consumer.connect()
  await consumer.subscribe({ topic: 'topic-test', fromBeginning: true })

  await consumer.run({
    eachMessage: async ({ topic, partition, message }) => {
      console.log({
        partition,
        offset: message.offset,
        value: message.value.toString(),
      })
    },
  })
}

run().catch(console.error)
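Kafka message values arrive in eachMessage as Buffers, so for structured payloads it's common to JSON-encode on the producer side and decode on the consumer side. A minimal sketch (the encodePayload/decodePayload helper names are my own, not part of kafkajs):

```javascript
// Encode a JS object as a Kafka message value, and decode it back.
// Helper names are illustrative, not part of kafkajs.
function encodePayload(payload) {
  return { value: JSON.stringify(payload) }
}

function decodePayload(message) {
  return JSON.parse(message.value.toString())
}

// Produce: producer.send({ topic: 'topic-test', messages: [encodePayload(obj)] })
// Consume: const data = decodePayload(message) inside eachMessage
const roundTrip = decodePayload({
  value: Buffer.from(encodePayload({ greeting: 'Hello KafkaJS user!' }).value),
})
console.log(roundTrip.greeting) // prints "Hello KafkaJS user!"
```

For anything beyond quick experiments, a schema registry (mentioned below) is the more robust way to keep producers and consumers in agreement about payload shape.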

Additional Tips:

  • Kafka Connect: Use Kafka Connect to integrate with other data sources and sinks.
  • Schema Registry: Implement a schema registry to ensure data consistency and compatibility.
  • Security: Configure security measures like authentication and authorization for your Kafka cluster.

By following these steps, you can set up a local Kafka cluster and start experimenting with its capabilities.
