Brandon Charest

Getting Started With Kafka Using Docker

Getting Started

Before we do anything, let's make sure we have Docker installed.

Docker Desktop

Docker Desktop is a simple and easy way to manage containers.
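
If you want to double-check that the install worked, both Docker and Compose will report their versions from a terminal (the exact numbers will differ on your machine):

docker --version
docker-compose --version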

What is Kafka?

Kafka is an open-source streaming platform, originally developed at LinkedIn and written in Java and Scala, that is now maintained under the Apache Software Foundation.

In simple terms, Kafka is a message broker: it helps transmit messages from one system to another in real time. I understand that this is a major simplification of exactly what Kafka is and does, but for now this definition will work. To note, there are other products that achieve similar results (some popular ones are Redis and RabbitMQ), each with their own pros and cons that I won't go into here.

We will also be using a service called Zookeeper, which is used to manage the Kafka cluster, track node status, maintain a list of topics and messages, and manage a bunch of other metadata.

It is possible to use Kafka without Zookeeper by using Kafka Raft Metadata Mode (KRaft), which is available in experimental mode in Kafka 2.8 and should be production ready by Kafka 3.3. But for this post we are just going to stick with using Zookeeper.

Installing Kafka Using Docker

Now that we have Docker and understand a little bit about Kafka we can start.

I like to create a separate folder for each of my projects, so for this one I will call it 'kafka-docker'.
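
If you are following along in a terminal, creating and entering that folder looks like this (the folder name is just my preference, use whatever you like):

mkdir kafka-docker
cd kafka-docker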

We will be using the following images:

bitnami/kafka
bitnami/zookeeper

There are other images available for Kafka and Zookeeper, and you can use whichever you prefer; I have just chosen these two.

Instead of setting this up manually we are going to use Docker Compose. If you have not used Docker Compose before, it is essentially a convenient way to work with multiple containers at once using a single YAML config file.

Inside the folder you created, create a YAML file with the following information. By default, docker-compose looks for a file called 'docker-compose.yml', but you can name it whatever you want.

docker-compose.yml

version: "3"
services:
  zookeeper:
    image: 'bitnami/zookeeper:latest'
    ports:
      - '2181:2181'
    environment:
      - ALLOW_ANONYMOUS_LOGIN=yes
  kafka:
    image: 'bitnami/kafka:latest'
    ports:
      - '9092:9092'
    environment:
      - KAFKA_BROKER_ID=1
      - KAFKA_CFG_LISTENERS=PLAINTEXT://:9092
      - KAFKA_CFG_ADVERTISED_LISTENERS=PLAINTEXT://127.0.0.1:9092
      - KAFKA_CFG_ZOOKEEPER_CONNECT=zookeeper:2181
      - ALLOW_PLAINTEXT_LISTENER=yes
    depends_on:
      - zookeeper

We do not need anything too special for this config file, so we can take this from the documentation found here.

If ports 9092 or 2181 are not available on your system, feel free to change them to ports that you have open and available.
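
For example (just an illustration, not something taken from the docs), if port 9092 is already taken on your machine you could change only the host side of the kafka port mapping, and update the advertised listener to match so clients on your machine are told to connect to the new port:

  kafka:
    ports:
      - '9093:9092'
    environment:
      - KAFKA_CFG_ADVERTISED_LISTENERS=PLAINTEXT://127.0.0.1:9093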

Once that file is created, it's time to use docker-compose to set up our containers for us.

Inside the same folder that has your YAML config file, run the following.

Format:
docker-compose -f <nameOfYourConfig.yml> up -d

docker-compose -f docker-compose.yml up -d

Command Breakdown:

  • docker-compose: Utility used to download/run the containers we defined in our config file
  • -f: Specify an alternate compose file (default: docker-compose.yml)
  • docker-compose.yml: Our config file
  • up: Create and start the containers specified in the config file
  • -d: Run in detached mode. Containers will run in the background

Once finished, let's check that everything is actually running.

docker ps

Command Breakdown:

  • ps: List running containers

If all went smoothly you should see something in your terminal similar to below.

(Screenshot: the kafka and zookeeper containers listed by docker ps)
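
If one of the containers is missing from that list or keeps restarting, the container logs are usually the fastest way to see why. With docker-compose that might look like this, using the service names from our config file:

docker-compose -f docker-compose.yml logs kafka
docker-compose -f docker-compose.yml logs zookeeper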

We did it! We are now up and running with Kafka using Docker!

Connecting to Kafka Shell

With Kafka up and running we can create a connection to it.

Format:
docker exec -it --user <userName> <nameOfContainer> bash

docker exec -it --user root kafka-docker_kafka_1 bash

Command Breakdown:

  • exec: Run a command in a running container
  • -it:
    • -i: Interactive mode
    • -t: Pseudo Terminal
  • --user: Username or user ID to run the command as
  • root: Name of the user we are going to log in as
  • kafka-docker_kafka_1: Our container name we want to run the commands on
  • bash: Open interactive Bash session

If all goes well we should have opened a new Bash session inside our Kafka container!
(Screenshot: a Bash session inside the Kafka container)
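
As a quick sanity check from inside that shell, you can ask the broker to list its topics. We have not created any topics yet (that is for the next post), so an empty result is expected. The exact script location can vary by image, but in the Bitnami image the Kafka scripts typically live under /opt/bitnami/kafka/bin:

/opt/bitnami/kafka/bin/kafka-topics.sh --list --bootstrap-server localhost:9092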

That's A Wrap

At this point you have a fully functioning Kafka instance running inside Docker.
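
When you are done experimenting, the same config file can be used to tear everything back down from the project folder:

docker-compose -f docker-compose.yml down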

I like to try and keep posts relatively short, so I will save going over a little more about what Kafka is, creating topics, and building some microservices that can interact with each other using Kafka for the next blog post. I am hoping for this to be a small series, so be sure to check back!
