In this post, I will show you how to run an Apache Kafka cluster on your machine and how to:
- create a topic that will hold the data
- produce messages to the topic, and
- consume those messages
using only the CLI tools that come with Apache Kafka. All of this in 5 minutes.
Ok, are you ready? Let’s get started!
To be able to produce messages to and consume them from Kafka we need… uhm, a Kafka cluster. At a minimum, a Kafka cluster consists of one Kafka server (called a broker), and it needs at least one Zookeeper node.
To simplify our job, we will run these two servers using docker-compose.
Don’t have docker-compose? Check: how to install docker-compose
I’ve prepared a docker-compose file which you can grab from Coding Harbour’s GitHub:
git clone https://github.com/codingharbour/kafka-docker-compose.git
Once you have the project, navigate to a folder called single-node-kafka and start the Kafka cluster:
docker-compose up -d
The output should look something like this:
Creating network "single-node-kafka_default" with the default driver
Creating sn-zookeeper ... done
Creating sn-kafka     ... done
Your local Kafka cluster is now ready to use. By running docker-compose ps, as shown below, we can see that the Kafka broker is available on port 9092.
$ docker-compose ps
    Name                Command            State              Ports
-------------------------------------------------------------------------------
sn-kafka       /etc/confluent/docker/run   Up      0.0.0.0:9092->9092/tcp
sn-zookeeper   /etc/confluent/docker/run   Up      2181/tcp, 2888/tcp, 3888/tcp
Now that we have our cluster up and running, we’ll create a topic.
A topic is a way to organize messages. A producer always sends a message to a particular topic, and consumers always read messages from a particular topic.
To create a topic we’ll use a Kafka CLI tool called kafka-topics, which comes with Kafka. In our case, this means the tool is available inside the docker container named sn-kafka.
First, open your favourite terminal and connect to the running Kafka container:
docker exec -it sn-kafka /bin/bash
Now that we’re inside the container where we have access to Kafka CLI tools, let’s create our topic, called first_topic:
kafka-topics --bootstrap-server localhost:9092 \
  --create --topic first_topic \
  --partitions 1 \
  --replication-factor 1
Check that the topic is created by listing all the topics:
kafka-topics --bootstrap-server localhost:9092 --list
The output should resemble the one below:

first_topic
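If you want more detail about a topic, such as its partition count and replication factor, kafka-topics also supports a --describe flag. A quick sketch, assuming the same single-node setup as above:

```shell
# Show partitions, replication factor and leader for our topic
kafka-topics --bootstrap-server localhost:9092 \
  --describe --topic first_topic
```

Run this from the same shell inside the sn-kafka container.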
Next, let’s produce a message to the Kafka topic we just created. For this, we will use a Kafka command-line tool called kafka-console-producer.
In the same terminal window run kafka-console-producer:
kafka-console-producer --broker-list localhost:9092 --topic first_topic
>
The producer tool will display a prompt (>), showing us that it is waiting for a message to send. Type a message and send it by pressing Enter.
Congratulations, you have successfully produced the message to the topic called first_topic. Now it’s time to consume that message.
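By default, the console producer sends messages without a key. If you want to experiment with keyed messages, kafka-console-producer accepts the parse.key and key.separator properties; the separator character here (a colon) is just my choice for this sketch:

```shell
# Everything before ':' becomes the message key, the rest is the value,
# e.g. typing "user1:hello" sends key=user1, value=hello
kafka-console-producer --broker-list localhost:9092 \
  --topic first_topic \
  --property parse.key=true \
  --property key.separator=:
```

Keys matter once a topic has more than one partition, because messages with the same key land on the same partition.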
As with producing, we’ll use a CLI tool already available in the Kafka broker’s docker container. The tool is called kafka-console-consumer.
Let’s start by opening a new tab/window of the terminal and connecting to Kafka broker’s container:
docker exec -it sn-kafka /bin/bash
Now, run kafka-console-consumer using the following command:
kafka-console-consumer --bootstrap-server localhost:9092 \
  --topic first_topic
Aaaand… nothing happens. You're probably wondering where the message you just produced is. You need to know that, unless told otherwise, a Kafka consumer reads only new messages (i.e. those arriving in the topic after the consumer has started).
Let's fix that. Stop the consumer by pressing Ctrl+C.
To read the messages that existed in the topic before we started the consumer, we must add the --from-beginning parameter to the kafka-console-consumer command:
kafka-console-consumer --bootstrap-server localhost:9092 \
  --topic first_topic \
  --from-beginning
And (after a few seconds) there’s our message. If you now go back to the producer window and send a few more messages, you will see them immediately appear in the consumer window.
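If you tried producing keyed messages earlier, the console consumer can also print the keys via the print.key property. A sketch against the same local broker, with " - " as my chosen display separator:

```shell
# Print each record as "key - value" instead of just the value
kafka-console-consumer --bootstrap-server localhost:9092 \
  --topic first_topic \
  --from-beginning \
  --property print.key=true \
  --property key.separator=" - "
```

Messages produced without a key will show up with the literal key "null".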
Congratulations, you’ve now managed to send messages to a Kafka topic called first_topic and to consume messages, both old and new, from the same topic using only CLI tools provided with Kafka.
You can now stop the producer and the consumer by pressing Ctrl+C.
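When you're done experimenting, you can also tear down the local cluster. Since we started it with docker-compose, the matching cleanup command, run from the single-node-kafka folder, is:

```shell
# Stop and remove the Kafka and Zookeeper containers and the network
docker-compose down
```

This removes the containers, so any messages stored in the topic are gone; re-running docker-compose up -d gives you a fresh cluster.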
I have created a Kafka mini-course that you can get absolutely free. Sign up for it over at the Coding Harbour.