Ahmed Gulab Khan

9 Insanely Helpful Kafka Commands Every Backend Developer Must Know

In this article, I'm going to list the most popular Kafka CLI commands you should know as a developer. Before we begin, I'd recommend going over this article to get a brief understanding of what Kafka is and how it works.

If you want to set up Kafka locally, check out this article, which helps you set up both Kafka and Zookeeper locally in just 3 simple steps. For the commands below, I'm using the same Kafka setup described in that article, which means the Kafka version used here is 0.10.1.0, built for Scala 2.11.

Now, without further delay, let's go through the list of commands.

1. List all the Kafka topics

bin/kafka-topics.sh --list --zookeeper localhost:2181

2. Create a topic

Creates a Kafka topic named my-first-kafka-topic with both the number of partitions and the replication factor set to 1.

bin/kafka-topics.sh --create --topic my-first-kafka-topic --zookeeper localhost:2181 --partitions 1 --replication-factor 1

For simplicity, I have set both the number of partitions and the replication factor to 1 for this topic, but you can always play around with this configuration. If you don't know what partitions and replication factor mean in the Kafka context, I'd recommend going through this article to get a good understanding.
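
For example, something like the following should create a topic with 3 partitions (my-second-kafka-topic is just a placeholder name). Note that the replication factor still has to be 1 here, because the local setup from the linked article runs a single broker and the replication factor can't exceed the number of brokers:

bin/kafka-topics.sh --create --topic my-second-kafka-topic --zookeeper localhost:2181 --partitions 3 --replication-factor 1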

3. Describe a topic

Describes the given topic, showing its partition count, replication factor, and, for each partition, the leader, replicas, and in-sync replicas (ISR).

bin/kafka-topics.sh --describe --zookeeper localhost:2181 --topic my-first-kafka-topic
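
If you leave out the --topic argument, the same command should describe every topic on the cluster:

bin/kafka-topics.sh --describe --zookeeper localhost:2181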

4. Update topic configuration

Updates the configuration of the given topic (my-first-kafka-topic in this case). Here, we set cleanup.policy to compact, compression.type to gzip, and retention.ms to 3600000 (1 hour).

bin/kafka-configs.sh --zookeeper localhost:2181 --alter --entity-type topics --entity-name my-first-kafka-topic --add-config cleanup.policy=compact,compression.type=gzip,retention.ms=3600000
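
To remove an override added this way, kafka-configs.sh also has a --delete-config option; something like the following should drop the retention.ms override so the topic falls back to the broker default:

bin/kafka-configs.sh --zookeeper localhost:2181 --alter --entity-type topics --entity-name my-first-kafka-topic --delete-config retention.ms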

5. Delete a topic

Deletes the given topic. Note that the broker has to be running with delete.topic.enable=true (it defaults to false in this Kafka version); otherwise the topic is only marked for deletion and not actually removed.

bin/kafka-topics.sh --delete --zookeeper localhost:2181 --topic my-first-kafka-topic

6. Produce messages to a topic

Opens a prompt where you can type any message and hit enter to publish it to the mentioned topic.

bin/kafka-console-producer.sh --broker-list localhost:9092 --topic my-first-kafka-topic

You can keep typing messages, hitting Enter after each one to publish them sequentially. When you're done, exit with Ctrl+C.
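
The console producer can also publish keyed messages: assuming the parse.key and key.separator properties are supported by this version of the tool, a command like the one below lets you type each message as key:value, and the part before the separator is used as the message key:

bin/kafka-console-producer.sh --broker-list localhost:9092 --topic my-first-kafka-topic --property parse.key=true --property key.separator=: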

7. Consume messages from a topic

  • Consume messages from the mentioned topic
bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic my-first-kafka-topic

The above command only starts consuming messages from the instant it is executed; none of the earlier messages are consumed. When you do not specify a consumer group, Kafka automatically puts the consumer in a randomly named consumer group whose name starts with console-consumer-, and every time you run the command a new random consumer group is created.

  • Consume messages from the mentioned topic where the consumer belongs to the mentioned Consumer Group (my-first-consumer-group in this case)
bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic my-first-kafka-topic --consumer-property group.id=my-first-consumer-group

8. Consume messages from a topic from the beginning

From the previous point we see that, irrespective of whether our consumer belongs to a random consumer group or one we created ourselves, messages are only consumed from the instant the consumer starts, and no previous messages are consumed. The --from-beginning argument ensures that when a new consumer group is created, it starts consuming all the messages from the beginning of the topic.

  • Consume messages from the mentioned topic from the beginning
bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic my-first-kafka-topic --from-beginning

Note: When we run the above command, a random consumer group is created, our consumer joins it, and the messages are consumed from the beginning. If we run the command again, yet another random consumer group is created and all the messages are consumed from the beginning once more. If we want the consumer to read everything from the start only the first time it comes up (i.e. the first time its consumer group is created and the consumer joins it), we have to keep using the same consumer group every subsequent time we start the consumer. That way the consumer reads from the beginning only on its first run; when it is restarted, it resumes from the last committed offset rather than from the very beginning. So it is important to specify a consumer group: if you don't, a new random group is created every time the command is run and all the messages are consumed from the very beginning every single time.

  • Consume messages from the mentioned topic from the beginning, where the consumer belongs to the mentioned Consumer Group (my-first-consumer-group)
bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic my-first-kafka-topic --consumer-property group.id=my-first-consumer-group --from-beginning

9. List all the Consumer Groups

bin/kafka-consumer-groups.sh --list --bootstrap-server localhost:9092
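
Once you have the group names, the same tool can describe a single group; a command like the one below should show, for each partition, the group's current offset, the log end offset, and the lag (my-first-consumer-group is the group we used earlier):

bin/kafka-consumer-groups.sh --describe --bootstrap-server localhost:9092 --group my-first-consumer-group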

More Kafka articles that you can go through:

  1. Apache Kafka: A Basic Intro
  2. Kafka Partitions and Consumer Groups in 6 mins
  3. 3 Simple steps to set up Kafka locally using Docker

Follow for the next Kafka blog in the series. I'll also be posting more articles on software engineering concepts.
