
GaneshMani

Posted on • Updated on • Originally published at cloudnweb.dev

How Does Apache Kafka Work? Kafka Series - Part 1

To configure Apache Kafka on a local machine and run the server, see Kafka Series Part 2.

What is Kafka?

Kafka is a streaming platform that follows the pub/sub pattern for sending messages and monitoring events.

Use Cases of Kafka

Kafka is generally used for two broad classes of applications:

  • Building real-time streaming data pipelines that reliably get data between systems or applications
  • Building real-time streaming applications that transform or react to the streams of data

Kafka's Core APIs:

  • Producer API: allows an application to publish a stream of records to a particular topic. We will see what a topic is later in this article.
  • Consumer API: allows an application to subscribe to one or more topics and process the stream of records produced to them.
  • Streams API: allows an application to act as a stream processor, consuming an input stream from one or more topics and producing an output stream to one or more output topics.
  • Connector API: allows building and running reusable producers or consumers that connect Kafka topics to existing applications or data systems.

How Does Kafka Work?

In layman's terms, a producer pushes records to a Kafka broker, and a consumer pulls those records and takes action according to each message/event.
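To make the flow concrete, here is a toy in-memory model of that produce/consume cycle. The `ToyBroker`, `produce`, and `subscribe` names are purely illustrative, not the real Kafka client API; real Kafka consumers also pull records from the broker rather than receiving callbacks.

```python
from collections import defaultdict

class ToyBroker:
    """A toy stand-in for a Kafka broker: topics are append-only logs,
    and every subscriber to a topic sees every record pushed to it."""

    def __init__(self):
        self.topics = defaultdict(list)       # topic name -> list of records (the log)
        self.subscribers = defaultdict(list)  # topic name -> consumer callbacks

    def produce(self, topic, record):
        self.topics[topic].append(record)     # append the record to the topic's log
        for callback in self.subscribers[topic]:
            callback(record)                  # deliver to every subscribed consumer

    def subscribe(self, topic, callback):
        self.subscribers[topic].append(callback)

broker = ToyBroker()
received = []
broker.subscribe("orders", received.append)   # a "consumer" that collects records
broker.produce("orders", {"id": 1, "item": "book"})
```

Note that, unlike this toy, Kafka retains records in the log even after consumers read them, so a topic can serve many independent consumers at different positions.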

Kafka Architectural Overview

What are Topics and Partitions?

A topic in Kafka is a feed to which producers push records and from which consumers read them. Topics in Kafka are always multi-subscriber; that is, a topic can have zero, one, or many consumers that subscribe to the data written to it.

To make a topic fault tolerant and to handle high throughput, a topic is split into partitions. Partitions can reside on the same disk or on different disks.
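When a record has a key, Kafka's default partitioner hashes the key to pick a partition, so all records with the same key land in the same partition and stay ordered relative to each other. A rough sketch of that idea (Kafka actually uses murmur2 hashing; `md5` stands in here, and `partition_for` is a made-up name):

```python
import hashlib

def partition_for(key: str, num_partitions: int) -> int:
    """Toy partitioner: hash the record key to a stable partition number."""
    digest = hashlib.md5(key.encode("utf-8")).digest()
    # Take the first 4 bytes of the hash as an integer, then mod by partition count
    return int.from_bytes(digest[:4], "big") % num_partitions

# The same key always maps to the same partition, preserving per-key ordering.
p1 = partition_for("user-42", 3)
p2 = partition_for("user-42", 3)
```

Records with no key are instead spread across partitions (round-robin in older clients, sticky batching in newer ones), trading per-key ordering for balanced load.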

What is a Kafka Broker?

A Kafka cluster is made up of brokers. Each broker hosts topic log partitions. Connecting to one broker bootstraps a client to the entire Kafka cluster. For failover, you want to start with at least three to five brokers. A Kafka cluster can have 10, 100, or even 1,000 brokers if needed.
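That bootstrap behavior is why client configs only list a few brokers rather than the whole cluster. A sketch of the relevant client setting (host names are placeholders):

```properties
# The client contacts any of these brokers to discover the full cluster.
# List more than one so startup survives a single broker being down.
bootstrap.servers=broker1:9092,broker2:9092,broker3:9092
```

After the initial connection, the client fetches cluster metadata and talks directly to whichever brokers lead the partitions it needs.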

Kafka Broker Overview

To configure Apache Kafka on a local machine and run the server, see Kafka Series Part 2.

Top comments (4)

Paramanantham Harrison • Edited

Thank you for doing this series. You can add one more blog dedicated to docker, docker compose with kafka for easy local development setup. It will make life easier to install zookeeper, kafka etc using docker.

There is a functionality to add series with links for other parts in dev.to, check help section while writing.
Also add canonical_url to your blog link, else your blog content will be tagged as duplicates by Google bots.

GaneshMani

Sure bro. Thank you for the valuable comment.

Raghavan alias Saravanan Muthu

Hey Ganesh, your blogs looks very interesting and insightful. Thank you.

GaneshMani

Thank you.. Glad you like it. :-)