Lorna Jane Mitchell

Originally published at aiven.io

Kafka v. RabbitMQ - a comparison

Today's post is all about queues and how to choose a solution that fits your application's requirements. We'll go over the key design features of RabbitMQ and Kafka and outline how they process queues differently. Read on, and we'll help you decide which is a better match for you!

There are many roads that can lead to the moment you decide you need a queue. Queues are an excellent way to loosely couple many different components and allow them to exchange data without detailed knowledge of one another. A queue is also a good way to distribute work between multiple nodes performing asynchronous tasks.

Queues come in different flavours, and success is more likely when you use the queue that best fits the shape of your use case. There's some overlap between the use cases, but in general the choice can be summarised as one between a job queue and a message queue.

Processed and Forgotten: RabbitMQ

A job queue such as RabbitMQ is a good choice when work is being delegated to an asynchronous endpoint, such as a serverless function. The classic example is resizing an image. When a user uploads a new image, the application needs to produce a thumbnail or some custom sizes for that image, but the user shouldn't have to wait for that work to be completed before getting on with what they were doing. So we can put the request onto a queue and carry on with generic placeholder images until the resizing is complete.
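As a sketch of that pattern, here's roughly what putting a resize job onto a RabbitMQ queue could look like with the Python pika client. The queue name, message fields and connection details are illustrative assumptions, not anything prescribed by RabbitMQ itself.

```python
import json
import pika

# Connect to a local RabbitMQ broker (adjust host/credentials for your setup)
connection = pika.BlockingConnection(pika.ConnectionParameters(host="localhost"))
channel = connection.channel()

# Declare a durable queue so jobs survive a broker restart
channel.queue_declare(queue="image_resize", durable=True)

# Publish the resize job; the default exchange routes by queue name
job = {"image_id": 42, "sizes": ["thumbnail", "800x600"]}
channel.basic_publish(
    exchange="",
    routing_key="image_resize",
    body=json.dumps(job),
    properties=pika.BasicProperties(delivery_mode=2),  # persist the message
)

connection.close()
```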

RabbitMQ is a popular message broker and a good fit for those job-shaped application requirements. It's an open source tool, and here at Aiven we're big fans of open source tools. RabbitMQ supports multiple protocols, offers predefined exchange types, and has flexible, configurable routing. When you work with a job queue, the message broker transports each message to the place where it is processed. The job gets processed once (technically "at least once"), and then it is complete and removed from the queue.
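To illustrate the "processed once, then removed" behaviour, here's a hedged sketch of a worker consuming that same hypothetical queue with pika. Acknowledging the message only after the work has finished is what gives you the at-least-once guarantee: an unacknowledged message is redelivered rather than lost.

```python
import json
import pika

def handle_resize(ch, method, properties, body):
    job = json.loads(body)
    print(f"Resizing image {job['image_id']} to {job['sizes']}")
    # ... do the actual resizing work here ...
    # Acknowledge only after the work succeeds; until then the broker
    # keeps the message and will redeliver it if this worker dies
    ch.basic_ack(delivery_tag=method.delivery_tag)

connection = pika.BlockingConnection(pika.ConnectionParameters(host="localhost"))
channel = connection.channel()
channel.queue_declare(queue="image_resize", durable=True)

# Only hold one unacked job at a time so work spreads across workers
channel.basic_qos(prefetch_count=1)
channel.basic_consume(queue="image_resize", on_message_callback=handle_resize)
channel.start_consuming()
```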

Event-Driven Application: Apache Kafka

In contrast with the RabbitMQ model, a message queue can also act more as a data bus in architectural terms: a conduit for communicating events throughout the application. A message (called a "record" in our favourite message queue tool, Kafka) is put onto the bus, and then any interested consumers, now or in the future, can access and consume it. The message persists so that other consumers can also access the data, either at the time the data is added ("produced" in Kafka terminology) to the message bus, or later if we decide we want to revisit the data for additional analysis.
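For comparison, producing a record to a Kafka topic might look something like the following with the kafka-python client. The topic name and event payload are made up for the example; the point is that the record lands on the log rather than being handed to one specific worker.

```python
import json
from kafka import KafkaProducer

# Connect to the Kafka cluster (replace with your broker addresses)
producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

# Produce an event; any number of consumers can read it, now or later
event = {"user_id": 7, "action": "image_uploaded", "image_id": 42}
producer.send("user-events", value=event)
producer.flush()
```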

We commonly see Apache Kafka used in event-driven applications where data must flow between multiple components in the application. Using this distributed message bus model gives a great deal of scalability, and it's no coincidence that the roots of the open source Apache Kafka tool are in the software stack of LinkedIn, a company with a lot of data and many components consuming it. Kafka is a distributed log of past events, which means every past change remains available, so you can build features based on those events or simply have the peace of mind that the data will always be available for inspection or audit if needed.
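Because the records persist in the log, a brand new consumer can start from the earliest offset and replay everything that has already happened. Here's a rough sketch with kafka-python, again using the hypothetical topic and group name from the example above.

```python
import json
from kafka import KafkaConsumer

# A new consumer group starting from the earliest available record,
# so past events are replayed as well as new ones being read
consumer = KafkaConsumer(
    "user-events",
    bootstrap_servers="localhost:9092",
    group_id="audit-rebuild",
    auto_offset_reset="earliest",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
)

for record in consumer:
    print(record.offset, record.value)
```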

Kafka is quite approachable as a technology: you can either install it yourself or take up the free trial available on the Aiven platform to get started. The platform is ideal for getting to know the technology, with a friendly web interface to get you on the right track and a selection of connectors that can be added easily. It can also be scaled up to handle colossal workloads, and we see some very large clients achieving great performance on our platform.

More Resources

Thinking about the shape of your data requirements will ensure you pick a queue that works for you. If it's a task on a task list, then try RabbitMQ. But for data that needs to flow around your application and drive multiple integrations, Kafka is probably your best bet. If you'd like to know more, then some of these links may be useful:

Of course, we'd love you to try Kafka on the Aiven platform if you've read this far and think you have Kafka-shaped requirements! Sign up and let us know what you build.
