Abhishek Gupta for Microsoft Azure

Azure Event Hubs multi-protocol support

Azure Event Hubs is a fully managed Platform-as-a-Service (PaaS) data streaming platform and event ingestion service that can receive and process millions of events per second. Client applications can use any of the following protocols to interact with the service: AMQP, Kafka, and HTTPS.

This blog post will briefly cover multi-protocol support in Event Hubs and why it's important. The second part will present a simple yet practical example to demonstrate this in action.

Supported protocols

Let's go over these real quick!

AMQP

Advanced Message Queueing Protocol (AMQP) 1.0 is the primary protocol for Azure Event Hubs. It's an OASIS standard that defines a protocol for asynchronously, securely, and reliably transferring messages between systems. To work with Azure Event Hubs, you can use any of the language-specific client SDKs, which implement the AMQP protocol - these include .NET, Java, Python, Go, Node.js, and C.

These SDKs are open source on GitHub.

For a deep dive into how the AMQP protocol is used in Event Hubs, please refer to the documentation.
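For instance, here is a minimal sketch of sending an event with the Python SDK (`azure-eventhub`), which uses AMQP under the hood. The connection string and event hub name are placeholders you would substitute with your own:

```python
from azure.eventhub import EventHubProducerClient, EventData

# placeholders - substitute your own namespace connection string and event hub name
producer = EventHubProducerClient.from_connection_string(
    conn_str="<EVENT HUBS NAMESPACE CONNECTION STRING>",
    eventhub_name="<EVENT HUB NAME>",
)

with producer:
    # events are sent in batches; the SDK handles the AMQP connection details
    batch = producer.create_batch()
    batch.add(EventData('{"clickType": "button", "page": "/home"}'))
    producer.send_batch(batch)
```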

Apache Kafka

Event Hubs provides a Kafka-compatible endpoint that can be used by your existing Kafka-based applications as an alternative to running your own Kafka cluster. This eliminates the need to set up Kafka and Zookeeper clusters and handle ongoing infrastructure maintenance. Event Hubs and Kafka map to each other conceptually: a Kafka cluster corresponds to an Event Hubs namespace, a Kafka topic is equivalent to an event hub, and so on. For details, please refer to the documentation.

You can leverage existing Kafka clients in any programming language. I encourage you to check out the quickstarts and tutorials on GitHub, which demonstrate Event Hubs usage with Kafka clients in Java, Node.js, .NET, Python, Go, etc.
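To give you an idea of what this looks like, here is a minimal sketch of a Kafka producer pointed at an Event Hubs namespace, using the `confluent-kafka` Python client. The namespace, connection string, and topic (event hub) name are placeholders; the Kafka endpoint listens on port 9093 and authenticates with SASL PLAIN, where the username is literally `$ConnectionString` and the password is the namespace connection string:

```python
from confluent_kafka import Producer

# placeholders - substitute your own namespace and connection string
conf = {
    "bootstrap.servers": "<NAMESPACE>.servicebus.windows.net:9093",
    "security.protocol": "SASL_SSL",
    "sasl.mechanisms": "PLAIN",
    "sasl.username": "$ConnectionString",  # literal value, not a placeholder
    "sasl.password": "<EVENT HUBS NAMESPACE CONNECTION STRING>",
}

producer = Producer(conf)

# the Kafka topic name maps to the event hub name
producer.produce("clickstream", value=b'{"clickType": "button", "page": "/home"}')
producer.flush()
```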

HTTPS

Event Hubs also offers a REST API that can be leveraged by any HTTP client. The operations include sending events to an event hub as well as managing Event Hubs entities.
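For example, sending an event over HTTPS is a single POST to the event hub's `messages` endpoint, authorized with a SAS token. A minimal sketch using the Python `requests` library is shown below - the namespace, event hub name, and SAS token are placeholders (generating the SAS token itself is covered in the Event Hubs documentation):

```python
import requests

# placeholders - substitute your own values; see the docs for generating a SAS token
namespace = "<NAMESPACE>"
eventhub = "<EVENT HUB NAME>"
sas_token = "SharedAccessSignature sr=...&sig=...&se=...&skn=..."

url = f"https://{namespace}.servicebus.windows.net/{eventhub}/messages"

response = requests.post(
    url,
    headers={
        "Authorization": sas_token,
        "Content-Type": "application/atom+xml;type=entry;charset=utf-8",
    },
    data='{"clickType": "button", "page": "/home"}',
)

print(response.status_code)  # 201 indicates the event was accepted
```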

The value proposition

This multi-protocol capability provides a lot of flexibility, and you can combine protocols for specific use cases. Here are a few examples:

  • Write to the Kafka endpoint and build a serverless consumer with Azure Functions using trigger-based integration (this scenario will be covered in a subsequent blog post)
  • Write using HTTPS and read with Kafka - e.g. your front-end apps can send click-stream events using HTTPS, and you can process these using Kafka clients
  • Use the Event Hubs SDKs (AMQP-based) for sending events and read them with the Kafka protocol

Since you can mix and match protocols, how do applications at the producing and consuming ends make sense of the messages being sent and received? Some great examples and best practices have been covered here, but here is the TL;DR: all the protocol clients (Kafka, AMQP, HTTPS) treat message payloads as raw bytes. Those bytes are sent over the wire using a specific protocol, and the same bytes are received on the consumer side even if it uses a different protocol. If the producing or consuming application works with a specific format - e.g. JSON sent over HTTP, or a Java POJO in a Java-based Kafka consumer - it is up to the client application to convert to/from those bytes and the format it wants to work with.
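To make this concrete, here is a minimal sketch of the HTTPS-in, Kafka-out scenario from the list above: a `confluent-kafka` consumer (with the same placeholder namespace and connection string as before) reads the click-stream events that were posted as JSON over HTTPS and converts the raw bytes back into a Python dictionary:

```python
import json
from confluent_kafka import Consumer

# placeholders - substitute your own namespace and connection string
conf = {
    "bootstrap.servers": "<NAMESPACE>.servicebus.windows.net:9093",
    "security.protocol": "SASL_SSL",
    "sasl.mechanisms": "PLAIN",
    "sasl.username": "$ConnectionString",
    "sasl.password": "<EVENT HUBS NAMESPACE CONNECTION STRING>",
    "group.id": "clickstream-consumers",
    "auto.offset.reset": "earliest",
}

consumer = Consumer(conf)
consumer.subscribe(["clickstream"])  # the event hub name, seen as a Kafka topic

msg = consumer.poll(timeout=10.0)
if msg is not None and msg.error() is None:
    # the payload arrives as raw bytes regardless of the protocol used to produce it;
    # the application decides how to interpret them - here, as UTF-8 encoded JSON
    event = json.loads(msg.value().decode("utf-8"))
    print(event)

consumer.close()
```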

As promised, the next blog post will cover Serverless processing with Event Hubs and Azure Functions. Stay tuned!
