
Abhishek Gupta for Microsoft Azure


Tutorial: Integrate Dapr and Azure Event Hubs using Kafka bindings

A previous blog post demonstrated how to use Azure Event Hubs as a resource binding in Dapr. Resource bindings provide a way to trigger an application with events from external systems (input binding) or to invoke an external system, optionally with a data payload (output binding). These "external systems" could be anything: a queue, a messaging pipeline, a cloud service, a file system, etc. In this blog post, we will walk through an example of integrating Kafka into your application by using it as a Dapr binding.

Dapr (Distributed Application Runtime) is an open-source, portable runtime that helps developers build resilient, stateless and stateful microservice applications. If you don't know about Dapr yet, I would recommend checking out the GitHub repo and going through the "Getting Started" guide. You can also read some of my previous blog posts on the topic.

Azure Event Hubs also provides Apache Kafka support by exposing a Kafka-compatible endpoint that your existing Kafka-based applications can use as an alternative to running your own Kafka cluster. SASL auth for Kafka bindings was added in Dapr release 0.2.0, making it possible to use Azure Event Hubs via Dapr's Kafka binding support. That is what this blog post demonstrates.

Before we proceed further, let's set up what we need first.

Pre-requisites

  • Dapr CLI and runtime components
  • Azure Event Hubs

Dapr

Please go through the Dapr getting started guide for instructions on how to install the Dapr CLI.

e.g., for macOS (installs the Dapr CLI to /usr/local/bin):

curl -fsSL https://raw.githubusercontent.com/dapr/cli/master/install/install.sh | /bin/bash 

Once you have the CLI, you can use dapr init to set up Dapr locally, or dapr init --kubernetes to install it on a Kubernetes cluster.
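
Both variants, for quick reference:

# initialize Dapr in standalone (local) mode
dapr init

# or install Dapr on the Kubernetes cluster in your current kubectl context
dapr init --kubernetes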

Set up Azure Event Hubs

If you don't already have a Microsoft Azure account, go ahead and sign up for a free one! Once you're done, you can quickly set up Azure Event Hubs using one of the quickstarts (for example, via the Azure portal or the Azure CLI).

You should now have an Event Hubs namespace along with an Event Hub (topic) in it. As a final step, you need to get the connection string in order to authenticate to Event Hubs - use this guide to finish this step.
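
If you prefer the command line, a command along these lines should fetch the connection string for the default namespace-level authorization rule (the resource group and namespace names below are placeholders for your own):

az eventhubs namespace authorization-rule keys list \
    --resource-group <resource-group> \
    --namespace-name <eventhubs-namespace> \
    --name RootManageSharedAccessKey \
    --query primaryConnectionString -o tsv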

Overview

The sample application consists of:

  • A producer app that sends events to Azure Event Hubs. This is a standalone Go app which uses the Sarama client to talk to the Azure Event Hubs Kafka endpoint.
  • A consumer app which consumes from the Kafka topic and prints out the data. This app is run using Dapr.

Run the consumer application with Dapr

Start by cloning the repo

git clone https://github.com/abhirockzz/dapr-kafka-eventhubs-bindings

Here is the binding component YAML:

apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
  name: timebound
spec:
  type: bindings.kafka
  metadata:
    - name: brokers
      value: [replace]
    - name: topics
      value: [replace]
    - name: consumerGroup
      value: $Default
    - name: authRequired
      value: "true"
    - name: saslUsername
      value: $ConnectionString
    - name: saslPassword
      value: [replace]

Update components/eventhubs_binding.yaml with your Azure Event Hubs details:

  • brokers - replace this with the Azure Event Hubs endpoint, e.g. foobar.servicebus.windows.net:9093, where foobar is the Event Hubs namespace
  • saslPassword - replace this with the Event Hubs connection string - use this guide (as mentioned before)
  • consumerGroup - you can keep $Default as the value, or create a new consumer group in Azure Event Hubs (using the Azure CLI or the portal - see the sketch after this list) and use that
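
For example, creating a new consumer group with the Azure CLI looks roughly like this (all resource names are placeholders):

az eventhubs eventhub consumer-group create \
    --resource-group <resource-group> \
    --namespace-name <eventhubs-namespace> \
    --eventhub-name <event-hub-topic> \
    --name <new-consumer-group>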

Start the Go consumer app, which uses the Azure Event Hubs input binding:

cd app
export APP_PORT=9090
dapr run --app-port $APP_PORT go run consumer.go

You should see logs similar to this:

ℹī¸  Starting Dapr with id Bugarrow-Walker. HTTP Port: 52089. gRPC Port: 52090
✅  You're up and running! Both Dapr and your app logs will appear here.

== DAPR == time="2020-01-14T19:35:09+05:30" level=info msg="starting Dapr Runtime -- version 0.3.0 -- commit v0.3.0-rc.0-1-gfe6c306-dirty"
== DAPR == time="2020-01-14T19:35:09+05:30" level=info msg="log level set to: info"
== DAPR == time="2020-01-14T19:35:09+05:30" level=info msg="standalone mode configured"
== DAPR == time="2020-01-14T19:35:09+05:30" level=info msg="dapr id: Bugarrow-Walker"
== DAPR == time="2020-01-14T19:35:09+05:30" level=info msg="loaded component messagebus (pubsub.redis)"
== DAPR == time="2020-01-14T19:35:09+05:30" level=info msg="loaded component statestore (state.redis)"
== DAPR == time="2020-01-14T19:35:09+05:30" level=info msg="loaded component eventhubs-input (bindings.kafka)"
== DAPR == time="2020-01-14T19:35:09+05:30" level=info msg="application protocol: http. waiting on port 9090"
== DAPR == time="2020-01-14T19:35:10+05:30" level=info msg="application discovered on port 9090"

Run the Azure Event Hubs producer application

Set the required environment variables:

export EVENTHUBS_CONNECTION_STRING="[replace with connection string]"
export EVENTHUBS_BROKER=[replace with broker endpoint]
export EVENTHUBS_TOPIC=[replace with topic name]
export EVENTHUBS_USERNAME="\$ConnectionString"

You don't need to modify EVENTHUBS_USERNAME: the Event Hubs Kafka endpoint expects the literal string $ConnectionString as the SASL username.
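
For context, here is a minimal sketch of how a Sarama-based producer can authenticate to the Event Hubs Kafka endpoint using these environment variables. It illustrates the SASL PLAIN over TLS setup in general terms and is not necessarily the exact code in the repo:

package main

import (
    "crypto/tls"
    "log"
    "os"
    "strings"

    "github.com/Shopify/sarama"
)

func main() {
    config := sarama.NewConfig()
    config.Version = sarama.V1_0_0_0          // Event Hubs Kafka endpoint speaks Kafka protocol 1.0+
    config.Producer.Return.Successes = true   // required by the sync producer

    // SASL PLAIN over TLS: username is the literal "$ConnectionString",
    // password is the Event Hubs connection string
    config.Net.SASL.Enable = true
    config.Net.SASL.Mechanism = sarama.SASLTypePlaintext
    config.Net.SASL.User = os.Getenv("EVENTHUBS_USERNAME")
    config.Net.SASL.Password = os.Getenv("EVENTHUBS_CONNECTION_STRING")
    config.Net.TLS.Enable = true
    config.Net.TLS.Config = &tls.Config{}

    brokers := strings.Split(os.Getenv("EVENTHUBS_BROKER"), ",")
    producer, err := sarama.NewSyncProducer(brokers, config)
    if err != nil {
        log.Fatal("failed to create producer: ", err)
    }
    defer producer.Close()

    // send a single JSON message of the same shape the sample producer uses
    partition, offset, err := producer.SendMessage(&sarama.ProducerMessage{
        Topic: os.Getenv("EVENTHUBS_TOPIC"),
        Value: sarama.StringEncoder(`{"time":"Tue Jan 14 19:41:53 2020"}`),
    })
    if err != nil {
        log.Fatal("failed to send message: ", err)
    }
    log.Printf("sent message to partition %d offset %d", partition, offset)
}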

Run the producer app - it will keep sending messages to the specified Event Hubs topic until it's stopped (press ctrl+c to stop it):

cd producer
go run producer.go

If everything goes well, you should see logs like these from the producer app:

Event Hubs broker [foobar.servicebus.windows.net:9093]
Event Hubs topic test
Waiting for ctrl+c
sent message {"time":"Tue Jan 14 19:41:53 2020"} to partition 3 offset 523
sent message {"time":"Tue Jan 14 19:41:56 2020"} to partition 0 offset 527
sent message {"time":"Tue Jan 14 19:41:59 2020"} to partition 4 offset 456
sent message {"time":"Tue Jan 14 19:42:02 2020"} to partition 2 offset 486
sent message {"time":"Tue Jan 14 19:42:06 2020"} to partition 0 offset 528

Confirm

Check the Dapr application logs; you should see the messages received from Event Hubs:

== APP == data from Event Hubs '{Tue Jan 14 19:35:21 2020}'
== APP == data from Event Hubs '{Tue Jan 14 19:41:53 2020}'
== APP == data from Event Hubs '{Tue Jan 14 19:41:56 2020}'
== APP == data from Event Hubs '{Tue Jan 14 19:41:59 2020}'
== APP == data from Event Hubs '{Tue Jan 14 19:42:02 2020}'
== APP == data from Event Hubs '{Tue Jan 14 19:42:06 2020}'

Behind the scenes...

Here is a summary of how it works:

The input binding defines the connection parameters for the Kafka cluster to connect to. In addition to those parameters, the metadata.name attribute is important.

The consumer app exposes a REST endpoint at /timebound - this is the same as the name of the input binding component (not a coincidence!):

package main

import (
    "encoding/json"
    "fmt"
    "net/http"
    "os"
)

// TheTime maps the JSON payload sent by the producer, e.g. {"time":"..."}
type TheTime struct {
    Time string `json:"time"`
}

func main() {
    port := os.Getenv("APP_PORT")
    // Dapr POSTs each event consumed from Event Hubs to this endpoint (named after the binding component)
    http.HandleFunc("/timebound", func(rw http.ResponseWriter, req *http.Request) {
        var _time TheTime
        err := json.NewDecoder(req.Body).Decode(&_time)
        if err != nil {
            fmt.Println("error reading message from event hub binding", err)
            rw.WriteHeader(500)
            return
        }
        fmt.Printf("data from Event Hubs '%s'\n", _time)
        rw.WriteHeader(200)
    })
    http.ListenAndServe(":"+port, nil)
}

The Dapr runtime does the heavy lifting of consuming from Event Hubs and making sure that it invokes the Go application with a POST request at the /timebound endpoint, passing along the event payload. The app logic is then executed, which in this case simply logs the data to standard output.
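
If you want to sanity-check the handler without going through Event Hubs, you can simulate what Dapr does by POSTing a payload to the app endpoint directly (assuming the consumer app is listening on port 9090, as in this example):

curl -X POST http://localhost:9090/timebound \
    -H "Content-Type: application/json" \
    -d '{"time":"Tue Jan 14 19:42:06 2020"}'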

Summary

In this blog post, you saw how to use Dapr bindings to integrate your Kafka-based applications with Azure Event Hubs.

At the time of writing, Dapr is in alpha state (v0.3.0) and gladly accepting community contributions 😃 Visit https://github.com/dapr/dapr to dive in!

If you found this article helpful, please like and follow 🙌 Happy to get feedback via Twitter or just drop a comment.
