Event-Driven App with Dapr & Azure Event Hubs

Sivamuthu Kumar · 6 min read

In this blog post, I will walk through how to integrate Dapr (the Distributed Application Runtime) with Azure Event Hubs on Kubernetes.

What is Dapr?

Dapr is a portable, event-driven runtime that makes it easy for developers to build resilient, stateless and stateful microservice applications that run on the cloud and edge.

Dapr consists of a set of building blocks, accessed over standard HTTP or gRPC APIs, that can be called from any programming language. The building blocks provide the capabilities needed for building microservice applications: Service Invocation, State Management, Pub/Sub, event-driven resource Bindings, Virtual Actors, and Distributed Tracing.
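Because every building block is exposed through the sidecar's HTTP API, calling one is just building a URL. A minimal sketch of the URL conventions, assuming the default sidecar HTTP port 3500 (the app ID and store name below are illustrative):

```javascript
// Dapr sidecar HTTP API base (3500 is the default sidecar HTTP port;
// adjust if your sidecar is configured differently).
const base = 'http://localhost:3500/v1.0';

// Service invocation: call a method on another Dapr-enabled app by app ID.
const invokeUrl = (appId, method) => `${base}/invoke/${appId}/method/${method}`;

// State management: address a key in a configured state store.
const stateUrl = (store, key) => `${base}/state/${store}/${key}`;

console.log(invokeUrl('eventhub-consumer', 'receive-event'));
// http://localhost:3500/v1.0/invoke/eventhub-consumer/method/receive-event
```

Because the surface is plain HTTP, the same endpoints work from Java, Go, Python, or even curl.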

GitHub: dapr/dapr - Dapr is a portable, event-driven runtime for building distributed applications across cloud and edge.

Key features of Dapr and its building blocks

  • Pluggable component model
  • Standard APIs over HTTP and gRPC
  • Portable across supported infrastructures
  • Extensibility
  • Ease of integration
  • Runs anywhere, as a process or in a Docker container
  • Dynamically choose between different implementations

Dapr provides a dedicated CLI for easy debugging, and clients for Java, .NET, Go, JavaScript, Python, Rust, and C++.

Binding

In event-driven applications, your app can handle events from external systems or trigger events in external systems. Input and output bindings let you receive and send those events through the Dapr runtime abstraction.

Bindings remove the complexity of integrating with external systems such as queues and message buses, and help developers focus on business logic. You can switch between bindings at runtime and keep the code free of vendor SDKs and libraries, making applications portable.
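For the output direction, an app asks its sidecar to invoke a binding with a plain HTTP POST. A sketch of building that request, assuming the default sidecar port 3500 and a hypothetical binding component named send-event:

```javascript
// Build the HTTP request an app would send to its Dapr sidecar to invoke
// an output binding. The sidecar forwards the payload to whatever system
// the binding component points at (Event Hubs, Kafka, a queue, ...).
function buildBindingRequest(bindingName, data) {
  return {
    method: 'POST',
    url: `http://localhost:3500/v1.0/bindings/${bindingName}`,
    // "operation" tells the binding what to do; "create" sends the data.
    body: JSON.stringify({ operation: 'create', data }),
  };
}

const req = buildBindingRequest('send-event', { msg: 'Order received' });
console.log(req.url); // http://localhost:3500/v1.0/bindings/send-event
```

Swapping the binding component (say, from Event Hubs to Kafka) changes nothing in this code; only the component YAML changes.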

Prerequisites

  • A Kubernetes cluster - minikube or a managed cluster such as AKS
  • Helm 3 and kubectl
  • An Azure subscription
  • Node.js

Getting started with Dapr on Kubernetes

Install Dapr on the Kubernetes cluster using a Helm chart. You can install it on minikube or a managed Kubernetes cluster such as AKS.

  • Make sure Helm 3 is installed on your machine.

  • Add the daprio Azure Container Registry as a Helm repo.

   $ helm repo add dapr https://daprio.azurecr.io/helm/v1/repo
   $ helm repo update
  • Create dapr-system namespace on your kubernetes cluster.
   $ kubectl create namespace dapr-system
  • Install the Dapr chart on your cluster in the dapr-system namespace
   $ helm install dapr dapr/dapr --namespace dapr-system
   $ kubectl get pods -n dapr-system -w   # verify the pods are created

Setup Kafka Enabled Azure Event Hubs

The first step is to create the Azure Event Hubs namespace and event hub in the Azure Portal and get the Event Hub connection string.

After you complete the steps in the Microsoft Docs quickstart, you will have:

  1. Event Hub Connection String
  2. Event Hub Namespace Name

You need these to configure the input bindings.

Demo

In this demo, we are going to create an application that pushes events to Azure Event Hubs and a microservice that receives those events through a Dapr input binding.

The event hub consumer microservice is deployed into the Kubernetes cluster. The Dapr input binding - the Azure Event Hubs spec - triggers the event hub consumer's HTTP endpoint whenever an event arrives from Event Hubs.

To demonstrate portability, we will then swap the Azure Event Hubs input binding spec for a Kafka input binding spec.

GitHub: ksivamuthu/dapr-demo - Dapr demo for a blog & talk.

git clone https://github.com/ksivamuthu/dapr-demo.git

Event Hub Producer

The Event Hub producer application, eventhub-producer, uses the Azure Event Hubs Node.js SDK (@azure/event-hubs) to connect to Azure Event Hubs.

Configure the connection string and event hub name in a .env file or in your environment variables so the Node.js process can load them.

EVENTHUB_NAME=<EVENTHUB_NAME>                           # replace
EVENTHUB_CONNECTION_STRING=<EVENTHUB_CONNECTION_STRING> # replace

Then create an EventHubProducerClient and send individual events to Azure Event Hubs, with a two-second delay between events for demo purposes.

const { EventHubProducerClient } = require('@azure/event-hubs');

const delay = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

// connectionString and eventHubName are loaded from the environment (see above);
// this runs inside an async main function.
const producer = new EventHubProducerClient(connectionString, eventHubName);

const events = [
    'Order received',
    'Being prepared',
    'On the way',
    'Delivered'
];

for (const event of events) {
    console.log(`Sending event: ${event}`);
    const batch = await producer.createBatch();
    batch.tryAdd({ body: { msg: event } });
    await producer.sendBatch(batch);
    await delay(2000); // pace the demo: one event every two seconds
}

await producer.close();

Event Hub Consumer

Event Hub Consumer is the microservice application that will be deployed in the Kubernetes cluster.

It's a simple Express application with a POST endpoint that logs the data it receives.

const express = require('express')
const app = express()
app.use(express.json())

const port = 3000

app.post('/receive-event', (req, res) => {
    console.log(req.body)
    res.status(200).send()
})

app.listen(port, () => console.log(`Kafka consumer app listening on port ${port}!`))

Deploy the app in Kubernetes

To build the container image for the eventhub-consumer application:

cd bindings/eventhub-consumer
docker build -t eventhub-consumer:latest .

Deploy the app using a YAML file that contains the Deployment and Service objects.

kubectl apply -f bindings/yamls/eventhub-consumer.yaml

In the deployment YAML file, note the annotations the Dapr runtime requires to know which app ID and port the input binding should call.

---
apiVersion: apps/v1
kind: Deployment
metadata:
  name: eventhub-consumer
  labels:
    app: eventhub-consumer
spec:
  replicas: 1
  selector:
    matchLabels:
      app: eventhub-consumer
  template:
    metadata:
      labels:
        app: eventhub-consumer
      annotations:
        dapr.io/enabled: "true"
        dapr.io/id: "eventhub-consumer"
        dapr.io/port: "3000"
    spec:
      containers:
      - name: node
        image: eventhub-consumer:latest
        ports:
        - containerPort: 3000
        imagePullPolicy: IfNotPresent

Azure Event Hub Input Binding

Now it's time to integrate the producer and consumer through the Dapr runtime. Several input binding specs are available; in this demo, we create the input binding with the Azure Event Hubs spec.

Update the connection string in the YAML below with your Azure Event Hubs connection string. The metadata.name is the POST endpoint route of the eventhub-consumer application that Dapr will call.

Apply this YAML in your kubernetes cluster.

apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
  name: receive-event
spec:
  type: bindings.azure.eventhubs
  metadata:
  - name: connectionString 
    value: CONNECTION_STRING # replace

Run the eventhub-producer application to send the events. In another terminal, watch the logs of the eventhub-consumer pod to see the received events.

cd eventhub-producer
node index.js

# and in another tab
kubectl logs <pod_name> -c <container_name> -f

You should see output like the below.

// eventhub-producer
Sending event: Order received
Sending event: Being prepared
Sending event: On the way
Sending event: Delivered
// eventhub-consumer kubernetes logs
Kafka consumer app listening on port 3000!
{ msg: 'Order received' }
{ msg: 'Being prepared' }
{ msg: 'On the way' }
{ msg: 'Delivered' }

Kafka Input Binding

This shows how Dapr abstracts the binding at runtime: the eventhub-consumer knows nothing about vendor libraries and is reached only through HTTP/gRPC endpoints.

Now, we are going to switch the input binding from Azure Event Hubs to Kafka. For brevity, the same Azure Event Hub is used, but this time through the Kafka API of Azure Event Hubs.

So what will change? Only the input binding.

Delete the Azure Event Hubs input binding spec from Kubernetes.

kubectl delete -f yamls/eventhub-input-binding.yaml

Create the Kafka input binding spec in Kubernetes. Use the YAML below to configure the Kafka binding. With the Kafka API of Event Hubs, the broker is the namespace's Kafka endpoint and each event hub maps to a Kafka topic:

BROKER_URL=<EVENTHUB_NAMESPACE>.servicebus.windows.net:9093
TOPIC=<EVENTHUB_NAME>
CONNECTION_STRING=<EVENTHUB_CONNECTION_STRING>

apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
  name: receive-event
spec:
  type: bindings.kafka
  metadata:
    - name: brokers
      value: BROKER_URL # replace
    - name: topics
      value: TOPIC # replace with your event hub name
    - name: consumerGroup
      value: "$Default"
    - name: authRequired
      value: "true"
    - name: saslUsername
      value: "$ConnectionString" # literal value required by Event Hubs
    - name: saslPassword
      value: CONNECTION_STRING # replace
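All three Kafka values can be derived from the Event Hubs connection string. A small helper, assuming the standard connection-string format (Endpoint=sb://<namespace>.servicebus.windows.net/;SharedAccessKeyName=...;SharedAccessKey=...) - the namespace name here is illustrative:

```javascript
// Derive the Kafka binding settings from an Event Hubs connection string.
function kafkaSettingsFromConnectionString(connectionString) {
  const endpoint = connectionString
    .split(';')
    .find(part => part.startsWith('Endpoint='));
  if (!endpoint) throw new Error('connection string has no Endpoint part');
  // Endpoint=sb://mynamespace.servicebus.windows.net/ -> bare hostname
  const host = endpoint.replace('Endpoint=sb://', '').replace(/\/$/, '');
  return {
    brokers: `${host}:9093`,           // Event Hubs' Kafka endpoint
    saslUsername: '$ConnectionString', // literal string expected by Event Hubs
    saslPassword: connectionString,    // the full connection string
  };
}

const cs = 'Endpoint=sb://mynamespace.servicebus.windows.net/;' +
  'SharedAccessKeyName=RootManageSharedAccessKey;SharedAccessKey=abc123=';
console.log(kafkaSettingsFromConnectionString(cs).brokers);
// mynamespace.servicebus.windows.net:9093
```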

Run the eventhub-producer application again; the events now flow through the Kafka input binding spec. You should see output like the below.

// eventhub-producer
Sending event: Order received
Sending event: Being prepared
Sending event: On the way
Sending event: Delivered
// eventhub-consumer kubernetes logs
Kafka consumer app listening on port 3000!
{ msg: 'Order received' }
{ msg: 'Being prepared' }
{ msg: 'On the way' }
{ msg: 'Delivered' }

Conclusion

In this post, you saw how to use Dapr bindings to connect to Azure Event Hubs, and how binding specs can be swapped without changing the application code.

Stay tuned for more posts on Dapr and Kubernetes. Like, follow, and comment, or DM me on Twitter at @ksivamuthu - your feedback is most welcome.
