Siddharth Venkatesh

Building a realtime performance monitoring system with Kafka and Go

Recently, I had a chance to try out Apache Kafka for a monitoring service, and I was pleasantly surprised by how easily you can set up a full-fledged event streaming system in a few lines of code. I quickly realised we could build powerful systems with Kafka at the centre of things: notification systems, distributed database synchronisation, and monitoring systems are some of the applications that come to mind when thinking of Kafka's use cases.
In my quest to understand Kafka a bit more deeply, I tried setting up a system monitoring application which collects system stats like CPU and RAM usage and visualises them in a dashboard.

The tools I went with for this are

  • Kafka as a message queue
  • Golang for a couple of microservices
  • React for the dashboard
  • Docker and Docker Compose for orchestration

Architecture

The architecture I'm going for looks like this:

*(image: architecture diagram)*

Services

| Service | Description |
| --- | --- |
| System stats | Golang service which captures CPU and RAM usage from the host machine |
| Broker | Kafka broker which facilitates event streaming |
| Producer | Golang service which receives usage data from the system stats service and pushes it into Kafka |
| Consumer | Golang service which subscribes to the Kafka broker and streams data to the dashboard |
| Dashboard | Visualises system stats in real time; gets data from the Consumer through a WebSocket; built using React |

TL;DR

If you understand the architecture and the purpose of each service above, you can find the implementation in the repository below.

Realtime system monitoring

If you do decide to stick around, we'll go over each of these services in detail.

Deep dive

Alright! Thanks for sticking around. We'll go over the key parts of each service, and finally we'll see how everything comes together with Docker Compose.

System Stats

The main purpose of this service is to collect system usage statistics from the host machine and send them to the producer service.
Here, I'm reading memory stats from the /proc/meminfo file, which contains information about current memory usage. This file is available on Linux-based systems.

We could have used a more specialised tool like Prometheus or cAdvisor to gather system stats, but that's not the main objective of this article.

We'd like to collect memory stats every second. So, we need a cron job which runs every second and reads memory info from the /proc/meminfo file. We'll write a function which reads, parses, and returns memory stats.

*(screenshot: read memory stats)*
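The original post shows this function as a screenshot, so here's a minimal sketch of what it could look like. The MemStats fields and JSON tags are my assumptions; /proc/meminfo reports its values in kilobytes.

```go
package main

import (
	"bufio"
	"os"
	"strconv"
	"strings"
)

// MemStats holds the fields we care about, in kilobytes.
type MemStats struct {
	MemTotal     int64 `json:"memTotal"`
	MemFree      int64 `json:"memFree"`
	MemAvailable int64 `json:"memAvailable"`
}

// ReadMemoryStats parses /proc/meminfo, where each line looks like
// "MemTotal:       16316460 kB".
func ReadMemoryStats() (MemStats, error) {
	var stats MemStats
	f, err := os.Open("/proc/meminfo")
	if err != nil {
		return stats, err
	}
	defer f.Close()

	scanner := bufio.NewScanner(f)
	for scanner.Scan() {
		fields := strings.Fields(scanner.Text())
		if len(fields) < 2 {
			continue
		}
		value, err := strconv.ParseInt(fields[1], 10, 64)
		if err != nil {
			continue
		}
		switch strings.TrimSuffix(fields[0], ":") {
		case "MemTotal":
			stats.MemTotal = value
		case "MemFree":
			stats.MemFree = value
		case "MemAvailable":
			stats.MemAvailable = value
		}
	}
	return stats, scanner.Err()
}
```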

Next, we need a function which takes the stats and sends them to the producer. This will be a simple HTTP POST call.

*(screenshot: push to producer)*
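Again, the original is a screenshot; here's a sketch. The producer URL (service name, port, and path) is made up for illustration; in practice it would come from configuration so docker-compose can wire the services together.

```go
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"net/http"
)

// producerURL is a placeholder; realistically it would come from an
// environment variable.
const producerURL = "http://producer:8080/stats"

// PushToProducer serialises the stats as JSON and POSTs them to the producer.
func PushToProducer(stats MemStats) error {
	body, err := json.Marshal(stats)
	if err != nil {
		return err
	}
	resp, err := http.Post(producerURL, "application/json", bytes.NewReader(body))
	if err != nil {
		return err
	}
	defer resp.Body.Close()
	if resp.StatusCode != http.StatusOK {
		return fmt.Errorf("producer returned status %d", resp.StatusCode)
	}
	return nil
}
```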

Finally, we need to put these two together in our cron job. Every time the job runs, these two functions need to be called in series. Our main function for this service looks like this.

*(screenshot: main_system_stats)*
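A sketch of the entrypoint. The post describes this as a cron job; a time.Ticker gives the same once-per-second behaviour, though the repo may use an actual cron library.

```go
package main

import (
	"log"
	"time"
)

func main() {
	// Fire once per second, read the stats, and forward them.
	ticker := time.NewTicker(time.Second)
	defer ticker.Stop()

	for range ticker.C {
		stats, err := ReadMemoryStats()
		if err != nil {
			log.Printf("reading /proc/meminfo: %v", err)
			continue
		}
		if err := PushToProducer(stats); err != nil {
			log.Printf("pushing to producer: %v", err)
		}
	}
}
```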

Awesome, now we have a service which collects memory info from our host machine every second and sends it to the producer.

Producer

The purpose of the producer service is to receive system stats and push them into the Kafka broker. To do this, we have an HTTP POST endpoint which gets the stats in the request body and writes them into a Kafka topic.

Our main function in this service is very simple. There is a single POST endpoint to receive stats. I'm using the Gin framework for routing here.

*(screenshot: producer_main)*
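A sketch of what that main function might look like. NewKafkaWriter and PushToKafka are shown in the next two snippets; the route path and port are assumptions.

```go
package main

import "github.com/gin-gonic/gin"

func main() {
	writer := NewKafkaWriter() // defined in the next snippet

	r := gin.Default()
	r.POST("/stats", PushToKafka(writer))
	r.Run(":8080")
}
```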

Before we write to Kafka, we need to set up our Kafka Writer. Let's do that.

*(screenshot: kafka writer)*
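The "Writer" naming suggests the segmentio/kafka-go client, so that's what this sketch assumes. The broker address and topic name are placeholders that match the compose service name used later.

```go
package main

import "github.com/segmentio/kafka-go"

// NewKafkaWriter returns a writer pointed at our single broker and topic.
func NewKafkaWriter() *kafka.Writer {
	return &kafka.Writer{
		Addr:     kafka.TCP("broker:9092"),
		Topic:    "system-stats",
		Balancer: &kafka.LeastBytes{},
	}
}
```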

Now, we can set up our PushToKafka function, which is the handler for our POST endpoint.

*(screenshot: push to kafka)*
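A sketch of the handler, continuing with kafka-go: it forwards the raw request body into the topic unchanged.

```go
package main

import (
	"context"
	"io"
	"net/http"

	"github.com/gin-gonic/gin"
	"github.com/segmentio/kafka-go"
)

// PushToKafka reads the POSTed stats and writes them to the topic as-is.
func PushToKafka(writer *kafka.Writer) gin.HandlerFunc {
	return func(c *gin.Context) {
		body, err := io.ReadAll(c.Request.Body)
		if err != nil {
			c.JSON(http.StatusBadRequest, gin.H{"error": err.Error()})
			return
		}
		err = writer.WriteMessages(context.Background(), kafka.Message{Value: body})
		if err != nil {
			c.JSON(http.StatusInternalServerError, gin.H{"error": err.Error()})
			return
		}
		c.Status(http.StatusOK)
	}
}
```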

Great! We have a Producer service which receives system stats via a POST endpoint and writes this data into our Kafka broker.

Consumer

The Consumer service does two things.

  • Subscribe to the Kafka topic to receive system stats data
  • Relay the system stats data to the Dashboard via a WebSocket

Before our service can listen to Kafka messages, we need to set up a Kafka Reader.

*(screenshot: consumer kafka reader)*
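Mirroring the writer, a reader sketch with kafka-go; the group ID is an assumption.

```go
package main

import "github.com/segmentio/kafka-go"

// NewKafkaReader subscribes to the same topic the producer writes to.
func NewKafkaReader() *kafka.Reader {
	return kafka.NewReader(kafka.ReaderConfig{
		Brokers: []string{"broker:9092"},
		Topic:   "system-stats",
		GroupID: "consumer-service",
	})
}
```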

Now our service can subscribe and listen to the data pushed into the Kafka topic by our Producer. We'll set up a Listen function which reads messages from Kafka.

*(screenshot: consumer listen)*
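A sketch of such a Listen function: it blocks on the reader and hands each payload to a callback.

```go
package main

import (
	"context"
	"log"

	"github.com/segmentio/kafka-go"
)

// Listen reads messages forever and invokes onMessage for each payload.
func Listen(reader *kafka.Reader, onMessage func([]byte)) {
	for {
		msg, err := reader.ReadMessage(context.Background())
		if err != nil {
			// Transient errors are just logged in this sketch.
			log.Printf("reading from kafka: %v", err)
			continue
		}
		onMessage(msg.Value)
	}
}
```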

Note that it takes in a callback function. This callback gets called whenever there is a message from Kafka. We'll use it to relay system stats to the Dashboard service.

The next step is to send system stats data to the Dashboard whenever we receive it from Kafka. But before we do this, we need an endpoint, specifically a socket endpoint, which our Dashboard service can connect to. We'll define this endpoint in our main function.

*(screenshot: socket endpoint)*
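A sketch of that main function: it starts the Kafka listener in a goroutine and exposes a socket endpoint. The /ws path and port are assumptions; ServeWS and SendMessage follow in the next snippets.

```go
package main

import (
	"log"
	"net/http"
)

func main() {
	reader := NewKafkaReader()
	go Listen(reader, SendMessage) // relay every Kafka message to the dashboard

	http.HandleFunc("/ws", ServeWS)
	log.Fatal(http.ListenAndServe(":8081", nil))
}
```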

And our upgrade handler looks like this:

*(screenshot: socket upgrader)*
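An "upgrader" strongly suggests gorilla/websocket, so this sketch assumes it. CheckOrigin is relaxed here because the dashboard is served from a different origin in docker-compose.

```go
package main

import (
	"log"
	"net/http"

	"github.com/gorilla/websocket"
)

var upgrader = websocket.Upgrader{
	// Allow cross-origin connections from the dashboard container.
	CheckOrigin: func(r *http.Request) bool { return true },
}

// ServeWS upgrades the HTTP connection and registers the socket.
func ServeWS(w http.ResponseWriter, r *http.Request) {
	conn, err := upgrader.Upgrade(w, r, nil)
	if err != nil {
		log.Printf("upgrade failed: %v", err)
		return
	}
	connections.Subscribe(conn)
}
```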

To establish a WebSocket connection, the HTTP connection between the service and the client has to be upgraded to use the WebSocket protocol.

The connections.Subscribe(conn) call keeps track of all the open socket connections.

The final step in our Consumer service is to relay messages from the Kafka queue to the Dashboard service. For this, we set up a function called SendMessage, which will be the callback for our Listen function.

*(screenshot: consumer main)*
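The post only names connections.Subscribe, so the shape of the connection registry below is my assumption. SendMessage fans each Kafka payload out to every connected dashboard socket and drops connections that have gone away.

```go
package main

import (
	"sync"

	"github.com/gorilla/websocket"
)

// ConnectionPool tracks every open dashboard socket.
type ConnectionPool struct {
	mu    sync.Mutex
	conns map[*websocket.Conn]bool
}

var connections = &ConnectionPool{conns: make(map[*websocket.Conn]bool)}

// Subscribe registers a newly upgraded connection.
func (p *ConnectionPool) Subscribe(conn *websocket.Conn) {
	p.mu.Lock()
	defer p.mu.Unlock()
	p.conns[conn] = true
}

// SendMessage is the Listen callback: it writes the payload to every
// socket and prunes dead connections.
func SendMessage(payload []byte) {
	connections.mu.Lock()
	defer connections.mu.Unlock()
	for conn := range connections.conns {
		if err := conn.WriteMessage(websocket.TextMessage, payload); err != nil {
			conn.Close()
			delete(connections.conns, conn)
		}
	}
}
```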

Awesome! Now we have a Consumer service which listens to messages in our Kafka queue and relays them to our Dashboard service through a WebSocket.

Dashboard

The Dashboard is a very simple service which connects to the Consumer service via a WebSocket and renders the system stats in a table UI.

Without going into much detail about how to set up a React application or what the markup and styling look like, the important part here is to create a socket connection to the Consumer service's socket endpoint and set up an onmessage callback on the socket.

*(screenshot: dashboard effect)*
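A sketch of that component in React; the WebSocket URL mirrors the consumer sketch above and is an assumption, as is the table markup.

```jsx
import { useEffect, useState } from "react";

function Dashboard() {
  const [stats, setStats] = useState(null);

  useEffect(() => {
    const socket = new WebSocket("ws://localhost:8081/ws");
    // Every Kafka message relayed by the consumer lands here.
    socket.onmessage = (event) => setStats(JSON.parse(event.data));
    return () => socket.close(); // clean up on unmount
  }, []);

  if (!stats) return <p>Waiting for data...</p>;
  return (
    <table>
      <tbody>
        {Object.entries(stats).map(([name, value]) => (
          <tr key={name}>
            <td>{name}</td>
            <td>{value}</td>
          </tr>
        ))}
      </tbody>
    </table>
  );
}

export default Dashboard;
```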

We have a React component with system stats as our state. We update this state whenever we receive data from the WebSocket.

Kafka Broker

The final piece to set up is the Kafka broker itself, which powers the whole message queue.

I'm using an example docker compose config from Kafka's GitHub:

Kafka-docker-compose

*(screenshot: kafka compose)*
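For reference, here's a trimmed-down single-broker sketch in the spirit of the linked example; image names and settings vary between examples, so see the linked repo for the real config.

```yaml
version: "3"
services:
  zookeeper:
    image: confluentinc/cp-zookeeper:latest
    environment:
      ZOOKEEPER_CLIENT_PORT: 2181

  broker:
    image: confluentinc/cp-kafka:latest
    depends_on:
      - zookeeper
    environment:
      KAFKA_BROKER_ID: 1
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
      # Other services on the compose network reach the broker at broker:9092.
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://broker:9092
      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1
```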

That's it! We're done with our setup.

Putting it all together

Now we can run all of the services using docker-compose:



```bash
docker-compose build
docker-compose up
```



Navigating to http://localhost:WEB_UI_PORT, we can see our dashboard getting updated every second with our system stats.

*(gif: dashboard output)*

Awesome! As you can see, our table gets updated with the latest system stats every second.

I haven't gone into detail about the build process for each service. Please refer to the Dockerfile and docker-compose.yml files for more information.

Conclusion

This is a barebones Kafka setup which does nothing more than relay messages on a single topic. I wrote this article to build a basic understanding of what Kafka is and what it could be used for. I hope folks reading this got something out of it.

Thanks for coming through, cheers!
