Victor Tatai

Make Kafka serverless by connecting it to Google Cloud Functions

So you want to do some serverless, but are constrained by your own infrastructure? Namely, your company shifts a ton of Kafka messages around, and it would be great if you could quickly whip out a function that just consumes a message and processes it, calling, say, an HTTP API.

Unfortunately, Google Cloud Functions does not natively support Kafka as a trigger; functions are (usually) triggered by HTTP requests or by Google Cloud Pub/Sub messages. Fortunately, though, there is a way to integrate Kafka with Pub/Sub so that your Kafka messages are forwarded to Pub/Sub, which then triggers your function.

Perhaps it would be easier in a picture?

Kafka → Connector → Pub/Sub → Function

IT plumbing, no wrenches required

In rough order, the tasks we need to do:

  1. Deploy Kafka
  2. Create a Cloud Function
  3. Build the Kafka Connector
  4. Configure and run the Connector
  5. Profit

Preamble: you need a GCP account with billing enabled. You know, as in, you are able to use GCP. I'm not responsible for any costs you incur though.

Deploy Kafka

Create a GCP project if you haven't got one already, and go to Compute Engine > VM Instances. Click on CREATE INSTANCE, then on Marketplace, search for Kafka (Google Click To Deploy), and smash that LAUNCH button. On the next page, leave all settings as default, and again click on Launch. Now pick your teeth while it deploys.
Once it is deployed, create a topic:

  1. Click on SSH
  2. Run /opt/kafka/bin/kafka-topics.sh --zookeeper localhost:2181 --create --topic hello-topic --partitions 3 --replication-factor 1
  3. (Optional) Run the following to lower retention to 10s (makes it easier to test): /opt/kafka/bin/kafka-topics.sh --zookeeper localhost:2181 --alter --topic hello-topic --config retention.ms=10000
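
To confirm the topic (and the optional retention override) took, you can describe it:

/opt/kafka/bin/kafka-topics.sh --zookeeper localhost:2181 --describe --topic hello-topic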

Leave the SSH shell open; you will need it later.

Create A Cloud Function

Now the fun part - go to the Cloud Functions console, and click on CREATE FUNCTION. Give the function an original name, let's say helloWorld, and select Cloud Pub/Sub as the trigger. In the topic selection, click on CREATE A TOPIC, name it hello_world, and create it.
For source code, use the inline editor, and select Go 1.11 (if you want Node, I won't judge, but won't help you either). Enter the function code:
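
A minimal function along the lines of GCP's standard Pub/Sub sample will do (the log format here is my own placeholder; the struct and the function signature are what the Pub/Sub trigger requires):

// Package p contains a background Cloud Function triggered by Pub/Sub.
package p

import (
	"context"
	"log"
)

// PubSubMessage mirrors the Pub/Sub event payload; the Data field
// carries the message body (the Kafka record value, in our case).
type PubSubMessage struct {
	Data []byte `json:"data"`
}

// HelloPubSub is invoked once per Pub/Sub message and just logs it.
func HelloPubSub(ctx context.Context, m PubSubMessage) error {
	log.Printf("Hello, %s!", string(m.Data))
	return nil
}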


Note the special struct for deserializing Pub/Sub.
In the Function to execute box, enter HelloPubSub, then click CREATE. Watch in awe as the magic happens.

Build The Kafka Connector

Follow the build instructions for the Pub/Sub Kafka Connector on GitHub, and you end up with cps-kafka-connector.jar.

For this simple example, we will just copy the Connector jar directly onto the Kafka server and run the connector manually (not really production quality, but it will get you going). To do that, go back to the SSH shell and upload the file cps-kafka-connector.jar.
Twiddle your thumbs while that uploads.
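
If the browser upload is too slow for your taste, gcloud can copy the jar instead (a sketch; kafka-1-vm and the zone are placeholders for your instance's actual name and zone):

gcloud compute scp cps-kafka-connector.jar kafka-1-vm:~ --zone us-central1-a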

Configure and Run the Connector

Ok so now we get to the tricky part, which is configuring the Connector. First, let's put the files in the right locations:

  1. In the SSH window, create a file named cps-sink-connector.properties with the connector configuration (a sketch of the properties follows this list)
  2. Copy the lib into Kafka: sudo cp cps-kafka-connector.jar /opt/kafka/libs/
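
A minimal cps-sink-connector.properties for the CloudPubSubSinkConnector looks roughly like this (swap in your own GCP project ID; the converters are one reasonable choice for the plain-text messages we will be sending):

name=CPSSinkConnector
connector.class=com.google.pubsub.kafka.sink.CloudPubSubSinkConnector
tasks.max=1
topics=hello-topic
cps.project=<YOUR GCP PROJECT ID>
cps.topic=hello_world
key.converter=org.apache.kafka.connect.storage.StringConverter
value.converter=org.apache.kafka.connect.converters.ByteArrayConverter

Note that topics is the Kafka topic we created earlier, while cps.topic is the Pub/Sub topic that triggers the function.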

In the next steps we configure IAM (as described in the Connector docs); a gcloud equivalent is sketched after the list:

  1. Go to the Service Accounts panel, and click on CREATE SERVICE ACCOUNT
  2. On the Role step, select Pub/Sub Admin
  3. On the last step, click CREATE KEY (JSON), and create the account
  4. Upload the JSON key you just created to the Kafka box (using the SSH console)
  5. Export the credentials in the shell by doing export GOOGLE_APPLICATION_CREDENTIALS=/home/<USER>/<CREDENTIALS FILE>.json

The environment variable above is the "standard" way of providing credentials to GCP client libraries.
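
If you prefer the CLI for the IAM steps above, a gcloud equivalent looks roughly like this (cps-connector and key.json are placeholder names; substitute your own project ID):

gcloud iam service-accounts create cps-connector --display-name "CPS Kafka Connector"
gcloud projects add-iam-policy-binding <PROJECT_ID> \
    --member "serviceAccount:cps-connector@<PROJECT_ID>.iam.gserviceaccount.com" \
    --role "roles/pubsub.admin"
gcloud iam service-accounts keys create key.json \
    --iam-account cps-connector@<PROJECT_ID>.iam.gserviceaccount.com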

The last step is to run the Connector in standalone mode:

sudo -E /opt/kafka/bin/connect-standalone.sh /opt/kafka/config/connect-standalone.properties ./cps-sink-connector.properties

(the -E option makes sudo preserve your environment, so the GOOGLE_APPLICATION_CREDENTIALS export above gets passed along)

If everything went according to plan, now you have a bunch of log messages in your window, and no exceptions (yay!).
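
As an extra sanity check, Kafka Connect exposes a small REST API (on port 8083 by default), so you can ask it whether the connector and its task are RUNNING (CPSSinkConnector matches the name in the properties sketch above):

curl localhost:8083/connectors
curl localhost:8083/connectors/CPSSinkConnector/status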

Profit

Now let's finally post a message to Kafka and see it trigger our function.
First, open a new SSH session to Kafka, and run /opt/kafka/bin/kafka-console-producer.sh --topic hello-topic --broker-list localhost:9092 (hello-topic is the Kafka topic we created earlier; hello_world is the Pub/Sub side). Once you get the > prompt, enter flying spaghetti monster.

Now open the GCP Logs viewer, select the Cloud Function resource, and you should see your flying spaghetti monster message logged by helloWorld.
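
If nothing shows up, a good first debugging step is to confirm the message actually landed in Kafka, by replaying the topic from yet another SSH session:

/opt/kafka/bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic hello-topic --from-beginning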

If everything worked out, it is time for a beer!🍺

If not 🤔, leave a message below; who knows, perhaps I'll be able to help!
