In my last post on responding to Calls for Papers, I outlined a few things to consider and some questions to ask yourself to get started with writing an abstract. While it’s easy for me to say “Consider X…” or “Ask yourself Y…”, I realize that it might still feel a bit fuzzy. To help with that, I want to take that hand-wavy, intangible process and bring it back down to earth with some concrete examples.
Abstracts that do it well
Let’s make it real by taking a look at a few abstracts from Kafka Summit 2023 and the upcoming Current 2023 that I and other Program Committee members thought were particularly good… and, more importantly, why they were good.
The basics
To give you a good starting point, here’s a solid, no-frills abstract on the subject of event-driven architectures.
4 Patterns to Jumpstart your Event-Driven Architecture Journey
The shift from monolithic applications to microservices is anything but easy. Since services usually don't operate in isolation, it's vital to implement proper communication models among them. A crucial aspect in this regard is to avoid tight coupling and numerous point-to-point connections between any two services. One effective approach is to build upon messaging infrastructure as a decoupling element and employ an event-driven application architecture.
During this session, we explore selected event-driven architecture patterns commonly found in the field: the claim-check pattern, the content enricher pattern, the message translator pattern, and the outbox pattern. For each of the four patterns, we look into a live demo scenario based on Apache Kafka and discuss some variations and trade-offs regarding the chosen implementation.
You will walk away with a solid understanding of how the discussed event-driven architecture patterns help you with building robust and decoupled service-to-service communication and how to apply them in your next Apache Kafka-based project.
What it does well
The author uses the opening sentence to connect with the audience over something that most folks can agree on: transitioning from monoliths to microservices is difficult. What’s even better is that connecting with people who have gone through that transition doesn’t alienate those who haven’t undergone it just yet. In my opinion, it’s a solid way to open the abstract.
After introducing the subject of the session, the abstract gives more detail on the event-driven patterns that the talk will cover. The author then goes into specifics about the technical demo that will be part of the session and the goals of that demo.
Finally, he summarizes the takeaways that the audience should expect—a must IMO.
Thanks to Hans-Peter Grahsl, Developer Advocate at Red Hat, for agreeing to include his abstract in this blog. (Check out the lineup for Current 2023 and consider attending to see this talk delivered live.)
Non-standard subject matter... or how to make serious content fun
Next, we’ll mix things up a bit with an abstract covering a less-standard technical subject. Even so, there are some similarities between this and the previous example, and you might start to see a pattern emerge.
🎶🎵Bo-stream-ian Rhapsody: A Musical Demo of Kafka Connect and Kafka Streams 🎵🎶
You’ve heard of Apache Kafka. You know that real-time event streaming can be a powerful tool to power your project, product, or even company. But beyond storing and relaying messages, what can Kafka do?
In this talk, get an overview of two key components of the Kafka ecosystem beyond just brokers and clients: Kafka Connect, a distributed ingest/export framework, and Kafka Streams, a distributed stream processing library. Learn about the APIs available for developing and deploying a custom source and sink connector, and for bringing up a Streams application to manipulate the data in between them. Through a musical demonstration involving Kafka Connect and Kafka Streams, audio will be recorded, distorted, analyzed, and played back–live and in real time.
Audience members should expect to come away with a good understanding of how to develop Kafka Connect connectors and Kafka Streams applications, as well as some basics of digital signal processing.
What it does well
Before we even get into the abstract, we’re hit with a fun, attention-grabbing title. Who wouldn’t want to attend a talk with a live demo… and a live musical performance? Obviously the playful title works because of the subject at hand, but the author could just as easily have used a more technical title without the fun. Making it fun was a stylistic choice. (You should always feel free to add your own stylistic flair to your abstracts; read on for more on this.)
The fun title is then supported with technical details: the talk will go beyond Kafka brokers and clients and instead focus on the APIs for Kafka Connect and Kafka Streams. The author follows this with an outline of the stages of the demo. Cool! Now we know what technical content to expect from the talk!
Then he brings it home with the final paragraph. Even though this talk is a fun reprieve from the usual heavy content of a conference, it still has relevant and interesting technical takeaways. Just because you’re presenting something fun doesn’t mean that the audience won’t get something out of attending the session.
This abstract was written by Chris Egerton, Staff Apache Kafka Open Source Developer at Aiven, and is being showcased here with his permission. (If this talk sounds super cool—and it should—consider attending Current 2023 to see what’s sure to be an awesome live performance!)
Adding personality
For our final example, let’s consider what happens when you let your personality shine through in an abstract. This session introduces very useful, deeply technical content in an honest, accessible way.
Pragmatic Patterns (and Pitfalls) for Event Streaming in Brownfield Environments
Unlike Greenfield development where everything is new, shiny and smells like cheese sticks most developers must integrate with existing systems, aptly named, brownfield development. Event streaming can be complex enough when starting from scratch; throw in a mainframe, existing synchronous workflows, legacy databases and an assortment of EBCDIC and it can seem impossible. Using Kafka Streams as our library of choice we will discuss patterns that enable successful integration with the solid systems from our storied past, including:
- Translating synchronous workflows to async using completion criteria and the routing slip pattern
- Getting Ghosted: Detecting missing responses from external systems with the Absence of an Event and Thou Shall Not Pass patterns
- Change Data Capture, the Outbox Pattern, Derivative Events and transaction pitfalls
Armed with these pragmatic best practices, you will be able to successfully bring eventing into your stack and avoid turning your brownfield…into a minefield.
What it does well
The opening sentence presents a heavy topic in a comical way, and you see a bit of the author’s personality shine through. What does the smell of cheese sticks have to do with tech? Almost nothing. But it’s fun, it’s attention-grabbing, it sets the tone for the audience.
It gets to the point. After introducing the legacy systems that form the basis of the talk (and connecting with audience members who work with those systems), the author dives into the technology she’ll be using and includes examples of the patterns that will be covered.
And again, as expected, she summarizes the talk with a set of takeaways… and ends it with a pun. I love that this abstract strikes a balance between the technical details that form its foundation and a bit of fun.
This abstract was written by the incredibly talented Anna McDonald, Principal Customer Success Technical Architect at Confluent, and is included here with her permission. (If you want to see the whole talk, she delivered this session at Kafka Summit London 2023.)
Try it yourself
The three examples shown here are different from one another: they each cover a different subject at a different level, and they’re meant to appeal to different audiences. Even so, you may have noticed that they all more or less follow the abstract formula that I introduced in the last blog post on responding to a CfP. You don’t have to (and shouldn’t!) reinvent the wheel each time you want to write a new abstract.
So make it easy for yourself. Take inspiration from great abstracts that you see out in the wild and follow these steps:
- Determine your topic
- Answer the questions, being sure to write for your future audience
- Follow the abstract template
- (Optional) Add a bit of personal flair
If you liked this post and thought these examples were helpful, be sure to check out the rest of the blogs in this series! And, on that note, be sure to keep an eye out for more posts on this topic in the future... 👀