
Yogesh Sharma for AWS Community Builders


EventBridge Pipes-Simplify Integration

As we reach the middle of 2023, the AWS serverless ecosystem continues to evolve rapidly, offering a plethora of powerful services and tools to help developers build scalable, efficient, and cost-effective applications. With each passing year, or I'd say month, AWS brings forth exciting announcements and launches, further expanding the realm of possibilities for serverless architectures. In this blog post, we will explore some of the coolest recent AWS serverless services (as it's not humanly possible to cover even 10% in a single blog :P) that emerged from AWS re:Invent 2022 and are poised to revolutionize application development in the year ahead. Whether you're a seasoned serverless enthusiast or just getting started, these services are worth exploring to unlock new levels of productivity, scalability, and innovation in your cloud-based projects. So, let's get started!

EventBridge Pipes: Embrace The Power To Integrate Across Platforms with Ease

Data becomes more complicated as environments and applications grow. The effective and efficient management of events can quickly become very difficult due to the expanding ecosystem of loosely coupled distributed systems. AWS offers a standard paradigm that enables customers to construct architectures using the right services from its catalogue in order to achieve a business objective.
We already know many patterns, like:

  • A DynamoDB stream that pushes an event to Amazon EventBridge to perform some action
  • SQS/SNS messages that trigger a Step Functions workflow

(Image: Pipes)

With Pipes, we can connect DynamoDB Streams directly to AWS SQS rather than utilizing AWS Lambda to chain the DynamoDB stream and SQS together. Additionally, we have the choice to perform enrichment and filtering on the source event before it reaches the target.
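For illustration, here's a minimal sketch of that wiring using the boto3 pipes client; the stream ARN, queue ARN, and IAM role are hypothetical placeholders, not values from this post:

```python
import boto3

pipes = boto3.client("pipes")

# Hypothetical ARNs -- replace with your own stream, queue, and a role
# that lets pipes.amazonaws.com read the stream and send to the queue.
pipes.create_pipe(
    Name="orders-stream-to-queue",
    RoleArn="arn:aws:iam::123456789012:role/orders-pipe-role",
    Source="arn:aws:dynamodb:us-east-1:123456789012:table/Orders/stream/2023-01-01T00:00:00.000",
    SourceParameters={
        "DynamoDBStreamParameters": {"StartingPosition": "LATEST", "BatchSize": 10}
    },
    Target="arn:aws:sqs:us-east-1:123456789012:orders-queue",
)
```

No Lambda glue code is involved: the pipe itself polls the stream and delivers batches to the queue.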

Problem before EventBridge Pipes

If we needed DynamoDB Streams to deliver messages to SNS depending on changes made to a DynamoDB table, we had to construct a Lambda function between DynamoDB and SNS.
Similarly, before sending DynamoDB Stream events to an event bus, we had to filter out the meaningful events and convert them into domain events, which serve as the payload for other services.

In short, filtering logic or some other integration code is frequently needed to connect producer and consumer services when builders want to create a complicated architecture out of a number of services. Builders also need to take care of batching events, handling errors, throughput, performance, etc. And that's where EventBridge Pipes, launched by AWS at re:Invent 2022, comes into the picture.

EventBridge Pipes to simplify integration

EventBridge Pipes is a service to connect sources to targets. It's basically "Integrate Across Platforms with Ease". When creating event-driven architectures, it reduces the requirement for specialised knowledge and integration code. When setting up a pipe, you select the source, include optional filtering, provide optional enrichment or transformation, and select the target for the event data.
Pipes provide a simple, consistent, cost-effective way to create point-to-point integrations between event producers and consumers.
Main components of EventBridge Pipes:

  1. Source- Defines the initial input to the pipe. As of now, it allows DynamoDB Streams, SQS, Kinesis Data Streams, Kafka (Amazon MSK or self-managed), or Amazon MQ.
  2. Filter- Define and filter events using EventBridge event-pattern syntax (optional stage).
  3. Enrich- Enrich the events by calling an API destination, a Lambda function, a synchronous Step Functions state machine, or Amazon API Gateway (optional stage). You can also perform input transformation if you need to transform the input events.
  4. Target- Defines the destination of all events. As of now, it supports 15 destinations (see the sketch after this list for how the stages map onto a pipe definition).
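
To make the four stages concrete, here's a hedged boto3 sketch that maps each stage onto a create_pipe argument; the ARNs, the filter pattern, and the enrichment Lambda are all hypothetical examples:

```python
import json
import boto3

pipes = boto3.client("pipes")

pipes.create_pipe(
    Name="orders-pipe",
    RoleArn="arn:aws:iam::123456789012:role/orders-pipe-role",
    # 1. Source: a hypothetical DynamoDB stream.
    Source="arn:aws:dynamodb:us-east-1:123456789012:table/Orders/stream/2023-01-01T00:00:00.000",
    SourceParameters={
        "DynamoDBStreamParameters": {"StartingPosition": "LATEST"},
        # 2. Filter: only INSERT records pass through (EventBridge pattern syntax).
        "FilterCriteria": {
            "Filters": [{"Pattern": json.dumps({"eventName": ["INSERT"]})}]
        },
    },
    # 3. Enrich: a hypothetical Lambda that turns stream records into domain events.
    Enrichment="arn:aws:lambda:us-east-1:123456789012:function:enrich-order",
    # 4. Target: a hypothetical custom event bus.
    Target="arn:aws:events:us-east-1:123456789012:event-bus/orders-bus",
    TargetParameters={
        "EventBridgeEventBusParameters": {
            "Source": "com.example.orders",
            "DetailType": "OrderPlaced",
        }
    },
)
```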

See it in action-

1. For loosely coupled components-

In an event-driven, distributed, microservice architecture, every component is independent of the others, and we want to keep it that way: nobody should make architectural decisions based on others. We are loosely coupled, but we still need to talk to each other. So how do we do that? We can use Pipes to connect two loosely coupled components together. For example, you can add an SQS queue in front of a component and attach a pipe that sends messages on to the other component. That way you create a buffer and can consume the events at a more consistent pace.
(Image: Loosely coupled pattern)
Now suppose two different microservices are already emitting events, one to Kinesis and the other to EventBridge. You can use a pipe to connect them together. So this problem of loosely coupled components that are very independent of each other can be solved either by putting a queue in front, with the other service sending events into that queue, or by connecting the two event sources directly, as in the sketch below. The cool thing with Pipes and this type of solution is that it is low code: there is a little bit of infrastructure involved, but you don't need to change either the event producer or the event consumer for this to work. Your two systems stay untouched, and we like that.
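
A hedged sketch of that second variant, bridging a hypothetical Kinesis stream to a hypothetical event bus; names and ARNs are placeholders:

```python
import boto3

pipes = boto3.client("pipes")

# Neither the producer (writing to Kinesis) nor the consumer (rules on the
# event bus) changes; the pipe alone bridges the two event sources.
pipes.create_pipe(
    Name="kinesis-to-bus",
    RoleArn="arn:aws:iam::123456789012:role/bridge-pipe-role",
    Source="arn:aws:kinesis:us-east-1:123456789012:stream/service-a-events",
    SourceParameters={
        "KinesisStreamParameters": {"StartingPosition": "LATEST"}
    },
    Target="arn:aws:events:us-east-1:123456789012:event-bus/service-b-bus",
)
```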

How Pipes helps-

  • No custom code needed
  • Pattern suitable for various targets
  • Filtering & batching possible, reducing cost
  • Automatic retries in case of errors

2. Content Filter Pattern

When processing data, some components only need to process a subset of the original data. Sometimes, for regulatory reasons, they are even forbidden from having access to the complete upstream data.
The pattern filters out unwanted and/or unneeded

  • messages, as well as
  • attributes

This pattern is a perfect fit for use cases that involve removing personally identifiable information (PII), as in the sketch below.
(Image: Content filter pattern)
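
As a sketch of the content filter pattern, the pipe below drops non-matching messages with a filter and strips PII attributes with an input transformer before anything reaches the target; the table, queue, and field names are made up for illustration:

```python
import json
import boto3

pipes = boto3.client("pipes")

pipes.create_pipe(
    Name="orders-pii-filter",
    RoleArn="arn:aws:iam::123456789012:role/filter-pipe-role",
    Source="arn:aws:dynamodb:us-east-1:123456789012:table/Orders/stream/2023-01-01T00:00:00.000",
    SourceParameters={
        "DynamoDBStreamParameters": {"StartingPosition": "LATEST"},
        # Filter out unneeded messages: only completed orders are forwarded.
        "FilterCriteria": {
            "Filters": [
                {"Pattern": json.dumps(
                    {"dynamodb": {"NewImage": {"status": {"S": ["COMPLETED"]}}}}
                )}
            ]
        },
    },
    Target="arn:aws:sqs:us-east-1:123456789012:analytics-queue",
    TargetParameters={
        # Filter out unneeded attributes: forward only the order ID and total,
        # dropping hypothetical PII fields such as name and address.
        "InputTemplate": (
            '{"orderId": "<$.dynamodb.NewImage.orderId.S>", '
            '"total": "<$.dynamodb.NewImage.total.N>"}'
        )
    },
)
```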

How Pipes helps here-

  • Pattern suitable for various sources & targets
  • More cost-effective than using AWS Lambda
  • The pipe encapsulates the whole filtering, hiding this complexity from the rest of the system
  • The pipe can also filter messages that don't need to be forwarded, saving costs

Conclusion

EventBridge Pipes is really powerful: it allows you to enrich the content you receive and perform transformations before sending it to the supported targets.
Use the enrichment and filter stages for advanced transformations. For simple transformations, Pipes supports built-in input manipulation with a configuration applied during pipe creation.
Use Pipes to standardize how you want different modules to talk to each other.
That's All Folks!!
