written with Chris Patmore (Software Developer, IBM Event Automation)
IBM Event Automation helps companies accelerate their event-driven projects, wherever they are on their journey. It provides three components (Event Streams, Event Endpoint Management, and Event Processing) which together lay the foundation of an event-driven architecture that can unlock the value of the streams of events a business already has.
A key goal of Event Automation is to be composable. The three components can be used together, or they can each be used to extend and enhance an existing event-driven deployment.
Amazon MSK (Amazon Managed Streaming for Apache Kafka) is a hosted, managed Kafka service available in Amazon Web Services. If a business has started its event-driven journey using MSK, components from Event Automation can help to enhance it: by offering management and governance of MSK topics, and by providing an intuitive low-code authoring canvas to process the events on those topics.
Working with Amazon MSK is a good example of the benefits of Event Automation's composability: it helps businesses get more value from the MSK topics they already have.
In this blog post, we want to show a few examples of how this can be done. For each example, we'll provide a high-level diagram and description. We'll also share a demonstration that we created to show it in action.
To start with, we demonstrated how Event Processing can be used with Amazon MSK. We showed how Event Processing, which is based on Apache Flink, can help businesses identify insights from the events on their MSK topics through an easy-to-use authoring canvas.
The diagram above is a simplified description of what we created. We created an MSK cluster in AWS, set up a few topics, and then started a demonstration app producing a stream of events to them. This gave us a simplified example of a live MSK cluster.
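To give a sense of what the data generator side of this looks like, the sketch below shows a minimal Kafka producer sending a stream of simple JSON events to an MSK topic. This is not the actual demo app we used: the bootstrap address and the `ORDERS.NEW` topic name are placeholders for illustration.

```java
import java.util.Properties;
import java.util.UUID;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class DemoOrderProducer {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        // Placeholder MSK bootstrap address - use the brokers from your own cluster
        props.put("bootstrap.servers", "b-1.mymskcluster.abc123.kafka.eu-west-1.amazonaws.com:9092");
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // Produce a continuous stream of simple JSON order events, one per second
            while (true) {
                String order = String.format(
                    "{\"orderId\":\"%s\",\"region\":\"EMEA\",\"total\":%.2f}",
                    UUID.randomUUID(), Math.random() * 100);
                producer.send(new ProducerRecord<>("ORDERS.NEW", order));
                Thread.sleep(1000);
            }
        }
    }
}
```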
We then accessed this Amazon MSK cluster from an instance of Event Processing (that was running in a Red Hat OpenShift cluster in IBM Cloud). We used Event Processing to create a range of stateful stream processing flows.
This showed how the events on MSK topics can be processed where they are, without needing an instance of Event Streams or mirroring the topics into a Kafka cluster running in OpenShift. Being able to use the low-code authoring canvas with the Kafka topics you already have, wherever they are, is a powerful enabler for an event-driven architecture.
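For context, the flows built in the authoring canvas run as Flink jobs under the covers. A hand-written Flink job that does something broadly similar to one of our flows (a simple windowed count over an MSK topic) might look like the sketch below. The bootstrap address, topic name, and naive JSON handling are assumptions for illustration; Event Processing does not require you to write any of this code.

```java
import org.apache.flink.api.common.eventtime.WatermarkStrategy;
import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.api.common.typeinfo.Types;
import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.connector.kafka.source.KafkaSource;
import org.apache.flink.connector.kafka.source.enumerator.initializer.OffsetsInitializer;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.api.windowing.assigners.TumblingProcessingTimeWindows;
import org.apache.flink.streaming.api.windowing.time.Time;

public class OrdersPerRegion {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Placeholder MSK connection details - replace with your own cluster and topic
        KafkaSource<String> source = KafkaSource.<String>builder()
                .setBootstrapServers("b-1.mymskcluster.abc123.kafka.eu-west-1.amazonaws.com:9092")
                .setTopics("ORDERS.NEW")
                .setGroupId("orders-per-region")
                .setStartingOffsets(OffsetsInitializer.latest())
                .setValueOnlyDeserializer(new SimpleStringSchema())
                .build();

        DataStream<String> orders = env.fromSource(
                source, WatermarkStrategy.noWatermarks(), "msk-orders");

        // Count orders per region over one-minute windows
        // (naive region extraction from the JSON string, for illustration only)
        orders.map(value -> {
                    String region = value.replaceAll(".*\"region\":\"([^\"]+)\".*", "$1");
                    return Tuple2.of(region, 1L);
                })
                .returns(Types.TUPLE(Types.STRING, Types.LONG))
                .keyBy(t -> t.f0)
                .window(TumblingProcessingTimeWindows.of(Time.minutes(1)))
                .sum(1)
                .print();

        env.execute("Orders per region");
    }
}
```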
Demo: step-by-step instructions and screenshots of how we built the Event Processing demo.
Next, we demonstrated the value that Event Endpoint Management can bring to an Amazon MSK cluster. We showed how adding MSK topics to a self-service catalog enables sharing and reuse of existing topics, wherever they are hosted. And we showed how adding an Event Gateway maintains control and governance of those topics as they are shared.
The diagram above is a simplified description of what we created. We used the same MSK cluster in AWS that we had used for the previous demo, as it already had a variety of topics and a data generator producing live streams of events to them. This time we used it with an instance of Event Endpoint Management (that was running in our Red Hat OpenShift cluster in IBM Cloud).
We added our Amazon MSK topics to the catalog, and configured an Event Gateway to govern access to them. We could have run the Event Gateway in OpenShift, alongside the endpoint manager. However, for this demonstration, we wanted to show the flexibility of running the Event Gateway in the same environment as the Kafka cluster. This removes the need for egress from the AWS environment when your Kafka applications are also running in AWS.
Finally, we showed all of this in action by running a Kafka consumer that consumed events from the MSK topics. The consumer used credentials created in the Event Endpoint Management catalog and connected via the Event Gateway.
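From the application's point of view, this is just an ordinary Kafka client: the only differences are that the bootstrap address points at the Event Gateway rather than the MSK brokers, and the credentials are the ones generated from the catalog. The sketch below shows the shape of that configuration; the gateway address, username, password, and topic name are placeholders, and the exact security settings (and any truststore configuration) will depend on how your gateway is set up.

```java
import java.time.Duration;
import java.util.Collections;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class GatewayConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        // The bootstrap address is the Event Gateway, not the MSK brokers directly
        // (placeholder hostname - use the address shown in the catalog)
        props.put("bootstrap.servers", "my-event-gateway.example.com:443");
        // Credentials generated when subscribing to the topic in the catalog (placeholders)
        props.put("security.protocol", "SASL_SSL");
        props.put("sasl.mechanism", "PLAIN");
        props.put("sasl.jaas.config",
            "org.apache.kafka.common.security.plain.PlainLoginModule required "
            + "username=\"eem-generated-username\" password=\"eem-generated-password\";");
        props.put("group.id", "orders-consumer");
        props.put("key.deserializer", StringDeserializer.class.getName());
        props.put("value.deserializer", StringDeserializer.class.getName());

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(Collections.singletonList("ORDERS.NEW"));
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(5));
                for (ConsumerRecord<String, String> record : records) {
                    System.out.println(record.value());
                }
            }
        }
    }
}
```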
Demo: step-by-step instructions and screenshots of how we built the Event Endpoint Management demo.
Finally, we demonstrated how these capabilities can be combined, bringing together the value of both of the previous demonstrations.
Making Amazon MSK topics available through a self-service catalog can enable much wider reuse of these streams of events. And providing a low-code authoring canvas for processing these events can extend this use beyond just developers, enabling both business and IT teams to define the scenarios they need to respond to.
For this final demonstration, we again used the same Amazon MSK cluster, with the same range of topics and live streams of events as before. We had already added these to the Event Endpoint Management catalog for the previous demo, so for this demonstration we showed how MSK topics found in the catalog can easily be used in Event Processing to quickly identify new real-time insights.
Demo: step-by-step instructions and screenshots of how we built the complete Event Automation demo.
Our goal with this blog post was to demonstrate what can be done with IBM Event Automation, with a particular focus on the benefits of composability. By taking advantage of the Kafka protocol's status as a de facto standard, we can layer additional capabilities on top of Apache Kafka clusters, wherever they are running.
Our demonstrations were intended as an illustrative example of using Event Automation with MSK. They were not meant to describe a perfect or optimal way to use Amazon services; instead, they focused on a quick and simple way to show what is possible. We hope they inspire you to get more out of your own Amazon MSK cluster.
For more information about any of the ideas that we have shared here, please see the Event Automation documentation, or get in touch with us.