
Unlock events in mission-critical systems

By Amy McCormick posted Thu June 29, 2023 08:03 AM


IBM MQ provides scalable, resilient and secure connectivity for mission critical systems and applications within and across hybrid and multi-cloud environments. Data from these core systems must flow uninterrupted around your organization to help ensure seamless customer experiences and support strategic processes that are critical to business success. But what if you could tap into that data to detect new trends, customer issues or competitive threats as they arise? This could enable your business to act on critical situations and trigger automations in real-time, without disrupting your existing systems. With IBM MQ you can tap into your existing mission-critical data as it flows around the enterprise with no interruption or architectural changes needed.

IBM recently announced the release of IBM MQ version 9.3.3 Continuous Delivery, which became generally available on June 22, 2023. In the 9.3.3 release, IBM MQ Advanced software for hybrid cloud (distributed platforms) and IBM MQ Advanced for z/OS VUE now include fully IBM-supported IBM MQ Sink and Source connectors for Apache Kafka, so that events from enterprise data can be captured in IBM MQ and delivered seamlessly into an Apache Kafka event streaming platform.

We are often asked about Apache Kafka and MQ by our clients – particularly when to use each solution, whether one could serve all messaging needs across a business, or whether they really need both. The answer often depends on your business objectives, and in many cases MQ and Apache Kafka deliver the most value when used in conjunction. In this article on IBM Developer, we explain common use cases for both enterprise messaging and Apache Kafka stream processing to show where each excels individually and where there are commonalities. But in short:

  • IBM MQ is best for systems that rely on messages and events to communicate between services in real-time, not just within single applications, but across and between organizations. This conversational messaging often requires that messages are assured for delivery once and only once, as duplication or loss of messages may cause significant issues in downstream applications or with a business’s customers – for instance a duplicated payment authorization, inventory levels being incorrect due to missing orders, or patient data not being updated correctly. IBM MQ’s focus on data integrity means it is ideally suited to providing this level of delivery assurance. Often, the processing handled by IBM MQ represents exactly the kind of business dynamic that is interesting from an event perspective: we want enterprise messaging to process orders, update inventory levels, modify customer details, and so on.
  • Apache Kafka is best suited to the storage and processing of streams of messages that represent past events. Kafka’s event delivery style means that applications can process these messages as sequences, enabling rapid detection of recent events so we can factor them into our decision making, identifying an opportunity or a threat, and taking the next best action.

With the two technologies providing best-of-breed solutions for two very different primary objectives, you will find that both are critical to your success, and together they form an incredibly powerful messaging architecture for businesses. It also means you will find many uses for the two technologies working together. For example, you can gain insights from critical data flowing across your business, detecting business situations so you can act and automate when it matters most.

Using the IBM MQ Kafka Connectors

Two connectors exist for IBM MQ – the Source connector, which moves data from IBM MQ into Apache Kafka, and the Sink connector, which moves data from Apache Kafka into IBM MQ. As we are looking at unlocking events in MQ data, we’ll focus on the Source connector.
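As a sketch of what running the Source connector involves, a minimal Kafka Connect configuration might look like the following. The property names follow the kafka-connect-mq-source connector’s documented settings; the queue manager, channel, queue, and topic values here are placeholders you would replace with your own:

```properties
name=mq-source-connector
connector.class=com.ibm.eventstreams.connect.mqsource.MQSourceConnector
tasks.max=1

# Connection details for the queue manager (placeholder values)
mq.queue.manager=QM1
mq.connection.name.list=mq-host(1414)
mq.channel.name=DEV.APP.SVRCONN

# The MQ queue to read from and the Kafka topic to deliver to
mq.queue=KAFKA.EVENTS
topic=mq.events

# How MQ messages are converted into Kafka Connect records
mq.record.builder=com.ibm.eventstreams.connect.mqsource.builders.DefaultRecordBuilder
```

With this in place, the connector drains messages from the KAFKA.EVENTS queue and produces them to the mq.events topic; the approaches below describe how messages get onto that queue in the first place.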

The source connector takes messages from an IBM MQ queue and sends them to a Kafka topic. There are three different approaches for getting messages from applications onto the MQ queue so that they can be delivered to Apache Kafka:

Direct to queue – New MQ applications, or existing applications that intentionally target Kafka, can put messages directly to the queue used by the source connector.

Subscribe to topic – If messages are already being published to a topic, and the consumers are expanding to include Kafka applications, it’s simple in MQ to create a new subscription that delivers another copy to the queue used by the source connector.

Streaming queue – If messages are currently being sent to and consumed from a queue by an existing application, and you need an additional copy of each message to flow to a Kafka topic, you can use a streaming queue to place a copy of every message arriving on one queue onto a second queue used by the source connector. Enabling streaming queues has no effect on existing applications because the original message doesn’t change – the message sent to the second queue is identical to the original: same payload, same message and correlation IDs, and so on.
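To make the last two approaches concrete, both can be set up with a few MQSC commands. This is a sketch; the queue, subscription, and topic names are illustrative, and STRMQOS(BESTEF) is one of the two documented quality-of-service options for streaming queues (the other being MUSTDUP):

```
* Define the queue that the source connector will read from
DEFINE QLOCAL(KAFKA.EVENTS)

* Subscribe to topic: deliver an extra copy of published
* messages onto the connector's queue
DEFINE SUB(KAFKA.EVENTS.SUB) TOPICSTR('orders/new') DEST(KAFKA.EVENTS)

* Streaming queue: duplicate every message arriving on
* ORDERS.QUEUE onto KAFKA.EVENTS without affecting the
* existing consumer of ORDERS.QUEUE
ALTER QLOCAL(ORDERS.QUEUE) STREAMQ(KAFKA.EVENTS) STRMQOS(BESTEF)
```

The STRMQOS choice is the key design decision: BESTEF (best effort) never blocks the original application if the copy cannot be delivered, while MUSTDUP guarantees the duplicate at the cost of failing the original put if duplication fails.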

Acting on events

When it comes to taking actions on events, IBM recently announced IBM Event Automation which provides an intuitive and integrated experience for distributing, discovering, and processing business events across the organization. It enables you to collect raw streams of real-time business events not just from IBM MQ but from multiple sources across a business with enterprise-grade Apache Kafka, build a self-service catalog of event sources for users to securely browse and utilize, and define business situations in an intuitive authoring canvas so you can respond when they arise and automate decisions.

IBM Event Automation will be generally available June 29, 2023. For more information, please visit the IBM Event Automation website or contact your local IBM representative or IBM Business Partner. Alternatively, watch the replay of our webinar, Accelerate your speed of business with IBM Event Automation.

For further information and to get started with the Sink and Source connectors to deliver data seamlessly into an Apache Kafka event streaming platform, visit