We are very excited to announce IBM® Z Monitoring Suite 1.4, IBM Z Service Management Suite 2.4, IBM Z Service Automation Suite 1.6, IBM Tivoli® Management Services on z/OS® 6.3.2 and new IBM zSystems® Integration for Observability 6.1 releases today, 24th March 2023, delivering...
Configuring IBM App Connect Enterprise to produce or consume messages from Kafka topics in IBM Event Streams requires a number of settings to line up correctly. In this post, I’ll share the steps I follow to avoid missing any required values. To illustrate this, I’ll create a simple App Connect flow...
You can use the KafkaProducer node to publish messages that are generated from within your message flow to a topic that is hosted on a Kafka server. The published messages are then delivered by the Kafka server to all topic consumers (subscribers). You can use a KafkaConsumer node ...
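The delivery semantics described above — every message published to a topic reaches all of that topic's consumers — can be illustrated with a minimal pure-Python sketch. This is not the App Connect node configuration or a real Kafka client; the `MiniBroker` class and the `orders` topic are invented here purely to show the publish/subscribe pattern:

```python
from collections import defaultdict

class MiniBroker:
    """Toy in-memory stand-in for a Kafka broker: every message
    published to a topic is delivered to all of that topic's consumers."""

    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        # register a consumer callback for the topic
        self._subscribers[topic].append(handler)

    def publish(self, topic, message):
        # fan the message out to every registered consumer
        for handler in self._subscribers[topic]:
            handler(message)

broker = MiniBroker()
received_a, received_b = [], []
broker.subscribe("orders", received_a.append)
broker.subscribe("orders", received_b.append)
broker.publish("orders", {"id": 1, "qty": 3})
# both consumers receive the same published message
```

In real Kafka the fan-out is governed by consumer groups and partitions, but the basic topic-to-subscribers relationship is the same.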
In this post, I’ll describe how to use App Connect Enterprise to process Kafka messages that were serialized to a stream of bytes using Apache Avro schemas. Background Best practice when using Apache Kafka is to define Apache Avro schemas with a definition of the structure of your...
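An Avro schema of the kind referred to above is a JSON document that names each field and its type. The record and field names below are hypothetical, invented only to show the shape of a typical `.avsc` file:

```json
{
  "type": "record",
  "name": "StockUpdate",
  "namespace": "com.example.events",
  "fields": [
    { "name": "symbol", "type": "string" },
    { "name": "price",  "type": "double" },
    { "name": "traded", "type": "long", "default": 0 }
  ]
}
```

Producers serialize each message against a schema like this, and consumers use the same schema (usually fetched from a schema registry) to turn the byte stream back into structured data.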
Alan Chatt, Dale Lane. Managing event interfaces in the same way we manage APIs is a concept we've been thinking about for a while, and with the first release of Event Endpoint Management last month we took the first step towards making this a reality. To introduce the concept of...
MirrorMaker 2 is the Apache Kafka tool for replicating data between two Kafka clusters. You can use it to copy messages from your Kafka cluster to a remote Kafka cluster running in a different data centre, and keep that copy up to date in the background. In this post, I describe two...
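A minimal MirrorMaker 2 configuration for the one-way replication described above might look like the sketch below. The cluster aliases and hostnames are placeholders, not values from the post; check the Kafka documentation for your version before relying on them:

```properties
# clusters to replicate between (aliases are arbitrary labels)
clusters = primary, backup
primary.bootstrap.servers = primary-kafka:9092
backup.bootstrap.servers = backup-kafka:9092

# enable one-way replication from primary to backup
primary->backup.enabled = true
primary->backup.topics = .*

# disable the reverse flow
backup->primary.enabled = false
```

Run in dedicated mode, MirrorMaker 2 keeps the `backup` cluster's copy of matching topics up to date in the background.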
In part 1 of this series, “Good Integration Patterns Never Die – You Just Add More”, I described several challenges and identified a set of integration capabilities that solved these challenges. Toward the end of the article, this question was posed: …we identified the following...
IT history can be told as a story of overcoming challenges. A constraint limits IT capability (e.g. processor speed, memory, storage, network, etc.) and then there is a step forward in IT technology to break through the constraint and move forward. Until we hit the next constraint. Then –...
Published on February 24, 2019. IBM Event Streams, Kafka Hub and MQ: hosted integration with the Kafka MQ Connector. In this article we’ll explore the integration of IBM MQ and IBM Event Streams, leveraging the Kafka MQ Source Connector to take MQ messages from MQ...
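A Kafka MQ Source Connector configuration of the kind used for this integration is sketched below. The queue manager, channel, queue, and topic names are placeholders invented for illustration; consult the connector's own documentation for the full property list:

```properties
name = mq-source
connector.class = com.ibm.eventstreams.connect.mqsource.MQSourceConnector
tasks.max = 1

# Kafka topic to publish the MQ messages to (placeholder name)
topic = mq.messages

# MQ connection details (placeholder values)
mq.queue.manager = QM1
mq.connection.name.list = mq-host(1414)
mq.channel.name = KAFKA.SVRCONN
mq.queue = TO.KAFKA
```

Deployed to a Kafka Connect worker, this pulls each message arriving on the MQ queue and republishes it as a Kafka record on the configured topic.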
The sample notebooks repository on GitHub contains examples of Streams Python applications that run in Cloud Pak for Data. The samples include examples on how to: Compute a rolling average on streaming data Connect to Db2 Warehouse, Db2 Event Store, IBM Event Streams, Apache Kafka, and...
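The first of those examples, a rolling average over streaming data, can be sketched in plain Python. This is not code from the sample notebooks (those use the Streams Python API); the `rolling_average` function and sample readings are invented here to show the windowing idea:

```python
from collections import deque

def rolling_average(stream, window):
    """Yield the average of the most recent `window` values seen so far."""
    recent = deque(maxlen=window)  # old values fall off automatically
    for value in stream:
        recent.append(value)
        yield sum(recent) / len(recent)

readings = [10, 20, 30, 40]
averages = list(rolling_average(readings, window=2))
# averages: [10.0, 15.0, 25.0, 35.0]
```

In a Streams application the same logic is expressed as a windowed aggregation over an unbounded stream rather than over a finite list.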