You can use the Kafka Producer node to publish messages that are generated from within your message flow to a topic that is hosted on a Kafka server. The Kafka server then delivers the published messages to all consumers (subscribers) of that topic. You can use a Kafka Consumer node in a message flow to subscribe to a specified topic on a Kafka server; the node then receives the messages that are published on that topic as input to the message flow. You can also read an individual message that was published on a Kafka topic by using a Kafka Read node in a message flow to specify the offset position of the message in the topic partition.
You can also use the Kafka nodes with IBM Event Streams, as described below:
IBM Event Streams for IBM Cloud is a scalable, distributed, high-throughput message bus, which supports a number of client protocols including Kafka. You can use the Kafka Consumer, Kafka Read, and Kafka Producer nodes in IBM App Connect Enterprise to receive messages from and send messages to Event Streams.
Before you can connect to Event Streams, you must create a set of credentials, which the IBM App Connect Enterprise Kafka nodes can then use to make a connection. You can use either the mqsisetdbparms or mqsicredentials command to configure the credentials that the Kafka nodes use to authenticate to Event Streams.
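As a sketch of how the credentials might be configured, the commands below assume an independent integration server with the hypothetical work directory /home/aceuser/ace-server and a security identity named myEventStreams; the placeholder API key must be replaced with the key from your Event Streams service credentials, and for Event Streams the user name is typically the literal string token.

```shell
# Sketch only: configure a Kafka credential with mqsisetdbparms.
# -w  work directory of the integration server (assumed path)
# -n  resource name in the form kafka::<securityIdentityName>
# -u  user name ("token" for Event Streams API-key authentication)
# -p  password (the Event Streams API key - placeholder here)
mqsisetdbparms -w /home/aceuser/ace-server \
  -n kafka::myEventStreams -u token -p <API_key>

# Alternatively, the same credential created with mqsicredentials:
mqsicredentials --create --work-dir /home/aceuser/ace-server \
  --credential-type kafka --credential-name myEventStreams \
  --username token --password <API_key>
```

The security identity name (myEventStreams in this sketch) is then referenced from the Security identity property on the Kafka node so that the node can look up the stored user name and password at connection time.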
To enable the Kafka nodes to authenticate by using a username and password, you must set the Security protocol property on the node to SASL_SSL.
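For reference, and as an illustration only, the SASL_SSL setting on the node corresponds to the following style of raw Kafka client security configuration (the SASL PLAIN mechanism with credentials sent over a TLS connection); the exact properties the node applies internally are an assumption here, not taken from the source.

```properties
# Illustrative equivalent of the node's Security protocol = SASL_SSL
security.protocol=SASL_SSL
sasl.mechanism=PLAIN
ssl.protocol=TLSv1.2
```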