Configuring IBM App Connect Enterprise to produce messages to, or consume messages from, Kafka topics in IBM Event Streams requires careful configuration. In this post, I’ll share the steps I use to avoid missing any required values.
To illustrate this, I’ll create a simple App Connect flow that implements a REST API, where any data I POST to the REST API is sent to a Kafka topic.
The key to getting this to work correctly first time is to make sure that values are accurately copied from Event Streams to App Connect.
To help with this, I use a grid like the one below.
The instructions in this post start with Event Streams, and explain how to populate the grid with the information you need.
Then the instructions will switch to App Connect, and explain how to use the values in the grid to set up your App Connect flow.
| | What this is | Values you will see in my screenshots |
| --- | --- | --- |
| A | topic name | |
| B | bootstrap address | |
| C | security mechanism | |
| D | security config | |
| E | security protocol | |
| F | SSL certificate file name | |
| G | SSL certificate password | |
| H | username | |
| I | password | |
| J | policy project name | |
| K | policy name | |
| L | security identity name | |
| M | truststore identity name | |
Note: To see screenshots in more detail, you can click on them to open a higher-resolution version.
There are some things you will need before you can start following the instructions. This post is already long enough, so I will skip any detail on these steps.
- You need an OpenShift cluster.
- You need to install the Operators for App Connect and Event Streams.
- You need to create a secret with your key for the Entitled Registry.
- You need an Event Streams cluster.
- You need a topic that you want App Connect to send messages to.
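As a quick sanity check that the prerequisites are in place, you can list the Event Streams resources with the OpenShift CLI. This is only a sketch: it assumes the `oc` CLI is logged in to your cluster, and `event-streams` is a placeholder for whichever namespace you installed into.

```shell
# Sanity checks -- "event-streams" is a placeholder namespace
oc get eventstreams -n event-streams    # your Event Streams cluster
oc get kafkatopics -n event-streams     # should include your topic (value A)
```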
Step 1 – Collecting values you need from Event Streams
A – the topic name
Get the name of the topic you want App Connect to send messages to.
Fill this in the grid as value A.
B – the bootstrap address
Get the bootstrap address that you want App Connect to use to connect to the Kafka cluster.
The addresses you have to choose from will depend on how you have configured your Event Streams instance.
If you have enabled external listeners, you could choose an external address.
If you have enabled internal listeners, you could choose an internal address.
Whichever address you choose, fill it in the grid as value B.
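If you prefer the command line, the bootstrap addresses are also reported in the status of the Event Streams custom resource. Treat this as a sketch: `my-es` is a placeholder instance name, and the status field path is an assumption that can vary between Event Streams versions, so fall back to the UI if it doesn’t match your cluster.

```shell
# List the listeners (and their bootstrap addresses) from the instance
# status -- "my-es" is a placeholder, and the ".status.kafkaListeners"
# field name is an assumption that may differ between versions
oc get eventstreams my-es -o jsonpath='{.status.kafkaListeners}'
```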
C – security mechanism
Look next to the bootstrap address that you chose for value B.
If you see a reference to “SCRAM”, that means the Kafka listener you have chosen to use for App Connect is configured to require authentication, with credentials provided using the SASL/SCRAM mechanism.
If that is the case, fill `SCRAM-SHA-512` in the grid as value C.
If you see a reference that credentials aren’t required, that means the Kafka listener you have chosen to use for App Connect is configured to not require authentication.
If that is the case, leave value C empty.
D – security config
If value C is `SCRAM-SHA-512`, then set value D in the grid to `org.apache.kafka.common.security.scram.ScramLoginModule required;` (the standard JAAS login module for Kafka SCRAM authentication).
Otherwise, leave value D empty.
E – security protocol
You need to identify whether the Kafka listener is configured to require encryption.
If you chose an external listener, this is simple – encryption is always required for external listeners.
If you chose an internal listener, you will need to find the spec for the listener. One way to do this is to look at the spec for your Event Streams cluster in the OpenShift Console.
Find the listener that you chose in the list of listeners. If the type of the listener you chose is `external` or `tls`, then encryption is required. Otherwise (if the type is `plain`), then encryption is not required.
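For illustration, the listeners section of an Event Streams spec could look something like the sketch below. The exact structure and nesting vary between Event Streams versions, so check your own cluster’s spec rather than relying on this layout.

```yaml
# Illustrative only -- listener layout varies between Event Streams versions
spec:
  strimziOverrides:
    kafka:
      listeners:
        - name: external
          type: route       # external listener: encryption always required
          tls: true
        - name: tls
          type: internal    # internal listener with encryption
          tls: true
        - name: plain
          type: internal    # internal listener without encryption
          tls: false
```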
Use the table below to work out what you need to fill in the grid as value E.
| if value C is…? | is encryption required? | then set value E to… |
| --- | --- | --- |
| `SCRAM-SHA-512` | yes | `SASL_SSL` |
| `SCRAM-SHA-512` | no | `SASL_PLAINTEXT` |
| (empty) | yes | `SSL` |
| (empty) | no | `PLAINTEXT` |
F – SSL certificate
If the Kafka listener is configured to require encryption, you need to download the CA certificate for the listener.
You can download the PKCS12 certificate from Event Streams.
Keep this file safe, and make a note of the file name in the grid as value F.
G – SSL certificate password
When you download the PKCS12 certificate, the password will be displayed.
Fill it in the grid as value G.
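As an alternative to downloading through the UI, the cluster CA PKCS12 file and its password are also stored in a Kubernetes secret, following the Strimzi convention of `<instance-name>-cluster-ca-cert`. A sketch, assuming your instance is called `my-es` (a placeholder) and you are logged in to the right namespace:

```shell
# Extract the PKCS12 truststore (value F) and its password (value G)
# from the cluster CA secret -- "my-es" is a placeholder instance name
oc get secret my-es-cluster-ca-cert -o jsonpath='{.data.ca\.p12}' | base64 -d > es-cert.p12
oc get secret my-es-cluster-ca-cert -o jsonpath='{.data.ca\.password}' | base64 -d
```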
H/I – username / password
If the Kafka listener is configured to require SCRAM credentials, you need to create a username and password for App Connect to use.
(If no credentials are required, you can skip this step and leave values H and I empty.)
Choose a username. Enter it into the grid as value H.
Click on the Generate SCRAM credentials button.
Use the name from value H as the name for your credentials.
Make sure you include the permission that App Connect will need (“consume” if you want App Connect to be able to receive messages, “produce” if you want App Connect to be able to send messages).
Use the topic name from value A when specifying the permissions for the credentials.
Enter the generated password into the grid as value I.
Step 1(b) – Workaround step
At the time of writing, if you have a p12 file in value F, you will need to convert it to a JKS file because of the issue described in the Event Streams support docs. Follow the instructions on that page to create the JKS file.
If you do this, update value F in the grid with the new jks file name.
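The conversion itself is typically done with the Java `keytool` utility. A sketch, with placeholder filenames (use the file from value F, and enter the password from value G when prompted):

```shell
# Convert the downloaded PKCS12 truststore into a JKS truststore;
# "es-cert.p12" / "es-cert.jks" are placeholder filenames
keytool -importkeystore \
        -srckeystore es-cert.p12 -srcstoretype PKCS12 \
        -destkeystore es-cert.jks -deststoretype JKS
```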
Step 2 – Choose some App Connect names
You will create some resources in App Connect.
Choose a name for your App Connect policy project.
Enter it into the grid as value J.
Choose a name for your App Connect policy.
Enter it into the grid as value K.
If you have a SCRAM username/password, choose a name for the security identity.
Enter it into the grid as value L.
If you have a truststore file (p12 file, or jks file, depending on whether the workaround is still required), choose a name for the truststore identity.
Enter it into the grid as value M.
Step 3 – Creating your App Connect policy
Use the App Connect Enterprise toolkit to create a new Policy project.
Use the name from value J in the grid for the name.
Create a policy in your new project. Use value K from the grid as the file name.
Make sure that the policy name matches value K from the grid.
Set the policy Type and Template both to “Kafka”.
Fill in the rest of the policy using values from the grid.
Set Bootstrap servers (`<bootstrapServers>`) to value B from the grid.
Set Security protocol (`<securityProtocol>`) to value E from the grid.
Set SASL Mechanism (`<saslMechanism>`) to value C from the grid.
Set Security identity (`<securityIdentity>`) to value L from the grid.
Set SASL config (`<saslConfig>`) to value D from the grid.
If you have a filename in value F in the grid, set SSL truststore location (`<sslTruststoreLocation>`) to `/home/aceuser/truststores/` followed by your filename.
Otherwise, leave SSL truststore location blank.
If you have a filename in value F in the grid, set SSL truststore type (`<sslTruststoreType>`) to `JKS` (if you have a jks file) or `PKCS12` (if you have a p12 file).
Set SSL truststore security identity (`<sslTruststoreSecurityIdentity>`) to value M from the grid.
Set SSL certificate hostname checking according to whether the certificates for your listener include the broker hostnames (for Event Streams listeners they normally do, so `true` is usually appropriate).
A few examples of how this could look, depending on some of the choices you could have made…
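For example, a policy for an external listener that requires SCRAM credentials and TLS could look something like this. This is a hypothetical sketch: `valueB`, `valueK`, `valueL`, and `valueM` stand for your own grid values, and the truststore filename assumes the JKS workaround from step 1(b).

```xml
<?xml version="1.0" encoding="UTF-8"?>
<policies>
  <policy policyType="Kafka" policyName="valueK" policyTemplate="Kafka">
    <bootstrapServers>valueB</bootstrapServers>
    <securityProtocol>SASL_SSL</securityProtocol>
    <saslMechanism>SCRAM-SHA-512</saslMechanism>
    <saslConfig>org.apache.kafka.common.security.scram.ScramLoginModule required;</saslConfig>
    <securityIdentity>valueL</securityIdentity>
    <sslTruststoreLocation>/home/aceuser/truststores/es-cert.jks</sslTruststoreLocation>
    <sslTruststoreType>JKS</sslTruststoreType>
    <sslTruststoreSecurityIdentity>valueM</sslTruststoreSecurityIdentity>
  </policy>
</policies>
```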
Export the policy project to a zip file.
The file name you use isn’t significant, so choose any name that you like.
Step 4 – Creating your App Connect flow
To illustrate how to use the grid, I’ll create a flow that sends data received over HTTP to the Kafka topic. You could configure a Kafka consumer node in a similar way.
I’m using an HTTP input node; the path you set on it is the URL path you will POST data to when trying the flow out.
Configure the Kafka node, starting with the “Basic” tab.
Set Topic name to value A from the grid.
The Bootstrap servers value won’t be used, but it’s a required value, so put any value in there. I use “not-used” for this to avoid confusion.
Set Client ID to something that can be used to identify App Connect in Event Streams monitoring.
Next, fill in the “Security” tab.
Set Security identity to value L from the grid.
Set Security protocol to value E from the grid.
Finally, fill in the “Policy” tab.
Set Policy using value J and value K from the grid, in the form `{valueJ}:valueK`.
Export the app with your flow to a BAR file.
Step 5 – Set up App Connect
Create an App Connect dashboard
This will make it easier to deploy your App Connect flow.
Add the Configurations to the dashboard
Use the Dashboard to create Configurations.
If you have a filename in value F in the grid, create a new Configuration.
Set the Type to “Truststore” and upload your truststore file.
Create another new Configuration.
Set the Type to “Policy project” and upload your exported policy project zip file.
If you have values for any of value H, value I, or value F in the grid, create another new Configuration.
Set the Type to “setdbparms.txt”.
If you have values for value H and value I, add a line with `kafka::` followed by the security identity name, then a space, then the username and password:
`kafka::valueL valueH valueI`
If you have a filename in value F in the grid, add a line with `truststore::` followed by the truststore identity name, then a space, then an unused placeholder value, then the truststore password:
`truststore::valueM notused valueG`
Depending on how your Event Streams listener is configured, you should now have between one and three configurations.
Upload the BAR file to the dashboard
Use the Dashboard to upload a bar file.
Import the bar file that you created with the Kafka message flow.
Step 6 – Deploy the message flow
Use the Dashboard to create a new integration server.
If you are using a truststore, note that creating an integration server with CPU and memory limits that are too small can result in SSL handshake errors when connecting to Kafka. If this happens, try different CPU and memory limits. For example, setting CPU limit to at least 1 and memory to at least 512Mi may help.
- Choose your BAR file.
- Enable all of the Configurations that you created.
- Name the integration and click Create.
Step 6(b) – a gotcha
If you are using an internal Kafka listener, and your App Connect integration server is running in a different namespace to the Event Streams cluster, then you may need to create a NetworkPolicy to give the integration server permission to make the connection to a different namespace.
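A NetworkPolicy for this could look something like the sketch below. All of the names, labels, and the port here are assumptions: match them to your own namespaces, the labels on your Kafka broker pods, and the port of the listener you are using.

```yaml
kind: NetworkPolicy
apiVersion: networking.k8s.io/v1
metadata:
  name: allow-ace-to-kafka        # placeholder name
  namespace: event-streams        # namespace of the Event Streams cluster
spec:
  podSelector:
    matchLabels:
      app.kubernetes.io/name: kafka   # assumed label on the Kafka broker pods
  policyTypes:
    - Ingress
  ingress:
    - from:
        - namespaceSelector:
            matchLabels:
              kubernetes.io/metadata.name: ace   # App Connect namespace
      ports:
        - protocol: TCP
          port: 9093               # port of the internal listener you chose
```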
Step 7 – Try it out!
Use `curl` to send some text to the message flow.
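For example (the hostname and path here are hypothetical — use the HTTP endpoint of your own integration server, with the path you configured on the HTTP input node):

```shell
# POST some text to the flow's HTTP input; the URL is a placeholder
curl -X POST \
     -d 'Hello, Kafka' \
     http://my-integration-server-http.apps.example.com/demo
```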
You should see the text appear in a message on your Kafka topic.
Finished! Using a grid like the one detailed here should help ensure that you get this working the first time, without a lot of time-consuming errors and debugging.