The 10.0.3.0-ifix1 release of Event Endpoint Management has been enhanced to support the SCRAM-SHA-512 and SCRAM-SHA-256 SASL mechanisms when preparing to socialize your Kafka topics. This enhancement allows Event Endpoint Management to authenticate with a wider range of Kafka clusters.
In this blog, I will show you how to create SASL credentials that use these mechanisms in two popular Kafka offerings, IBM Event Streams and Strimzi, and how these credentials can be effortlessly used in IBM Event Endpoint Management. I will also describe the recommended level of authorization to grant your credentials, by using Kafka ACLs, when using Event Endpoint Management.
Prerequisites
This guide assumes that you have already deployed an instance of either IBM Event Streams or Strimzi, that the Kafka brokers' server properties have an authorizer.class.name defined, and that you have a topic you would like to socialize via Event Endpoint Management. For the purposes of this blog, I have a topic called weather-data.
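In both offerings (Event Streams incorporates Strimzi), the authorizer is typically enabled through the Kafka custom resource rather than by editing server.properties directly. A minimal sketch, assuming a hypothetical cluster named my-cluster (the exact apiVersion varies by operator version):

    apiVersion: kafka.strimzi.io/v1beta2
    kind: Kafka
    metadata:
      name: my-cluster
    spec:
      kafka:
        # Enabling simple authorization is the Strimzi equivalent of setting
        # authorizer.class.name in the broker's server.properties.
        authorization:
          type: simple
        # ... listeners, storage, and other cluster configuration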
To create a credential in IBM Event Streams that can be used with IBM Event Endpoint Management, you must have at least one listener configured with scram-sha-512 authentication.
This guide also assumes that you have already deployed an instance of IBM Event Endpoint Management, with an Event Gateway and Portal service deployed, registered, and configured in a Catalog. For information about installation, registration, and configuration, see the Event Endpoint Management documentation.
Familiarity with Kafka Access Control Lists (ACLs) is also assumed. An introduction to Kafka ACLs can be found on IBM Developer.
Creating a SASL credential
Both IBM Event Streams and Strimzi provide a mechanism to easily create Kafka ACLs. In each solution, you can define which Kafka resources you would like to grant access to, and generate a username and password (credential) that grants access to those resources. We will provide this credential to Event Endpoint Management so the Event Gateway Service can authenticate and authorize itself with your Kafka cluster.
The minimum permissions required for a credential used by the Event Gateway Service to connect to and access Kafka events in your cluster are the ability to Read your topic, and to allow all consumer groups specified by clients using this credential to Read as well.
Note: We recommend that a topic- and principal-specific 'read-only' credential is created for use with Event Endpoint Management.
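Expressed as Strimzi-style simple authorization rules (the representation both offerings use, as we will see below), this minimum permission set looks roughly like the following sketch:

    acls:
      # Allow the Event Gateway Service to read the topic being socialized
      - resource:
          type: topic
          name: weather-data
          patternType: literal
        operation: Read
        host: "*"
      # Allow any consumer group to read; this is tightened later in
      # "Restrict the consumer group read ACL"
      - resource:
          type: group
          name: "*"
          patternType: literal
        operation: Read
        host: "*"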
Creating a credential in IBM Event Streams
IBM Event Streams offers a simple wizard to create Kafka ACLs for your applications. As IBM Event Streams incorporates Strimzi, these Kafka ACLs are represented as a KafkaUser. A KafkaUser is a Kubernetes custom resource managed by Event Streams. When configured, the KafkaUser creates ACLs in the Kafka cluster managed by Event Streams, and generates a Kubernetes secret containing the credential to use when connecting to your cluster.
We will use this wizard to create a SCRAM-SHA-512 credential and a KafkaUser that Event Endpoint Management can use as the application.
Log in to the Event Streams UI, click the Connect to this cluster tile, and then click the Generate SCRAM credentials button next to the listener you want Event Endpoint Management to connect to:
This will launch the wizard to create a credential. Define the name of your credential and the permissions you want to grant (consume only for the purposes of this guide):
Click Next and specify the topic this credential allows access to. In this example, it's only one topic, weather-data:
Click Next. You could choose to restrict consumer group access, but for now, allow all consumer groups to have access:
Finally, complete the wizard by clicking Generate credentials, with no transactional IDs required:
After you complete the wizard, a KafkaUser and a Kubernetes secret containing the credential are generated. The generated username and password are presented on screen, ready for you to use:
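For reference, the generated KafkaUser looks something like the following sketch. The names used here are hypothetical (eem-weather-consumer for the credential and my-es-cluster for the Event Streams instance), and the exact apiVersion depends on your Event Streams version:

    apiVersion: eventstreams.ibm.com/v1beta1
    kind: KafkaUser
    metadata:
      name: eem-weather-consumer                     # the credential name chosen in the wizard
      labels:
        eventstreams.ibm.com/cluster: my-es-cluster  # the Event Streams instance it belongs to
    spec:
      authentication:
        type: scram-sha-512
      authorization:
        type: simple
        acls:
          # Read-only access to the topic selected in the wizard
          - resource:
              type: topic
              name: weather-data
              patternType: literal
            operation: Read
          # Read access for all consumer groups, as selected in the wizard
          - resource:
              type: group
              name: "*"
              patternType: literal
            operation: Read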
Make a note of both of these values, and continue to Creating your API.
Creating a credential in Strimzi
Strimzi provides a KafkaUser to manage access to Kafka resources. A KafkaUser is a custom resource that creates ACLs in the Kafka cluster managed by Strimzi, and generates a Kubernetes secret containing the credential to use when connecting to Kafka. In this case, we will provide the generated credential to Event Endpoint Management.
Go to the OpenShift Container Platform UI, locate your Strimzi operator, and create a KafkaUser by clicking the Kafka User tab, and then clicking Create KafkaUser.
Configure a KafkaUser to use with Event Endpoint Management.
The Name you enter will be the username you provide to Event Endpoint Management when creating or editing your API.
Note: The only Strimzi Authentication option compatible with Event Endpoint Management is scram-sha-512.
Modify the ACLs associated with this KafkaUser by using the YAML view to edit the authorization.acls configuration:
As shown in the previous screen capture, the initial KafkaUser is populated with placeholder Kafka ACL configuration to create, describe, read, and write to the topic my-topic and consumer group my-group. Use the YAML editor to update the configuration to provide Read access to my topic, weather-data. Also, allow Read access to any consumer group, as shown in the following screen capture.
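After the edit, the spec of the KafkaUser should look something like this sketch (following the Strimzi simple authorization schema; older Strimzi versions may use a different apiVersion or field layout):

    spec:
      authentication:
        type: scram-sha-512
      authorization:
        type: simple
        acls:
          # Read-only access to the topic being socialized
          - resource:
              type: topic
              name: weather-data
              patternType: literal
            operation: Read
            host: "*"
          # Read access for any consumer group (restricted further later in this guide)
          - resource:
              type: group
              name: "*"
              patternType: literal
            operation: Read
            host: "*"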
Click Create to create the KafkaUser. When the KafkaUser is created and ready, the following is displayed:
When the KafkaUser is created, a secret containing the required credentials is also generated. To view the secret, click the name of the KafkaUser, and then go to the Resources tab:
The name of the secret is the Username you provide to Event Endpoint Management. To retrieve the Password for the secret, click the secret, go to the Data section, and copy the password value:
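The secret follows the standard Kubernetes layout. A sketch, again using the hypothetical name eem-weather-consumer; the raw password value is stored base64-encoded, and the console's Data section reveals the decoded value to copy:

    apiVersion: v1
    kind: Secret
    metadata:
      name: eem-weather-consumer   # the secret name doubles as the SASL username
    type: Opaque
    data:
      password: <base64-encoded-password>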
Make a note of both of these values, and continue to Creating your API.
Creating your API
After creating a SCRAM-SHA-512 credential, you can create a new API in Event Endpoint Management that makes use of the credential to securely socialize your Kafka topic.
Log in to Event Endpoint Management and create an API by going to the Develop page, clicking Add, and selecting AsyncAPI (from Kafka topic). This launches a flow where you can provide details of your Kafka cluster and the topic you want to socialize.
As part of this flow, the Cluster connection security section defines the configuration used by the Event Gateway Service to connect to your Kafka cluster. This is where you can now make use of the SCRAM-SHA-512 credential you created.
While providing details in the Cluster connection security section, select either SASL_PLAINTEXT (if your Bootstrap Server address does not require TLS) or SASL_SSL (if your Bootstrap Server address requires TLS) as your Security Protocol. You can then specify the SASL username and password generated previously, as well as the SASL Mechanism these credentials use.
Note: If your Bootstrap Server address requires TLS and does not use a certificate signed by a well-known certificate authority (CA), you can provide a .pem file as the Transport CA certificate to use when connecting to your Kafka cluster.
Continue through the flow and create your new API. After the API is created and published to a catalog with a configured Event Gateway Service, any applications making use of this API through the Event Gateway Service will only be able to interact with your cluster and topic as defined in the ACLs you created.
In addition, having created your new API, you can now further restrict the consumer group ACL you created, as described in Restrict the consumer group read ACL.
You can also edit new or existing APIs, including which security credentials are used, when your API is published to an Event Gateway Service.
Important: If you created an API in Event Endpoint Management version 10.0.3.0-ifix1 or earlier, you can use the AsyncAPI editor to update your API to make use of the SCRAM-SHA-512 and SCRAM-SHA-256 SASL mechanisms.
Restrict the consumer group read ACL
While creating the SASL credential for both Event Streams and Strimzi in this guide, we restricted access to a named topic that could only be read from. We also configured consumer group access so that any consumer group could read from Kafka when a client used this credential.
You can restrict consumer group access further so that only consumer groups associated with your Event Endpoint Management API can consume with this credential.
Each AsyncAPI created in Event Endpoint Management has a unique ID called the Cluster Config ID. You can view this ID in the AsyncAPI editor under Gateway > Invoke.
This ID is used by the Event Gateway Service to differentiate between APIs. It is also used to prefix any group.id, so clients using your API can be uniquely identified. As this is used as a prefix, you can update the consumer group ACL to only allow consumer groups with this ID as a prefix to read from your Kafka cluster.
Both IBM Event Streams and Strimzi create KafkaUser resources to represent Kafka ACLs, and you can edit these resources to add this ID as a prefix.
Go to the KafkaUser you created earlier, and select the YAML view. Update the type: group resource so that the name is the Cluster Config ID of your API and the patternType is prefix, and save your changes.
This will update the ACLs applied to your cluster, while using the same credentials generated previously.
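A sketch of the updated group rule, using a hypothetical Cluster Config ID of a1b2c3d4-5678-90ab-cdef-1234567890ab:

    - resource:
        type: group
        name: a1b2c3d4-5678-90ab-cdef-1234567890ab  # the Cluster Config ID of your API
        patternType: prefix                         # match any group.id starting with this ID
      operation: Read
      host: "*"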
For information about the format of the group.id used by the Event Gateway Service when connecting to your cluster, see the example about restricting access for a consumer group.
Summary
This blog shows how to create SCRAM-SHA-512 SASL credentials in two popular Kafka offerings, including recommended Kafka ACLs, and how these credentials can be quickly and easily used by IBM Event Endpoint Management to securely socialize your Kafka topics with application developers. This allows even more Kafka clusters to gain the value provided by socializing and sharing your Kafka topics in a self-service manner, while maintaining a granular and controlled level of access to the cluster and its data.
For more information about IBM Event Endpoint Management, see the Cloud Pak for Integration documentation.