DataPower

Part 1: Kafka Support in DataPower

By Krithika Prakash posted Thu December 10, 2020 05:27 PM

  
Hi, I'm Krithika Prakash, Senior Technical Staff Member (STSM) on the IBM API Connect & DataPower Product Development team.

I presented DataPower & API Connect Gateway features at the IBM Americas Hursley Summit held in November 2020, and there were follow-up questions on this specific topic of Kafka support in DataPower. I have covered the details of the support in this article series, which I hope answers the questions from our customers and business partners. My full presentation from the summit can be accessed here.

In this two-part article, I cover the following:
1) In Part 1, I give an overview of the main Kafka features supported in DataPower and how they can be used in a variety of use cases involving Kafka traffic.
2) In Part 2, I provide a step-by-step, hands-on tutorial on how DataPower can be configured for a specific Kafka security use case.

DataPower Gateway Kafka support is available in v10. The documentation can be accessed here.

DataPower can be configured as a Kafka client: either as a Kafka consumer, reading/subscribing to messages from a Kafka topic, or as a Kafka publisher, pushing messages to Kafka topics.

DP as a Kafka Consumer:
As a Kafka consumer, DataPower reads messages by using a Kafka Front Side Handler in a Multi-Protocol Gateway (MPGW). Once the messages are read by a handler in DataPower, the full spectrum of MPGW features is at your disposal to act on them. For example, you can transform the message or add security to it (say, sign and/or encrypt the message); pretty much any action/policy that you can use in the processing policy of the DataPower Gateway can be applied here.

DP as a Kafka Publisher:

As a Kafka publisher, DataPower can be configured to perform a "url-open" using the dpkafka:// scheme, specifying the topic to which messages are published. The url-open function can be used either in a GatewayScript action within the processing policy of an MPGW, or directly as the "Backend URL" in the MPGW configuration.
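As a rough sketch of the GatewayScript route, the following publishes the incoming payload to a Kafka topic via url-open. This runs only on the DataPower runtime; the Kafka Cluster object name ("my-kafka-cluster") and topic ("orders") are hypothetical examples, and the exact dpkafka:// query parameters should be checked against the v10 documentation.

```javascript
// GatewayScript sketch (DataPower runtime only).
// Assumes a Kafka Cluster object named "my-kafka-cluster" is configured,
// and a topic "orders" exists -- both names are placeholders.
var urlopen = require('urlopen');

session.input.readAsBuffer(function (readError, payload) {
  if (readError) {
    session.reject('Unable to read request payload: ' + readError);
    return;
  }
  // Publish the payload to the "orders" topic through the
  // configured Kafka Cluster object.
  urlopen.open({
    target: 'dpkafka://my-kafka-cluster/?request-topic=orders',
    data: payload
  }, function (openError, response) {
    if (openError) {
      session.reject('Kafka publish failed: ' + openError);
    } else {
      session.output.write('Message published');
    }
  });
});
```

Using the dpkafka:// URL directly as the MPGW Backend URL achieves the same result without any scripting.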

By using these consumer/publisher capabilities individually or in combination, you can implement a wide variety of use cases for proxying Kafka traffic.

Scenario 1: Protocol Translation

You can use DataPower to translate to and from Kafka protocol.

- Read using the Kafka protocol on the front side and send to an HTTP server on the back side
(or)
- Read using HTTP on the front side and publish to a Kafka topic on the back side
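For the second direction (HTTP in front, Kafka in back), no scripting is required; the translation can be expressed entirely in the MPGW configuration. A minimal sketch, where the handler name, port, cluster object name, and topic are all hypothetical:

```
Multi-Protocol Gateway (sketch):
  Front Side Handler:  HTTP Handler listening on port 8080
  Backend URL:         dpkafka://my-kafka-cluster/?request-topic=orders
```

The reverse direction swaps the two ends: a Kafka Front Side Handler in front and an http:// or https:// Backend URL pointing at the HTTP server.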


Scenario 2: Secure communication between DataCenters

You can use DataPower to secure connections between two data centers. For example, you can have a Kafka cluster and a DataPower in Data Center 1 (DC1), and a similar infrastructure in Data Center 2 (DC2).

In DC1:
- Configure the DataPower1 MPGW to read/consume from a Kafka topic using a Kafka Front Side Handler.
- Add security to the message: configure a JWE Encrypt action in the processing policy of the MPGW.
- Send the message over HTTPS to DataPower2 in DC2.

In DC2:
- Configure the DataPower2 MPGW to receive the request using an HTTPS Front Side Handler.
- Remove the security: add a JWE Decrypt action in the processing policy of the MPGW. (Note: the keys between the DataPowers in DC1 and DC2 have to be synchronized in order to perform the crypto operations.)
- Publish the message to the Kafka topic in DC2.
 
In Part 2 of the article, I walk through a step-by-step demo of how to configure DataPower to secure communication between two data centers.

I hope you find this article series useful. Let us know if you have any feedback or comments. Thank you!


Comments

Wed February 03, 2021 11:55 AM

I have a question about when DataPower is acting as a consumer. If we have multiple DataPower devices that listen to the same Kafka server, how does DataPower make sure that the same message won't be consumed by both devices?
I saw there is an option for consumer groups, so can this help solve the above?