webMethods

Migrate and execute existing self-hosted Kafka implementation using DADA capabilities in webMethods.io Integration 

Mon April 21, 2025 01:12 AM

Introduction

This article explains how an existing Kafka use case can be migrated to and executed on webMethods.io Integration using the Develop Anywhere, Deploy Anywhere (DADA) capabilities.

Prerequisites

  • A webMethods.io Integration tenant with DADA capabilities enabled
  • A Kafka instance
  • An external repository
  • Service Designer on a self-hosted setup

Audience

This article is intended for technical architects and developers working with webMethods.

What is DADA?

https://community.ibm.com/community/user/integration/blogs/john-carter/2023/10/20/what-is-develop-anywhere-deploy-anywhere

What are integration runtimes?

https://community.ibm.com/community/user/integration/blogs/theo-ezell/2025/03/07/develop-anywhere-deploy-anywhere-what-are-integrat

Use case

In this use case, we discuss how to migrate Kafka packages from a self-hosted setup to the cloud and execute the transaction on webMethods.io Integration using the DADA capabilities.

  • The client submits a request to a flow service exposed on the self-hosted Integration Server.
  • Inside the flow service, events are produced and submitted to Kafka.
  • The events are published to the “Temperature” topic in the Kafka system.
  • In our case, we use a Kafka SaaS instance running on IBM Cloud.
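Conceptually, the producer-side flow service builds a temperature event and routes it to the “Temperature” topic. The following is a minimal Java sketch of that payload; the field names (`sensorId`, `value`) are illustrative assumptions, not taken from the actual package:

```java
// Minimal sketch of the event the flow service publishes to the "Temperature"
// topic. Field names (sensorId, value) are illustrative assumptions, not
// taken from the webMethodsDADA package.
class TemperatureEvent {
    static final String TOPIC = "Temperature"; // topic name from the use case

    final String sensorId;
    final double value;

    TemperatureEvent(String sensorId, double value) {
        this.sensorId = sensorId;
        this.value = value;
    }

    // JSON payload that the Kafka producer connection would publish.
    String toJson() {
        return "{\"sensorId\":\"" + sensorId + "\",\"value\":" + value + "}";
    }

    public static void main(String[] args) {
        TemperatureEvent e = new TemperatureEvent("sensor-1", 21.5);
        System.out.println(TOPIC + " <- " + e.toJson());
        // prints: Temperature <- {"sensorId":"sensor-1","value":21.5}
    }
}
```

In the real setup, this payload is produced by the flow service logic and handed to the Kafka producer connection rather than serialized by hand.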

Existing use case Architecture

Target Architecture


Steps to achieve the target architecture

  • Push the existing self-hosted assets to an external repository.
  • Pull the assets from the external repository into a webMethods.io Integration project.
  • Spin up the edge runtime.
  • Sync the assets.
  • Configure the connections and sync them to the runtime.

Push assets to the external repository

  • In our case, we have the package “webMethodsDADA”. This package contains the business logic to submit requests to the Kafka endpoint system.

         Link: https://github.ibm.com/Vikash-Sharma5/webMethodsDADA

  • We created a repository in the external system; in our case, it is a GitHub repository.


Pull the assets from repository

  • Inside the project, navigate to the Packages tab.
  • Provide the repository URL and pull the assets.
  • As this package depends on the Kafka adapter package, that dependency is pulled as well.


  • As this package requires connectivity to the Kafka system, the JAR files needed for that connectivity are placed in the static folder within the package.

Spin the edge runtime

  • Create the edge runtime where the packages imported from the repository will run.
  • Navigate to the webMethods.io Integration tenant and create an integration runtime.
  • In our case, we want to run the edge runtime on our local machine, so we installed Rancher Desktop, which can run the edge runtime image.

  • In our case, we have named the edge runtime “kafkaedgeruntime”.

  • Execute the Docker command provided when the runtime is created to run the edge runtime.

Orchestration

  • Create the flow service “SendMessageTokafka”.
  • Select the edge runtime “kafkaedgeruntime” created in the previous step.
  • Save the service.
  • Click Sync.

Note: When you sync the flow service, the asset is synced from the cloud design time (CDT) to the edge runtime (ERT).

Configure the connections

  • Click the Connectors tab and navigate to Deploy Anywhere.

  • Our package has two Kafka connections, so we see two connections: kafkaconsumer and kafkaproducer.
  • The kafkaproducer connection is used to push messages into the Kafka system.
  • The kafkaconsumer connection is used to consume messages from the Kafka system.
  • For each connection, we need to configure the runtimes and sync the connection to them.
  • In our case, we are using edge runtimes, so we selected the runtime “kafkaedgeruntime”.

  • Provide the secrets and other details. In our case, we are connecting to the same Kafka endpoints, so no other configuration is affected.
  • We only need to provide the JAAS config.


Sample JAAS config:

  • org.apache.kafka.common.security.plain.PlainLoginModule required username='YourKafkauser' password='***********';
  • After providing the details, click Sync. When the sync completes, the credentials are updated on the ERT and the connection is enabled.

Note: When you click Sync, the manifest file is updated with the credentials and other details.
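The JAAS line shown above can also be assembled programmatically. Below is a hedged Java sketch; the helper names are ours, while `sasl.jaas.config`, `sasl.mechanism`, and `security.protocol` are standard Kafka client configuration keys (verify the protocol value against your cluster's listener setup):

```java
import java.util.Properties;

// Assembles the SASL/PLAIN JAAS line in the shape shown in the article and
// places it under the standard Kafka client key "sasl.jaas.config".
// Helper names (plainJaas, clientProps) are illustrative, not from the package.
class JaasConfig {
    static String plainJaas(String user, String password) {
        return "org.apache.kafka.common.security.plain.PlainLoginModule required "
                + "username='" + user + "' password='" + password + "';";
    }

    static Properties clientProps(String user, String password) {
        Properties p = new Properties();
        // SASL_SSL is typical for a SaaS Kafka instance; confirm for your cluster.
        p.setProperty("security.protocol", "SASL_SSL");
        p.setProperty("sasl.mechanism", "PLAIN");
        p.setProperty("sasl.jaas.config", plainJaas(user, password));
        return p;
    }

    public static void main(String[] args) {
        System.out.println(clientProps("YourKafkauser", "secret")
                .getProperty("sasl.jaas.config"));
    }
}
```

In the DADA setup these values are entered in the connection configuration UI and synced to the ERT, not set in code; the sketch only shows what the resulting client settings look like.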

Test the flow service

  • Invoke the flow service.
  • This submits the message to Kafka.


Note: The above use case covers the Kafka producer, which submits events to the Kafka system.

  • In a similar fashion, we can create a Kafka listener or Kafka consumer to consume or listen for messages from Kafka.
  • In our case, we created a Kafka consumer client to manually consume messages from the Kafka system.

Architecture for consuming events from Kafka

Steps to create flow service to consume the Kafka message

  • To consume messages from Kafka, we will reuse the existing code running on our self-hosted Integration Server.
  • Create the flow service on webMethods.io Integration; let's name it “RcvMessageFromKafka”.
  • Import the package from the external repository.

  • Sync the assets and the connector in the same way as before.

  • Now run the consume-Kafka service to get the messages from the Kafka system.
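Under the hood, a Kafka consumer connection needs the standard consumer settings in addition to the security details configured earlier. The sketch below uses only the well-known Kafka consumer configuration keys; the broker address and group id are placeholders, and the helper name is ours:

```java
import java.util.Properties;

// Standard Kafka consumer configuration keys; the values here are
// placeholders, not the actual settings of the use case's connection.
class ConsumerSettings {
    static Properties consumerProps(String bootstrapServers, String groupId) {
        Properties p = new Properties();
        p.setProperty("bootstrap.servers", bootstrapServers);
        p.setProperty("group.id", groupId);             // consumer group for the listener
        p.setProperty("auto.offset.reset", "earliest"); // start from the beginning if no committed offset
        p.setProperty("enable.auto.commit", "false");   // commit only after the message is processed
        return p;
    }

    public static void main(String[] args) {
        Properties p = consumerProps("broker.example.com:9093", "temperature-consumers");
        System.out.println(p.getProperty("group.id"));
        // prints: temperature-consumers
    }
}
```

In webMethods.io Integration, these settings are managed by the Kafka adapter connection and synced to the ERT; the sketch only maps the connection fields to their underlying Kafka client keys.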


Note:

  • The GitHub repository is shared below.

 Link: https://github.ibm.com/Vikash-Sharma5/webMethodsDADA

  • The flow services used in this use case are attached.

Attachment(s)
Service for pulling the message from Kafka (zip, 5 KB), uploaded Mon April 21, 2025
Service for pushing the message to Kafka (zip, 6 KB), uploaded Mon April 21, 2025