Introducing Batch Flows in IBM App Connect Enterprise Certified Container

By Cameron Roberts posted Mon March 17, 2025 05:07 AM

  

Introducing batch flows in IBM App Connect Certified Container (ACEcc)

Complete batch flow with several nodes (Salesforce, IBM Cloudant and Slack)
Batch has been available in our as-a-service offering for a while, but until now we hadn't done the engineering work to enable it in our container offering. We have recently prioritised that work and can now announce that batch flows are enabled for our container offering.


Batch processing enables you to:

  • Load large numbers of records from hosted applications into IBM App Connect Enterprise.
  • Manipulate records individually, using a range of IBM App Connect Enterprise connectors and toolbox nodes.
  • Perform actions when all of the records have been processed, using other connector and toolbox nodes.
  • View and monitor the status of batches during authoring, and then prepare and run the batches in a production workload.

Creating a flow that contains a batch process is largely the same as in our as-a-service offering (ACEaaS), with some minor adjustments. First of all, we need to create a Configuration with Redis credentials, which will be used for persistence.

For the purpose of this demo we're using an IBM Cloud Databases for Redis instance.

Create Redis configuration


Take the following YAML and populate the values with the credentials from your Redis instance:

accounts:
  rediscache:
    - name: Account 1
      authType: BASIC
      credentials:
        host: abcde123-f4g5-6h7j-9876-klm54.fjfk64yrir57dcc.databases.appdomain.cloud
        port: "30663"
        username: ibm_cloud_d7123456_00b8_239c_f65e_ab1234cd5678
        password: 1234qOlflk567r8uBheda99yzLKVtryuquiu543ythD
        databaseNumber: "0"
        tlsRejectUnauthorized: true
        tlsCa: '"LABCS0tLS13CRUsdJTiBQUklWS0tLS0tCk1JSUV2d0lCQURBTkJna3QVRFIEtFWFoa2lHOXcwQkFRRUZBQVNDQktrd2dnUo=","name":"1234c347-876a-12e9-b1a6-c99ac0347fc5"'
      endpoint: {}

Save that YAML to a file; let's call it redis-credentials.yaml.

We now need to base64 encode the redis-credentials.yaml file for use in our Configuration object.

Run the following command and copy the output:

cat redis-credentials.yaml | base64
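One caveat worth knowing: on Linux, GNU `base64` wraps its output at 76 characters by default, and a wrapped value pasted into the data field will not decode correctly. Passing `-w 0` keeps the encoded string on a single line (on macOS, plain `base64` already emits unwrapped output). A quick round-trip check, using a stand-in file in place of your real credentials:

```shell
# Stand-in for the real redis-credentials.yaml (contents abbreviated here).
printf 'accounts:\n  rediscache: []\n' > redis-credentials.yaml

# -w 0 disables line wrapping (GNU coreutils), giving a single-line string
# suitable for the Configuration's data field.
base64 -w 0 redis-credentials.yaml > redis-credentials.b64

# Sanity check: decoding the encoded file should reproduce the original.
base64 -d redis-credentials.b64 | cmp -s - redis-credentials.yaml && echo "round-trip OK"
```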

Add your base64-encoded redis-credentials.yaml into the data field of the Configuration. Below you can see a populated version, which we can apply to our OpenShift Container Platform (OCP) cluster.

apiVersion: appconnect.ibm.com/v1beta1
kind: Configuration
metadata:
  name: redis
  namespace: ace-cam
spec:
  type: persistencerediscredentials
  data: YWNjb3VudHM6CiAgcmVkaXNjYWNoZToKICAgIC0gbmFtZTogQWNjb3VudCAxCiAgICAgIGF1dGhUeXBlOiBCQVNJQwogICAgICBjcmVkZW50aWFsczoKICAgICAgICBob3N0OiBhYmNkZTEyMy1mNGc1LTZoN2otOTg3Ni1rbG01NC5mamZrNjR5cmlyNTdkY2MuZGF0YWJhc2VzLmFwcGRvbWFpbi5jbG91ZAogICAgICAgIHBvcnQ6ICIzMDY2MyIKICAgICAgICB1c2VybmFtZTogaWJtX2Nsb3VkX2Q3MTIzNDU2XzAwYjhfMjM5Y19mNjVlX2FiMTIzNGNkNTY3OAogICAgICAgIHBhc3N3b3JkOiAxMjM0cU9sZmxrNTY3cjh1QmhlZGE5OXl6TEtWdHJ5dXF1aXU1NDN5dGhECiAgICAgICAgZGF0YWJhc2VOdW1iZXI6ICIwIgogICAgICAgIHRsc1JlamVjdFVuYXV0aG9yaXplZDogdHJ1ZQogICAgICAgIHRsc0NhOiAnIkxBQkNTMHRMUzEzQ1JVc2RKVGlCUVVrbFdTMHRMUzB0Q2sxSlNVVjJkMGxDUVVSQlRrSm5hM1FWUkZJRXRGV0ZvYTJsSE9YY3dRa0ZSUlVaQlFWTkRRa3RyZDJkblVvPSIsIm5hbWUiOiIxMjM0YzM0Ny04NzZhLTEyZTktYjFhNi1jOTlhYzAzNDdmYzUiJwogICAgICBlbmRwb2ludDoge30=

Save the above Configuration to a file named redis-configuration.yaml, swapping out the data field for the base64-encoded string you just created.

To apply this to a cluster, run:

oc apply -f redis-configuration.yaml
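To confirm the Configuration was created, you can list it back from the cluster (this requires a live cluster and the namespace used in the YAML above; the columns shown vary by operator version):

```shell
oc get configuration redis -n ace-cam
```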

Create DesignerAuthoring with batch flows enabled


Now that we've created a Configuration with our Redis credentials, we can go ahead and create our instance of DesignerAuthoring with the batch functionality enabled. The YAML for that is shown below.

apiVersion: appconnect.ibm.com/v1beta1
kind: DesignerAuthoring
metadata:
  name: des-01-quickstart-ma
  labels:
    backup.appconnect.ibm.com/component: designerauthoring
  namespace: ace-cam
spec:
  license:
    accept: true
    license: L-KPRV-AUG9NC
    use: AppConnectEnterpriseProduction
  couchdb:
    storage:
      size: 10Gi
      type: persistent-claim
    replicas: 1
  designerMappingAssist:
    incrementalLearning:
      schedule: Every 15 days
    enabled: true
  authentication:
    integrationKeycloak:
      enabled: false
  authorization:
    integrationKeycloak:
      enabled: false
  batch:
    configurations:
      - redis
    enabled: true
  designerFlowsOperationMode: local
  replicas: 1
  version: '13.0'

You can save the above to a file called designer-authoring.yaml.

To apply that to a cluster, run the following command:

oc apply -f designer-authoring.yaml
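Before moving on, it's worth waiting for the instance to come up. One way to watch it (again, cluster-dependent; the exact STATUS values reported depend on the operator version):

```shell
oc get designerauthoring des-01-quickstart-ma -n ace-cam -w
```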

Create an event-driven flow


Now that we've completed our setup steps, we should have an instance of DesignerAuthoring with batch enabled. To grab the URL for the instance, run the following command:

oc get designerauthoring des-01-quickstart-ma -ojsonpath='{.status.uiUrl}'


From the homepage, click "Create an event-driven flow" to get started.

Configure flow with batch node


Note: you will need to provide account details for the connectors you wish to use in your flow (this example uses Salesforce, IBM Cloudant and Slack).

Now that you're in the flow editor canvas, you can create a flow that contains a batch. You have a range of connectors to choose from, so pick whichever satisfies your use case. For the purpose of this blog I'm going to keep it simple: we're going to create a batch process that performs the following actions.

- Get contacts from Salesforce (at 10-minute intervals)
- Create a document in IBM Cloudant for each contact
- Notify of batch process completion via Slack

As you can see from the screenshot above, we have now completed our batch flow. Now we want to turn the flow on to begin processing records; as we've gone with an event-driven flow, this is as simple as hitting the "Start flow" button.
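The three steps above follow a common batch shape: fetch a set of records, process each record individually, then run a single completion action with a summary. As a rough stand-alone sketch only (plain shell, with a local file standing in for Cloudant, `echo` standing in for Slack, and made-up contact names — none of this is App Connect tooling):

```shell
# Hypothetical stand-ins: a local file plays the Cloudant database and
# echo plays the Slack notification; the contact names are invented.
rm -f cloudant-docs.txt
ok=0; bad=0
for contact in Ada Grace; do
  # "Process" one record: create a document for this contact.
  if echo "$contact" >> cloudant-docs.txt; then
    ok=$((ok+1))
  else
    bad=$((bad+1))
  fi
done
# Completion action: report how many records succeeded and failed.
echo "Batch done: $ok succeeded, $bad failed"
```

In the real flow, App Connect tracks these per-record outcomes for you and fires the completion branch (our Slack node) once the last record has been processed.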

Output message to confirm batch has finished


Now, with any luck, if we head over to our instance of Slack we'll have a message confirming that our batch process started and finished, along with a report of how many records were processed successfully.

Additional reading

  • https://community.ibm.com/community/user/integration/blogs/samuel-may/2023/10/04/introducing-batch-flows-in-ibm-app-connect-enterpr
  • https://community.ibm.com/community/user/integration/blogs/paul-thorpe/2024/01/18/deploying-and-monitoring-batch-flows