
Enhancements in the Toolkit for IBM App Connect Enterprise 13

By Sanjay Nagchowdhury

  


IBM App Connect Enterprise 13.0.1.0 was made available on 27th September 2024. 

This article describes the enhancements that have been added to the Toolkit.

Eclipse and JRE upgrade

The level of Eclipse that is used by the Toolkit has been upgraded to Eclipse 4.31.

Eclipse



The previous Eclipse level was 4.23, which was used in IBM App Connect Enterprise 12.0.12.0. Any third-party Eclipse plugins that you use will need to be compatible with Eclipse 4.31.

The Java Runtime Environment level that is used by the Toolkit has been upgraded; the default is now Java 17.

Java Runtime



If your Integration Solution has flows that contain Java Compute nodes with code that does not run under Java 17, you can revert to using Java 1.8.

New Discovery Connector Nodes

In 13.0.1.0, we have added 30 new Discovery Connector nodes:

  • 15 New Discovery Input Message Flow Nodes:
    • Businessmap, ClickSend, Eventbrite, Front, Greenhouse, IBM Maximo, IBM Targetprocess, Magento, Marketo, Slack, Toggl Track, Wrike, Zoho Books, Zoho CRM, Zoho Recruit.

  • 15 New Discovery Request Message Flow Nodes:
    • Businessmap, ClickSend, Crystal Ball, Factorial HR, Front, Hunter, IBM Targetprocess, IBM watsonx.ai, Infobip, Toggl Track, Wrike, Zoho Books, Zoho CRM, Zoho Inventory, Zoho Recruit.

This brings the total to 158 Discovery Connector nodes, in addition to 75 other Connector nodes and 38 toolbox nodes.

Discovery Connector Nodes


Credentials Management

In IBM App Connect Enterprise v12, the wizard to create and start an Integration Server (TEST_SERVER) was extended to allow you to specify that the Integration Server should use an External Directory Vault or an Integration Server Vault.

v12 wizard

 

In IBM App Connect Enterprise v13, the Toolkit has been further enhanced so that credentials can be viewed and managed in an External Directory Vault or in an Integration Server vault.

Credentials can be created, retrieved, updated, and deleted in these vaults:

  •  External Directory Vault
  •  Integration Server vault
  •  Integration Node-owned Integration Server vault

In the Integration Explorer view, a third section called ‘External Directory Vaults’ has been added below Integration Servers and Integration Nodes. In this section, you can either connect to an existing External Directory Vault or create a new one, in your workspace or in an external location on disk. More than one External Directory Vault can be accessed. Your preferred External Directory Vault can be stored in the Eclipse preferences, so that if you restart the Toolkit, the vault is accessed automatically using the stored password.

External Directory Vault


The location of each vault is shown under External Directory Vaults. Under each vault, the credentials are listed, grouped by credential type. When you click on a credential, its properties are shown; any properties that are secret fields are not revealed in the Value column.

Credentials

 

If you right-click on the vault or on a credential type, you can select an option to Create a credential.

Create a credential

 

A window is shown where you specify the credential name, credential type, and authentication type. There are many credential types, so you can start typing letters to reduce the number of credential types to select from.

When entering properties that are secret fields, the value that you are typing is not shown by default. If you wish to see it, select the ‘Show password while typing’ checkbox. When you move to the next property, the secret field is hidden again. The Finish button is enabled after all required fields have been supplied.

Finish button enabled

 

If you right-click on an individual credential, you can choose a menu option to update or delete the credential.

update or delete credential

 

As well as accessing credentials under an External Directory Vault, you can also manage credentials that are in an Integration Server Vault.

An Integration Server can use several different types of Credential Providers. These include:

  • mqsisetdbparms credentials.
  • Server credentials.
  • Integration Server vault.
  • Integration Node vault.
  • External Directory vault.
  • External credential provider.

Every Integration Server that you connect to now shows a ‘Credentials’ child, which is a peer of the other deployed artefacts such as Applications and Policy Projects. Each Credential Provider that is being used by the Integration Server is shown in the Properties view when you click on ‘Credentials’.

Credential Providers

 

Credentials that have a dynamic credential type can be updated for local or remote Integration Servers. Credentials that have a static credential type can be viewed, but not updated, when they are accessed for a remote Integration Server.

Credentials that have a static credential type and are accessed for an Integration Server that was started from the Toolkit can be updated, but doing so requires the Integration Server to be restarted.

Below is an example of updating an sftp credential, which is a static credential type, showing that it requires an Integration Server restart. This can only be done if the Integration Server was created and started locally from the Toolkit; it cannot be done for a remote Integration Server.

Update credential

 

Patterns Gallery

In 13.0.1.0, a restyled Patterns Gallery has been added, with a look and feel very similar to the Tutorials Gallery. At first glance, Patterns and Tutorials can seem similar: both are helpful for new users, or for anyone who would like to get a solution up and running quickly by starting from a working example.

A Tutorial provides a fixed set of resources to show a running example.
A Pattern presents you with a set of choices that allow the generated resources to be customized to fit the way you want to use them.

Patterns are reusable solutions that encapsulate a tested approach to solving a common architecture, design, or deployment task in a particular context. Patterns are helpful because they:

  • Generate customized solutions to a recurring integration problem in an efficient way.
  • Encourage adoption of preferred techniques in message flow design.
  • Help guide developers who are new to the product.
  • Provide consistency in the generated resources.

Patterns have been categorized as:

  • Format Transformation Patterns.
  • Protocol Transformation Patterns.
  • Enterprise Integration Patterns.
  • Messaging Patterns.
  • Scatter-Gather Patterns.

Using the Patterns Gallery, you can filter the list of patterns by selecting a tag, a category, or both.

Patterns

 

After you select a pattern, an overview of the message flow is displayed with a brief description.

message flow overview

 

Click on the Start button to install the pattern, and then configure an instance of it so that it appears in your workspace.

You will need to select the button to trust the bundle that will be installed.

trust the bundle

 

After the pattern has been installed, you will be shown a screen to create a new instance of the pattern in your workspace. Click on the ‘Create New Instance’ button.

Create new instance

 

Give the pattern instance a name. In this example, I have specified a name of ‘JSONtoXMLApp’.

Pattern instance name

 

A set of configuration steps is now shown, which guides you through the configuration that will be done on the nodes in the message flow. Change the values as required and click on the Next button after each step.

Configuration steps

 

After you reach the end, the Generate button is enabled.

Generate button displayed

 

The pattern instance is added to your workspace, and instructions are shown for testing the generated pattern instance.

Instructions to test

 

JSONata Mapping node

A JSONata Mapping node has been added to make it easier to generate a JSON message to send to a backend, for example a file or an MQ queue, or perhaps as a response message for a REST API. The node conveniently uses the same JSONata mapping that is available in the Discovery Connector wizard, so it is familiar to use. JSONata is a lightweight query and transformation language specifically designed for interacting with JSON data. It has built-in operators and functions for manipulating and combining data. You can apply JSONata mappings to fields in the message that is being built. The JSONata Mapping node requires JSON schemas in order to be used.

Consider this flow, which uses a GitHub Request node to retrieve the specific issue whose number is specified in an input message sent to the HTTP Input node:

Flow

 

If a message like this is sent into the flow:

message


The output from the GitHub Request node contains a very large amount of JSON data:

JSON data

 

The JSONata Mapping node can be used to choose specific parts of the output to be sent to the HTTP Reply node.

There are three things that are needed:

  1. A schema for the input data, so that mappings can be done.
  2. The schema added to the Map Inputs table of the JSONata Mapping node.
  3. The mappings configured in the JSONata mapping window.
3 requirements

In our example, we are using the GitHub Request node to retrieve details of an issue. When the node is configured using the Discovery Connector wizard, a response schema is stored in the workspace in the same Application project as the message flow.

response schema saved

 

The response schema can be used for the JSONata mapping. To do that, the schema must be added to the Map Inputs table of the JSONata Mapping node.

add schema

 

After adding the schema to the Map Inputs table, the fields in the schema can be used for mapping.

schema used for mapping

 

As a result, specific fields can be selected so that a trimmed down JSON message can be returned by the message flow.

returned JSON message

 

message
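
The mapping itself is configured graphically in the JSONata mapping window, but for readers who are more used to trimming messages by hand, a rough ESQL equivalent of what this example achieves is sketched below. This is not what the JSONata Mapping node generates; the field names (number, title, state) are assumptions based on the GitHub issues API, so adjust them to match the response schema in your workspace.

  CREATE COMPUTE MODULE TrimIssueResponse
    CREATE FUNCTION Main() RETURNS BOOLEAN
    BEGIN
      -- Copy selected fields from the full GitHub issue response into a
      -- trimmed JSON output message. The field names are assumptions based
      -- on the GitHub issues API, not taken from the article.
      SET OutputRoot.JSON.Data.number = InputRoot.JSON.Data.number;
      SET OutputRoot.JSON.Data.title  = InputRoot.JSON.Data.title;
      SET OutputRoot.JSON.Data.state  = InputRoot.JSON.Data.state;
      RETURN TRUE;
    END;
  END MODULE;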

 

ESQL Content-Assist Improvements

While using the ESQL editor for the Compute node, you can now access parts of the LocalEnvironment tree by using Ctrl+Space, as shown in the screenshots below.

ESQL editor

 

ESQL editor
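
As an illustration, the statements below are the kind of LocalEnvironment references that the new content-assist can complete. This is a minimal sketch rather than anything from the article: the file name and URL are placeholder values, and the File and HTTP folders shown are the standard LocalEnvironment destination overrides.

  CREATE COMPUTE MODULE SetDestinationOverrides
    CREATE FUNCTION Main() RETURNS BOOLEAN
    BEGIN
      -- Pass the message body through unchanged.
      SET OutputRoot = InputRoot;
      -- Destination overrides of the kind that Ctrl+Space can now offer.
      -- The Compute node's Compute mode property must include
      -- LocalEnvironment for these settings to be propagated.
      SET OutputLocalEnvironment.Destination.File.Name = 'issue.json';
      SET OutputLocalEnvironment.Destination.HTTP.RequestURL = 'http://localhost:7800/api';
      RETURN TRUE;
    END;
  END MODULE;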

 

In addition, content-assist is now available in ESQL for JSON schemas. The content-assist can differentiate between different JSON schemas, and it can also use JSON schemas that are in a referenced shared library.

Content-assist
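
For example, with a JSON schema deployed with the application (or in a referenced shared library), content-assist can propose the field names under JSON.Data while you type statements such as the sketch below. The Customer fields used here (id and name) are hypothetical and only for illustration.

  CREATE COMPUTE MODULE BuildCustomerReply
    CREATE FUNCTION Main() RETURNS BOOLEAN
    BEGIN
      -- Build a JSON reply; with a JSON schema available, content-assist can
      -- suggest the field names as you type. 'id' and 'name' are hypothetical
      -- fields, not taken from the article.
      SET OutputRoot.JSON.Data.id   = 1001;
      SET OutputRoot.JSON.Data.name = 'Example Customer';
      RETURN TRUE;
    END;
  END MODULE;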

 

Generate testcase for a node in a subflow

Improvements have been made to the unit test generation wizard. If you are in a recorded subflow using the Flow Exerciser, you can now select a node in the subflow and use the menu option to generate a testcase for that node.

The generated testcase will contain code which references the node in the subflow.

Generated test case

 

Disconnect option

A ‘Disconnect’ menu option is now available in V13 for remote Integration Servers, remote Integration Nodes, and External Directory Vaults. The Server, Node, or Vault remains in the list but has a grey icon and a ‘(disconnected)’ label beside its name. You can reconnect to the Server, Node, or Vault by using the ‘Connect’ menu option.

Disconnect menu option

 

Terminal Window

A button has been added to the menu bar to open a terminal within the Toolkit.

button

 

You can also access it using Window->Show View->Other… and then selecting Terminal.

access button

 

A Launch Terminal window is shown. If you select ‘Local Terminal’, a console terminal opens.

Launch Terminal window

 

The terminal window inherits the environment that was used to launch the Toolkit. This means that all of the ACE commands can be used without needing to set up your PATH, which makes it handy to run commands without having to minimise the Toolkit and open a separate console window.

Toolkit

 
