Forwarding a live feed of IMS Connect events to Splunk

Tue November 23, 2021 02:35 PM

Live-streaming IMS Connect transaction performance data as JSON Lines to a Splunk TCP data input


Overview

Skill Level: Any Skill Level

You need to know how to use z/OS ISPF and Splunk: how to edit and submit JCL, how to view batch job output, how to edit Splunk configuration files, and how to perform searches in Splunk.

IBM IMS Connect Extensions for z/OS can capture events from running IMS Connect systems, consolidate the events into one record per transaction, and then forward the consolidated records as JSON Lines over TCP to a remote Splunk system.
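
Each line in the stream is a self-contained JSON object that describes one transaction. For illustration only, two consecutive records might look like this (the values here are invented examples; a real example of the feed output appears in step 3):

    {"time":"2018-11-01T15:25:03.123456Z","type":"ims-ca20","hwsname":"ICONP01","trancode":"TRNA","resptime":0.654321}
    {"time":"2018-11-01T15:25:04.234567Z","type":"ims-ca20","hwsname":"ICONP01","trancode":"TRNB","resptime":0.123456}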

Ingredients

You will need:

  • Running IBM IMS Connect systems.
  • IBM IMS Connect Extensions for z/OS. This recipe uses the IMS Connect Extensions feed, introduced in IMS Connect Extensions V3.1.
  • Splunk.

Step-by-step

  1. Activate the IMS Connect Extensions publisher API

    Figure: Flow of IMS Connect transaction index to Splunk


    The IMS Connect Extensions feed is a client of the IMS Connect Extensions publisher API. The feed uses the publisher API to get IMS Connect events.

    For each IMS Connect system that you want to use as the feed source, select the Activate Publisher API option in the IMS Connect Extensions ISPF dialog.

    For details, see the procedure for starting a feed in the IMS Connect Extensions documentation in IBM Knowledge Center.

  2. Create a Splunk TCP data input

    To ingest JSON Lines sent via TCP, you need to configure a Splunk TCP data input that breaks each line of the stream into a separate event, recognizes event time stamps, and specifies the event data format as JSON.

    The following Splunk configuration stanzas define a minimal configuration for ingesting JSON Lines over TCP: one stanza in inputs.conf, and one in props.conf.

    Depending on your own site practices, you might perform additional Splunk configuration, such as assigning different source types, routing events to different indexes, or using secure TCP.

    Location of Splunk configuration stanzas

    This recipe refers to Splunk configuration (.conf) file names, but not directory paths. It is your decision where to store the Splunk configuration stanzas.

    For example, you might choose to create a Splunk application directory named your-organization-cex specifically for this IMS Connect Extensions feed configuration, and save the configuration files there:

    $SPLUNK_HOME/etc/apps/your-organization-cex/local/*.conf

    inputs.conf

    The following stanza in inputs.conf defines an unsecured TCP input (no SSL/TLS) that listens on port 1515, assigns the source type “ims-ca20” to all incoming events, and stores the events in the default index (typically, main):

    [tcp://:1515] 
    sourcetype = ims-ca20

    The port number and source type shown here are examples only. The actual values are your choice.

    The example source type “ims-ca20” matches the value of the type property in the feed JSON.
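
    If you route the feed to a dedicated index, as mentioned earlier, the stanza might look like the following sketch. The index name ims_cex is an assumption for illustration only; the index must already exist in Splunk before you send data to it:

    [tcp://:1515]
    sourcetype = ims-ca20
    index = ims_cex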

    props.conf

    The following stanza in props.conf defines the properties of the source type that you specified in inputs.conf:

    [ims-ca20] 
    SHOULD_LINEMERGE = false
    KV_MODE = json
    TIME_PREFIX = {\"time\":\"
    TIME_FORMAT = %Y-%m-%dT%H:%M:%S.%6N%:z

    The combination of SHOULD_LINEMERGE = false and KV_MODE = json defines the incoming data as JSON Lines: one event per line, data in JSON format. These two settings apply to different stages in the Splunk data pipeline: SHOULD_LINEMERGE applies to parsing, before indexing; KV_MODE applies later, to search-time field extraction.

    The regular expression for TIME_PREFIX is case sensitive; it matches the lowercase field name time, which is the field name for event time stamps in the feed JSON. TIME_FORMAT matches the format of the time field value: a date and time-of-day string in ISO 8601 extended format. See the example feed JSON in the next step.
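
    For example, applied to the example output line shown in the next step, the two settings work together like this:

    TIME_PREFIX matches the literal prefix:      {"time":"
    TIME_FORMAT parses the value that follows:   2018-11-01T15:25:03.123456Z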

  3. Start a feed job

    Submit JCL to start an IMS Connect Extensions feed.

    Example JCL

    The following JCL defines a feed that forwards data from three IMS Connect systems: ICONP01, ICONP02, and ICONP03. The feed consists of selected fields in JSON Lines format sent over unsecured TCP/IP (no SSL/TLS) to port 1515 on the host named “analytics”. The host is running Splunk configured with the TCP data input described in the previous step.

    //UIDCEX   JOB NOTIFY=&SYSUID
    //CEXCA20 EXEC PGM=CEXCA20P
    //STEPLIB DD DISP=SHR,DSN=cexpre.SCEXLINK
    //SYSPRINT DD SYSOUT=*
    //SYSIN DD *
    HWSID=ICONP01,ICONP02,ICONP03
    DESTINATION=JSON
    HOST=analytics PORT=1515
    FIELDS(
    clientid,
    hwsname,
    ipaddress,
    port,
    readexit,
    originds,
    targetds,
    trancode,
    tmember,
    userid,
    tpipe,
    logontk,
    otmadelay,
    inputelap,
    rdsockelap,
    readxelap,
    rxmlxelap,
    safelap,
    otmaelap,
    xmitxelap,
    rdackelap,
    confelap,
    trackelap,
    rtpelap,
    resptime,
    outputelap
    )
    /*

    The list of fields in this example JCL matches the fields used by the Splunk app mentioned later in this recipe.

    Example output line:

    {"time":"2018-11-01T15:25:03.123456Z","type":"ims-ca20","hwsname":"ICONP01","resptime":0.654321,"trancode":"TRNA", ...}

    For more details, see the feed JCL and list of feed fields in the IMS Connect Extensions documentation in IBM Knowledge Center.

     

    Monitor your Splunk license usage

    Depending on the transaction rate of your IMS Connect systems and the number of fields you choose to forward, the IMS Connect Extensions feed can generate a high volume of data. As with any data that you index in Splunk, you must monitor and manage your indexing volume.
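
    For example, one way to review how much license volume the feed consumes is to search the license usage log in Splunk's _internal index. The following search is a sketch only; it assumes the example source type ims-ca20 and the field names that Splunk writes to license_usage.log (st for source type, b for bytes):

    index=_internal source=*license_usage.log type="Usage" st="ims-ca20"
    | timechart span=1d sum(b) AS bytes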

  4. Analyze the data in Splunk

    To check that Splunk has successfully indexed your feed data, enter the following search in Splunk Web:

    sourcetype="ims-ca20"

    You can use the feed data in Splunk to analyze the performance and behavior of your IMS Connect transactions.
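
    For example, assuming the field list from the example JCL in step 3, the following searches are sketches that chart average response time by IMS Connect system, and rank transaction codes by average response time:

    sourcetype="ims-ca20" | timechart avg(resptime) BY hwsname

    sourcetype="ims-ca20" | stats avg(resptime) AS avg_resptime, count BY trancode | sort - avg_resptime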

    Example dashboard

    Here is an example Splunk dashboard that shows the feed data:

    Figure: IMS Connect Extensions feed to Splunk workload mapping dashboard


    This screen capture of the dashboard shows two visualizations:

    • A Sankey diagram that shows the relationship between two identifier fields from the feed data. In this example, the two identifiers are the original and target IMS data store for each IMS Connect transaction. The original IMS data store is specified in the IMS request message (IRM) from the IMS Connect client; routing rules in IMS Connect Extensions can direct requests to different target IMS data stores, so a target that differs from the original is evidence of routing. In this example diagram, every request has been routed to a target IMS data store that is different from its original IMS data store. (A search sketch that could drive this kind of diagram follows this list.)
    • A histogram of transaction counts by original IMS data store.
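
    For reference, a search along these lines could drive that kind of relationship visualization. This is a sketch that assumes the originds and targetds fields from the example JCL in step 3:

    sourcetype="ims-ca20" | stats count BY originds, targetds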

     

    Get the Splunk app

    The example dashboard shown here is one of several dashboards in the IMS Connect transaction analysis Splunk app. You can get the app in two ways:

    • From the Splunkbase website. Use this option to learn more about the app and to install the app in an existing Splunk system.
    • From the Docker Hub website, in a Docker image that contains Splunk configured with the app and sample data. Use this option to evaluate the dashboards without any additional Splunk setup: start a Docker container, and then immediately use the dashboards.

     

    Want to forward historical data, not live data?

    Suppose that, instead of forwarding live data, you want to forward IMS Connect transaction performance data that has been recorded in a data set: specifically, in an IMS Connect transaction index.

    You can use IBM Transaction Analysis Workbench for z/OS (Workbench, or TAW) to forward IMS Connect transaction indexes over TCP in exactly the same JSON Lines format as the IMS Connect Extensions feed.

    Here are two ways to create an IMS Connect transaction index:

    • Use the IMS Connect Extensions feed to write to IMS Connect transaction indexes instead of streaming JSON Lines over TCP.
    • Use IBM IMS Performance Analyzer for z/OS (IMS PA) to create IMS Connect transaction indexes from IMS Connect Extensions archive journals.

    Given time, and interest from readers (feel free to comment), I’ll write a recipe about this.