AIOps on IBM Z

AIOps on IBM Z is a group that brings together IT professionals to share their knowledge and expertise on leveraging AI-driven intelligence for IT operations, accelerating the decisions that maintain resiliency through the use of AIOps on IBM Z.

Streaming IMS records from IBM Z Common Data Provider

By Soniya Ranjan posted 9 days ago

  


Organizations running IMS on IBM Z often need real-time access to operational data for analytics and monitoring. IBM Z Common Data Provider (ZCDP) simplifies this by collecting IMS log records and streaming them to platforms like Apache Kafka. In this blog, we’ll walk through streaming IMS 45 (statistics) records from the Z Common Data Provider Data Collector to a Kafka topic in a few simple steps.

Step 1 - Configure SMF and enable the IMS LOGWRT user exit

Ensure that both IMS and ZCDP are installed and running on your z/OS system. To begin, configure the IMS LOGWRT user exit, which enables IMS log records to be captured by Z Common Data Provider. For detailed instructions, refer to the IBM documentation: Collecting IMS records by using IMS user exit.
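For reference, refreshable IMS user exits are defined in the USER_EXITS section of the DFSDFxxx PROCLIB member. The lines below are only an illustrative sketch; YOUREXIT is a placeholder for the ZCDP-supplied LOGWRT exit module named in the documentation linked above, and your member may already contain other EXITDEF entries.

<SECTION=USER_EXITS>
EXITDEF=(TYPE=LOGWRT,EXITS=(YOUREXIT))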

Step 2 - Create Z Common Data Provider policy and start the Data Collector

Next, create a policy in the Z Common Data Provider UI to define how IMS records will be streamed to Kafka. Add the Kafka bootstrap server details, configure the data resources for IMS logs, and save the policy.

Policy profile edit

 

Add Kafka bootstrap server details

 

Add data resource details and save the policy
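Depending on whether your Kafka brokers auto-create topics (the auto.create.topics.enable broker setting), you may also need to create the target topic ahead of time. A minimal example, assuming a broker at localhost:9092 and the topic name used later in this blog:

./bin/kafka-topics.sh --bootstrap-server localhost:9092 --create --topic IBM-CDP-zOS-IMS-45 --partitions 1 --replication-factor 1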

Sample Data Collector JCL snippet

// EXPORT SYMLIST=*
// SET CDPINST='/zAnalytics/zcdp510L/usr/lpp/IBM/zcdp/v5r1m0'
// SET CDPWORK='/zAnalytics/zcdp510L/usr/lpp/IBM/zcdp/v5r1m0/work'
// SET PATH1='/var/local/CDPServer/'
// SET PATH2='cdpConfig/'
// SET POLICY='dc_kafka.collection-config.json'
// SET JAVAHOME='/usr/lpp/java/J17.0_64'
// SET START='W'
// SET TZ=''
// SET FULLDATA='OFF'
// SET PTKTUSER='IBMUSER'
// SET WASLIB='/usr/lpp/zWebSphere/V9R0'
// SET DEFTHEAP='4096m'
// SET MAXIHEAP='4096m'
//*
//HBOCOL EXEC PGM=BPXBATA2,REGION=0M,TIME=NOLIMIT,MEMLIMIT=NOLIMIT,
// PARM='PGM &CDPINST./DC/bin/hbocol'
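After the policy is saved, start the Data Collector by submitting this job or, if the JCL is installed in a procedure library (shown here under the hypothetical procedure name CDPDC), by starting it from the z/OS console:

S CDPDC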

Step 3 - Trigger IMS 45 records

Use the following commands to verify IMS readiness and trigger IMS 45 records. The /che statistics IMS command triggers IMS 45 records of all subtypes.

First, get the outstanding reply number for the IMS READY message, and then reply with /r xx,/che statistics, where xx is that reply number.

Sample snippet

/d r,r
27 R *27 DFS996I *IMS READY* IMSO
12 R *12 HWSC0000I *IMS CONNECT READY* IMSOCON

In this example, the reply number shown against IMS READY is 27, so issue /r 27,/che statistics to trigger the IMS 45 records.

Step 4 - Verify streaming to Kafka

Check the Data Collector job log for messages confirming that IMS 45 records are being streamed to the Kafka topic.

ZCDP jobs

Data Collector job log sample

INFO  r-logwrt : HBO1007I Starting to send IMS IMSO-59 data to Kafka topic IBM-CDP-zOS-IMS-45.
2025-04-14 08:15:51.040 GMT DEBUG r-logwrt : HBO1007I Starting to send IMS IMSO-4500 data to Kafka topic IBM-CDP-zOS-IMS-45.
2025-04-14 08:15:51.040 GMT DEBUG r-logwrt : HBO1007I Starting to send IMS IMSO-4502 data to Kafka topic IBM-CDP-zOS-IMS-45.
2025-04-14 08:15:51.041 GMT DEBUG r-logwrt : HBO1007I Starting to send IMS IMSO-4503 data to Kafka topic IBM-CDP-zOS-IMS-45.
2025-04-14 08:15:51.041 GMT DEBUG r-logwrt : HBO1007I Starting to send IMS IMSO-4504 data to Kafka topic IBM-CDP-zOS-IMS-45.
2025-04-14 08:15:51.041 GMT DEBUG r-logwrt : HBO1007I Starting to send IMS IMSO-4508 data to Kafka topic IBM-CDP-zOS-IMS-45.

Verifying IMS 45 records in the Kafka topic

You can verify the records in the Kafka topic by using the console consumer script that ships with Kafka.

For example:

./bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic IBM-CDP-zOS-IMS-45 --from-beginning &> IMS45Records.txt

The data is in EBCDIC format, so you will need an EBCDIC-to-ASCII conversion to read it on distributed systems.
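For example, on a Linux system you can convert the captured file with iconv, assuming the records were written in code page IBM1047 (adjust the source code page to match your system's EBCDIC CCSID):

iconv -f IBM1047 -t UTF-8 IMS45Records.txt > IMS45Records.ascii.txt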

Conclusion

Streaming IMS operational data to Kafka using Z Common Data Provider enables real-time analytics and integration with modern platforms. By following these steps, you can unlock valuable insights from IMS logs and bring mainframe data into your enterprise data ecosystem.
