AIOps on IBM Z - Group home

OMEGAMON Data Provider + Logstash: A Gateway to the World of Analytics and AI

  

>>>>>>  Before I forget, quick PSA: 
If you came to this blog, you'll certainly be interested in this double header of upcoming webinars with great OMEGAMON content:

IBM OMEGAMON Technical Summit, a 3-hour technical update on IBM OMEGAMON for Db2®, CICS® & MQ®
https://bit.ly/IOTS_S_9: June 8 / 9am ET | 2pm (UK)
https://bit.ly/IOTS_S_3: June 8 / 3pm PT | June 9 / 8am (Sydney) 

Leverage new IBM z16 features with confidence using IBM Z AIOps
https://ibm.biz/AIOps-0609-social June 9 / 9am ET | 2pm (UK)
Welcome Back

Welcome back to the blog series on the OMEGAMON Data Provider (ODP).  If you've missed the excitement about ODP, you'll want to read some of the six prior blogs listed in the Learn more, get involved section below.   The ODP has been a hit so far!  ODP content is coming fast, as these blogs show (z/OS, CICS, IMS, Db2, JVM, etc.).  My focus is integrations and use cases, which are coming fast too! 
 
Gateway to a World of Analytics and AI
 
I remember a high school trip to an amusement park.  Once all the kids were inside the gate, they fanned out in all directions, as there were so many rides, games, and food opportunities to enjoy.   So too with ODP: now that OMEGAMON supports the popular Logstash streaming platform, there are many directions to pursue and lots of new opportunities to go after!

Busy crowd at amusement park entrance
Image Credit:  unsplash.com

I've streamed IT metrics to the ELK stack for years, using Logstash to manage inputs, but didn't pay much attention to the outputs, which was always Elasticsearch.  In more recent years, I've switched to the TICK stack, using InfluxDB, the very popular time series database (top in popularity on https://db-engines.com/en/ranking/time+series+dbms), along with Telegraf for managing inputs, and Grafana for visualization, so I got away from using Logstash.  Now that ODP has come on the scene, it got me thinking more about Logstash's output capabilities.
 
I was sure that Logstash supported a handful of outputs, but was shocked to see 60 supported output types!!
https://www.elastic.co/guide/en/logstash/current/output-plugins.html
 
I was pretty sure InfluxDB would be on the list, and sure enough, it was -- sweet!!   Check out the list!  Hopefully your favorite platform is already supported, whether it's Datadog or something else.  Comment below or hit me up on social media.  I'd love to hear which platforms you are glad to see on the list.
Diagram showing ODP data flowing from mainframe to Logstash to ELK, InfluxDB, and many other platforms


Let's give it a try!!
 
Given my preference for InfluxDB in recent years, well suited for the time series data I use in analytics, I endeavored to have a quick go at streaming ODP data into InfluxDB, via the Logstash InfluxDB output plugin.  I installed Logstash on my InfluxDB server, and found the plugins easy to work with.  Installation is a single command, and there's another command to list all your plugins:
 
root@myserver:/usr/share/logstash# bin/logstash-plugin install logstash-output-influxdb
Validating logstash-output-influxdb
Resolving mixin dependencies
Installing logstash-output-influxdb
Installation successful

root@myserver:/usr/share/logstash# bin/logstash-plugin list --verbose
...
logstash-output-csv (3.0.8)
logstash-output-elasticsearch (11.4.1)
logstash-output-email (4.1.1)
logstash-output-file (4.3.0)
logstash-output-graphite (3.1.6)
logstash-output-http (5.2.5)
logstash-output-influxdb (5.0.6)
logstash-output-lumberjack (3.1.9)
logstash-output-nagios (3.0.6)
...

Here's the Logstash config file I developed to configure the InfluxDB output. 

# Sample Logstash configuration for creating a simple
# tcp -> Logstash -> Influxdb pipeline.
input {
    tcp {
        "id" => "omegamon_tcp_input"
        "port" => 15046
        "codec" => json_lines
    }
}
output {
   if [table_name] == "syscpuutil" {
      influxdb {
         "db" => "omeg1"
         "host" => "192.168.0.0"
         "port" => "8086"
         "coerce_values" => {
            "average_cpu_percent" => "integer"
            "average_ifa_on_cp_percent" => "integer"
            "average_ifa_percent" => "integer"
            "average_unused_group_msus" => "integer"
            "average_ziip_on_cp_percent" => "integer"
            "average_ziip_percent" => "integer"
            "four_hour_msus" => "integer"
            "host" => "string"
            "interval_seconds" => "integer"
            "managed_system" => "string"
            "mvs_overhead" => "integer"
            "port" => "integer"
            "product_code" => "string"
            "smf_id" => "string"
         }
         "data_points" => {
            "@timestamp" => "%{@timestamp}"
            "@version" => "%{@version}"
            "average_cpu_percent" => "%{average_cpu_percent}"
            "average_ifa_on_cp_percent" => "%{average_ifa_on_cp_percent}"
            "average_ifa_percent" => "%{average_ifa_percent}"
            "average_unused_group_msus" => "%{average_unused_group_msus}"
            "average_ziip_on_cp_percent" => "%{average_ziip_on_cp_percent}"
            "average_ziip_percent" => "%{average_ziip_percent}"
            "four_hour_msus" => "%{four_hour_msus}"
            "host" => "%{host}"
            "interval_seconds" => "%{interval_seconds}"
            "managed_system" => "%{managed_system}"
            "mvs_overhead" => "%{mvs_overhead}"
            "port" => "%{port}"
            "product_code" => "%{product_code}"
            "smf_id" => "%{smf_id}"
         }
      }
   }
}
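Before real ODP traffic arrives, it can help to sanity-check a pipeline like this by hand-feeding the TCP input a fake event.  Here's a hypothetical Python sketch, not part of ODP itself: the field values are made up, and only the field names mirror the config above.  It builds one newline-terminated JSON event, which is exactly what the json_lines codec expects, and ships it to the Logstash port:

```python
import json
import socket

def build_event_line(avg_cpu: int, smf_id: str) -> str:
    """Build one newline-terminated JSON event, as the json_lines codec expects."""
    event = {
        "table_name": "syscpuutil",       # routes the event to the influxdb output above
        "average_cpu_percent": avg_cpu,   # sample value; real events come from ODP
        "smf_id": smf_id,
        "managed_system": "SYS1:MVSSYS",  # made-up managed system name
        "product_code": "km5",
    }
    return json.dumps(event) + "\n"       # one JSON object per line

def send_event(host: str, port: int, line: str) -> None:
    """Ship the event to the Logstash tcp input."""
    with socket.create_connection((host, port)) as sock:
        sock.sendall(line.encode("utf-8"))

# Example (host is a placeholder for your Logstash server):
# send_event("logstash-host", 15046, build_event_line(42, "SYS1"))
```

If the event lands, you should see a point appear in the omeg1 database with the coerced types from the config.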

I used the influx CLI tool to create an omeg1 database with the retention policies I wanted.  I'd already been streaming ODP data to ELK for months, so it was easy to add a stream for another Logstash destination.   Once I had the metrics I wanted flowing into InfluxDB, I did some quick checks with the influx CLI tool to make sure I had the data types I wanted.  Lastly, I added my omeg1 data source to my Grafana, and started exploring.
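Those quick checks can also be scripted.  Below is a hypothetical Python sketch against the InfluxDB 1.x HTTP API (the host is a placeholder, and port 8086 plus the omeg1 database name are assumptions matching the config above) that asks which field keys and data types actually landed in a measurement:

```python
import json
import urllib.parse
import urllib.request

def field_keys_url(host: str, db: str, measurement: str) -> str:
    """Build the InfluxDB 1.x /query URL for a SHOW FIELD KEYS statement."""
    q = f'SHOW FIELD KEYS FROM "{measurement}"'
    params = urllib.parse.urlencode({"db": db, "q": q})
    return f"http://{host}:8086/query?{params}"

def fetch_field_keys(host: str, db: str, measurement: str) -> dict:
    """Run the query and return InfluxDB's JSON response (field name -> type pairs)."""
    with urllib.request.urlopen(field_keys_url(host, db, measurement)) as resp:
        return json.load(resp)

# Example (replace the host with your InfluxDB server):
# fetch_field_keys("192.168.0.0", "omeg1", "syscpuutil")
```

A string that should have been an integer shows up immediately in the response, which is exactly the kind of typing mistake the coerce_values block is there to prevent.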

Line graph of ODP metrics in Grafana


I've blogged about my use of ELK and the provided ODP sample dashboards earlier in the ODP series (see blog list at the bottom of this blog).  I quickly recreated the ODP sample Kibana System CPU dashboard in a Grafana dashboard, using my new InfluxDB data.  Whew, the data looks the same!!  ;-)
Kibana dashboard showing ODP metrics
Grafana dashboard showing ODP data

After doing the detailed, hands-on Logstash config file above, where I picked metrics and made sure the data typing was correct, I wondered if an easy, hands-free, pass-through kind of Logstash config file was possible for the InfluxDB plugin.  I was hoping I might generically send all the metrics through, and that the InfluxDB plugin would magically create the tags and type the data correctly in InfluxDB.  I quickly found this magic does exist -- sweet!!   I was nervous the data would all turn out to be strings, but I was pleasantly surprised.  The plugin seemed to get all the data typing correct too!  use_event_fields_for_data_points did the magic.  Here's the simple Logstash config file I used:
 
# Sample Logstash configuration for creating a simple
# tcp -> Logstash -> Influxdb pipeline.
input {
   tcp {
      "id" => "omegamon_tcp_input2"
      "port" => 15047
      "codec" => json_lines
   }
}
output {
   if [table_name] == "syscpuutil" {
      influxdb {
         "db" => "omeg2"
         "host" => "192.168.0.0"
         "port" => "8086"
         "use_event_fields_for_data_points" => "true"
         "data_points" => {}
      }
   }
}
 
Bigger Picture

Other IBM and partner products are embracing this direction of streaming data to ELK.   My colleague Nina Mirski-Fitton recently blogged about CICS products, like CICS PA, streaming to ELK:  https://community.ibm.com/community/user/ibmz-and-linuxone/blogs/nina-mirski-fitton1/2022/04/27/why-you-should-consider-using-the-elastic-stacktm.   The IBM Z Operational Log and Data Analytics (IZLDA) product also streams log messages and SMF metric data to ELK.   So here too, many opportunities abound.  The gateway has been opened by providing Logstash support.  It is now quite easy, no code required, to get mainframe data from many tools, like ODP, CICS PA, and IZLDA, streaming to 60 of your favorite Logstash output destinations!!

Customer examples really hit home for me.  For three customer examples of the gateway being open to many possibilities, see the Customer experience section of Jim Porell's recent blog.
 
I haven't done enough research, but I believe there's a similar, rich streaming ecosystem around the Apache Kafka platform.  ODP supports Kafka as well.  Hit me up on social media or comment below if you think Kafka could be a great Gateway to the World of Analytics and AI, like I've found Logstash to be!!   Maybe we could collaborate on giving Kafka a try with ODP data.


Learn more, get involved:

 ⇒  Want to compare notes or hints?  Hit me up on social media:  drdavew00!

⇒ Try the Elastic Stack container for an easy Proof of Concept after installing IZODP  https://github.com/z-open-data/z-omegamon-analytics-elastic-docker

⇒ Try the Prometheus Github to monitor your IZODP connector data flow https://github.com/z-open-data/odp-prometheus-grafana

ODP Installation and User's Guide

⇒ Learn more and do some planning for install:  https://community.ibm.com/community/user/ibmz-and-linuxone/blogs/james-porell1/2021/11/07/installation-considerations-for-omegamon-data-prov

⇒ Blog 1: Analytics dreams really do come true with the new IBM Z OMEGAMON Data Provider

⇒ Blog 2: Unlocking OMEGAMON Data for Analytics

⇒ Blog 3:  ITOM mash ups: Music to your ears!

⇒ Blog 4:  OMEGAMON Data Provider Update: Now streaming CICS and Db2

⇒ Blog 5:  CICS and Db2 Dashboards for ODP now available and more!

⇒ Blog 6:  IMS and JVM streaming via OMEGAMON Data Provider now available