IBM Destination Z

IBM DB2 Analytics Accelerator 101

By Destination Z posted Mon December 23, 2019 03:29 PM

The IBM DB2 Analytics Accelerator is an appliance add-on to your mainframe. So, in effect, it’s a separate box that connects to the mainframe.

The Analytics Accelerator, along with DB2 for z/OS, makes up a self-managing, hybrid, workload-optimized database management system that runs query workloads in the most efficient way. In other words, each query is executed in whichever environment delivers the greatest performance and cost efficiency.

This hybrid computing platform on zEnterprise:

  1. Supports transaction processing and analytics workloads concurrently, efficiently and cost-effectively
  2. Delivers industry-leading performance for mixed workloads
Because the Analytics Accelerator is a separate box, it doesn’t use mainframe MIPS. In fact, it runs on IBM PureData System for Analytics, powered by Netezza technology (IBM acquired Netezza in 2010). The advantages of this setup include:

  1. Unprecedented response times for ‘right-time’ analysis
  2. Complex queries that run in seconds rather than hours
  3. Transparency to the application
  4. Inheritance of all z Systems DB2 attributes
  5. No need to create or maintain indices
  6. No query tuning required
  7. Fast deployment and time to value
Tables need to be defined and deployed to the Analytics Accelerator before data is loaded and queries are sent to it for processing. Definition means identifying the tables whose queries need to be accelerated. Deployment makes those tables known to DB2, i.e., stores the table metadata in the DB2 and PureData catalogs. IBM DB2 Analytics Accelerator Studio, which has a GUI, guides you through defining and deploying tables, as well as invoking other administrative tasks.

Stored procedures implement and execute administrative operations such as table deployment, load and update, and serve as the primary administrative interface to the Analytics Accelerator from the outside world, including IBM DB2 Analytics Accelerator Studio. The stored procedures can be run from the command line or embedded in custom applications, and they provide functions related to both tables and accelerators. All stored procedures commit the transactions that were triggered by the calling applications.
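To make the stored-procedure interface concrete, here is a minimal Python sketch of preparing the kind of XML table-specification document that accelerator stored procedures such as SYSPROC.ACCEL_ADD_TABLES accept. The element and attribute names below are assumptions for illustration only; the actual schema is defined in IBM’s Analytics Accelerator documentation.

```python
# Illustrative sketch only: the root element "tableSetForAdd" and the
# "table" child elements are assumed names, not the exact IBM schema.
import xml.etree.ElementTree as ET

def build_table_set(tables):
    """tables: list of (schema, name) pairs to deploy to the accelerator."""
    root = ET.Element("tableSetForAdd")  # assumed root element name
    for schema, name in tables:
        # Each table to be defined on the accelerator becomes one element.
        ET.SubElement(root, "table", {"schema": schema, "name": name})
    return ET.tostring(root, encoding="unicode")

xml_doc = build_table_set([("SALES", "ORDERS"), ("SALES", "LINEITEMS")])
print(xml_doc)
# A caller would then pass a document like this to the stored procedure,
# e.g. CALL SYSPROC.ACCEL_ADD_TABLES('ACCEL1', :xml_doc, :message)
```

Because the stored procedures are callable from any SQL interface, the same document could be passed from a batch job, a command-line client or an application, which is what makes them usable outside of Accelerator Studio.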

But the Analytics Accelerator isn’t just for DB2 data. Users want to learn more about their investment in the Analytics Accelerator and maximize its use in their environment. These customers are looking at creative ways to exploit it for IMS, VSAM, SMF and non-z/OS data.

There are three different areas where tools can provide value. They are:

  1. Assessment: Do I have a workload that would benefit from Analytics Accelerator?
  2. Optimization: Can I optimize the workload to take advantage of Analytics Accelerator?
  3. Administration: Can I manage Analytics Accelerator more effectively?
How does data get loaded into the Analytics Accelerator? The standard DB2 UNLOAD utility extracts the data. It places tables in read-only mode until the Analytics Accelerator load is done (when using the option for transactionally consistent data), so the Analytics Accelerator reflects DB2 data as of the point in time the load was initiated. Optionally, you can use Change Data Capture for near-real-time replication to the Analytics Accelerator.

The new IBM DB2 Analytics Accelerator Loader V1.1 provides group consistent load and external ‘dual’ load. It’s built for performance and usability. The Loader populates data in the Analytics Accelerator. It can load DB2 and non-DB2 data. It loads in parallel to avoid application downtime. And it can load to a historical point in time.

The Analytics Accelerator Loader can load data from a file in one of two ways: dual external load and Analytics Accelerator only. With dual external load, it loads data into both DB2 and Analytics Accelerator in parallel. With Analytics Accelerator only, it loads directly into Analytics Accelerator (no load in DB2).

The user is responsible for building the load file. The extracted data can come from various sources, e.g., IMS, VSAM or Oracle. Both the file and the field specification that describes its input data format must be compatible with the DB2 LOAD utility. Columns that are nullable or defined as ‘not null with default’ can be missing from the input data; the Analytics Accelerator Loader will supply the defaults.
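As a rough illustration of what building such a load file involves, the Python sketch below assembles fixed-position records from rows extracted from a non-DB2 source, padding each field to its width and falling back to a default when a column is missing. The column layout here is invented for illustration; a real file and field specification must match the DB2 LOAD utility syntax for the target table.

```python
# Hypothetical column layout for illustration only: (name, width, default).
# A real layout must mirror the DB2 LOAD field specification for the table.
LAYOUT = [
    ("CUST_ID",    8, "0"),
    ("CUST_NAME", 20, ""),
    ("REGION",     4, "NONE"),
]

def to_load_record(row):
    """row: dict mapping column name -> value; missing columns get defaults,
    mimicking how defaults can stand in for absent nullable columns."""
    fields = []
    for col, width, default in LAYOUT:
        value = str(row.get(col, default))
        fields.append(value.ljust(width)[:width])  # pad/truncate to field width
    return "".join(fields)

rec = to_load_record({"CUST_ID": "42", "CUST_NAME": "ACME"})
print(repr(rec))  # REGION is absent from the source row, so its default is used
```

The same record format would then be fed to the Loader for a dual external load (into both DB2 and the Analytics Accelerator) or an Analytics Accelerator-only load.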

Note: The information in this blog is taken mainly from a presentation recently given by Rocket Software’s Wayne Morton to the Virtual IMS user group.

Trevor Eddolls is CEO at iTech-Ed Ltd, an IT consultancy. A popular speaker and blogger, he currently chairs the Virtual IMS and Virtual CICS user groups. He’s editorial director for the Arcati Mainframe Yearbook, and for many years edited Xephon’s Update publications.