When I joined my TBM team, I noticed that we had a big challenge in ingesting mainframe CPU data. We collected a huge volume of mainframe CPU data across 19 LPARs (logical partitions), and some of these LPARs generated more than a million records per month. We received daily files for these 19 LPARs and accumulated them for the monthly upload. Because some of these files were so large, the monthly load was taking a long time.
While analyzing the data, it became clear to us that there were gaps: we did not have data for all 24 hours of each day. This was a big issue because our invoice was based on hourly consumption. We also noticed duplicate records for some LPARs. A lot of time was spent making sure that we had accurate data for all 24 hours of each day and eliminating the duplicate records.
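To make the two checks concrete, here is a minimal sketch (not our production process) of the validations we needed on each daily extract. The column names lpar, date, hour, and cpu_seconds are hypothetical stand-ins for the fields in our CPU files:

```python
import csv
from collections import defaultdict

EXPECTED_HOURS = set(range(24))  # every LPAR/day must cover hours 0-23

def validate(path):
    """Report missing hours and duplicate records per LPAR per day."""
    hours_seen = defaultdict(set)  # (lpar, date) -> hours present
    keys_seen = set()
    duplicates = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            key = (row["lpar"], row["date"], int(row["hour"]))
            if key in keys_seen:
                duplicates.append(key)  # same LPAR/date/hour appears twice
            keys_seen.add(key)
            hours_seen[(row["lpar"], row["date"])].add(int(row["hour"]))
    # Any LPAR/day that does not cover all 24 hours is a billing gap.
    gaps = {k: sorted(EXPECTED_HOURS - h) for k, h in hours_seen.items()
            if h != EXPECTED_HOURS}
    return gaps, duplicates
```

Doing this by hand every month is exactly the manual effort described above, which is what pushed us toward automation.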
While we were still struggling with mainframe data collection and validation for our monthly load, we got another ask: to load mainframe data daily instead of monthly, so that peak-hour usage could be identified well before month end. At this point it became very clear to us that we needed to move to an automated solution.
We decided to use the Apptio Datalink/Datadrop solution to automate our mainframe data ingestion. Following the Apptio Help documentation, I provisioned the Datadrop server, tested connectivity with our source system, and configured Datalink connectors to load the data files.
Since we had so many data issues, I created a new project in TBM Studio and used the automated Datalink/Datadrop process to load the data into it daily. In this project I built validation reports listing the hourly data; because a new project has few reports to calculate, calculation finished in a few minutes. A transform step eliminated duplicate records, and any missing data stood out in the validation report, so we could request it from the source system immediately. At the end of each day, a separate set of Datalink connectors transmitted the validated data from this new project to our Cost Transparency project.
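The transform step itself is configured in TBM Studio rather than written as code, but the logic it applies is simple. Here is an illustrative sketch of that logic, again using the hypothetical lpar/date/hour/cpu_seconds columns, with an hourly pivot similar to our validation report:

```python
import pandas as pd

def clean_daily_file(path):
    """Illustrative stand-in for the TBM Studio transform step:
    drop duplicate LPAR/date/hour rows, then pivot to an hourly view."""
    df = pd.read_csv(path)
    deduped = df.drop_duplicates(subset=["lpar", "date", "hour"], keep="first")
    # One row per LPAR/date, one column per hour; empty (NaN) cells
    # make missing hours visible at a glance, like the validation report.
    hourly = deduped.pivot_table(index=["lpar", "date"], columns="hour",
                                 values="cpu_seconds", aggfunc="sum")
    return deduped, hourly
```

Keeping this cleansing in a small staging project meant the heavy Cost Transparency project only ever received data that had already passed these checks.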



With this automated solution using Apptio Datadrop/Datalink and data validation in Apptio, we are now able to provide a daily refresh of accurate mainframe CPU data to our user community. It is now possible to analyze hourly CPU consumption and identify reengineering and cost-reduction opportunities to lower the monthly bill. After this success, our mainframe team asked us to automate the ingestion of mainframe storage, CICS, and DB2 data into Apptio as well.
Lessons Learned
- Data quality is very important: it is not enough to load data quickly; it also has to be accurate and reliable.
- We created a new project in Apptio for data cleansing and validation. Calculation time is much shorter in a new project because there are no additional reports to calculate.
- Even if there is no direct connectivity between two systems (like the mainframe and Apptio), automation is still possible using an intermediate system (like Sterling Integrator).
Link to my Presentation.
@Debbie Hagen
#ApptioforAll