Mainframe storage data was inconsistent, causing inaccurate client allocations
Accurate data is essential for accurate TCO and accurate client allocations. The data we received from the mainframe team was inconsistent, producing large swings in client allocations and leading clients to question the numbers. As a result, we stopped updating the data in Cost Transparency, keeping it unchanged until a solution could be worked out.

Journey to Improvement
Summary data tables in Apptio made these issues obvious once storage totals were exposed in our reports. We held multiple sessions partnering with the Mainframe Team to brainstorm how we could improve data quality and accuracy and produce more accurate client allocations. Many ideas were floated, and we modeled possible solutions. We eventually settled on an approach where Apptio does more of the processing, eliminating the separate manual lookup tables the Mainframe Team had to maintain.
The Mainframe Team previously provided an application name and a GB weighting to be used for mainframe storage allocations. They relied on lookup tables that they maintained themselves, and Technology Finance had no way to review or validate these lookups.
The new methodology uses existing algorithms that key off Job Codes, LPARs, and data set names. This eliminates steps for the Mainframe Team, who now have one less table to maintain, and it reduces complexity because we reuse the same logic and lookup tables as our other mainframe processing.
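As a rough illustration of how such a lookup can work (the production logic runs as Apptio transform steps against our existing lookup tables, not Python; the field names, table contents, and match order below are assumptions for this sketch):

```python
# Illustrative only: field names (JobCode, LPAR, DataSetName), table contents,
# and the match precedence are assumptions, not the production rules.

def lookup_application(record, job_code_map, dataset_prefix_map, lpar_map):
    """Resolve an application for a storage record, most specific match first."""
    # 1. Job Code lookup (reuses the table already used for other mainframe processing)
    app = job_code_map.get(record["JobCode"])
    if app:
        return app
    # 2. Data set name prefix lookup (e.g. "PROD.PAYROLL." -> "Payroll")
    for prefix, mapped_app in dataset_prefix_map.items():
        if record["DataSetName"].startswith(prefix):
            return mapped_app
    # 3. Fall back to a default application for the LPAR
    return lpar_map.get(record["LPAR"], "Unassigned")

example = {"JobCode": "PAY123", "LPAR": "LPAR01", "DataSetName": "PROD.PAYROLL.MASTER"}
print(lookup_application(example, {"PAY123": "Payroll"}, {}, {}))  # -> "Payroll"
```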
Utilizing the SDM project
With roughly 800k records, we needed to group the data down before running our lookups, reducing the size of the data sets being accessed. This grouping in our SDM project reduced the size of our data by ~95%. We then use a Datalink connector to copy the grouped data over to the Cost Transparency project for the application lookup logic.
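A minimal sketch of the grouping step (the real work is done with transform steps inside the SDM project and the Datalink copy, not Python; the file name, column names, and grouping keys here are assumptions):

```python
import pandas as pd

# Illustrative only: file name, column names, and grouping keys are assumptions.
raw = pd.read_csv("mainframe_storage_detail.csv")  # ~800k detail rows

grouped = (
    raw.groupby(["LPAR", "JobCode", "DataSetName"], as_index=False)
       .agg(SizeGB=("SizeGB", "max"),            # the full data set size repeats on each reference
            ReferenceCount=("SizeGB", "size"))    # how many times the data set was referenced
)
# The grouped table (roughly 95% smaller) is what the Datalink connector copies
# to the Cost Transparency project for the application lookup logic.
```

The reference count produced by this grouping feeds the scaling described below.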
Each time a data set was listed, it showed the size of the entire data set referenced by a mainframe process. Counting the full size on every reference would inflate the totals, so we use a counter to scale the size down for each time the data set is referenced.
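To make the scaling concrete, here is a minimal sketch; dividing the size by the reference count is our reading of the counter approach, and the function name is purely illustrative:

```python
# Illustrative sketch: dividing by the reference count is one way to apply
# the counter so a data set's size is only counted once in total.
def scaled_size_gb(size_gb: float, reference_count: int) -> float:
    """Weight each reference so the references sum back to the data set's actual size."""
    return size_gb / max(reference_count, 1)

# Example: a 100 GB data set referenced 4 times contributes 25 GB per reference,
# so the references sum to 100 GB instead of an inflated 400 GB.
assert scaled_size_gb(100.0, 4) == 25.0
```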
Keep Probing – Keep Talking
Without having mainframe storage in Apptio to begin with, we would not have been able to see that there was an issue with the data. One function of Apptio is to facilitate conversations, and these conversations never would have taken place without the Cost Transparency model. Change is hard, so you need to keep asking questions; if you are persistent, you should eventually find a way to make an improvement. Allocations are now more defensible and more accurate, and we are seeing better consistency in the data.
Future Enhancements
With the new methodology in place and a monthly Mainframe Team task now automated, our next focus will be on ways to streamline further and reduce costs.
@Debbie Hagen
#CostingStandard(CT-Foundation)