Hello All,
In this post, I'll walk you through how I successfully:
- reduced the STAG calc time and improved model performance, with examples.
- created a custom report that shows the Cost and Fallout metrics in a single report.
- compared Datalink vs. Datalink Classic and the reasons for leveraging Datalink over Classic.
Below is a detailed description of each problem statement and the steps I took to achieve these objectives, followed by the lessons learned along the way.
Problem statement 1:
The business wanted to reduce the STAG calc time to avoid delaying the PROD move each month, and also needed to improve the performance of the model.
Procedure 1:
I explored the options listed below to improve model performance and reduce calculation time.
- Got the calculation build breakdown from the Apptio support team for a build that took a long time to complete.
- Analyzed which reports/models were taking a long time to calculate.
- Reworked complex formulas such as nested ifs, lookups, and a Lookup() within an if() (and vice versa). Refer to Example 1 below.
- Concatenated columns in the mapping file and used one lookup instead of many, then used the Split() function in the designated dataset. Refer to Example 2 below.
- Reviewed the data relationships between modeled objects to ensure the cost flow is accurate. Refer to Example 3 below.
- Deactivated unused reports. Refer to Example 4 below.
Example 1: Avoid using nested functions.
- Calculation time can be reduced by breaking a complex nested formula into several simpler formulas; the same idea applies to nested if() statements (see the sketch after this example).
- In the formula below, a Lookup() function is nested inside an if() function:
OLD:
IT Resource Sub Tower=if(Application Family="Enterprise",Lookup(Application Family,App Family to IT Resource Sub Tower,Application Family,IT Resource Sub Tower),"")
NEW:
IT Resource Sub Tower Lookup=Lookup(Application Family,App Family to IT Resource Sub Tower,Application Family,IT Resource Sub Tower)
IT Resource Sub Tower=if(Application Family="Enterprise",IT Resource Sub Tower Lookup,"")
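As a further illustration, a nested if() can be split the same way. The sketch below is hypothetical: the column names (Environment, Support Model, Enterprise Support Model) and values are assumptions for illustration only and do not come from the actual model.
OLD:
Support Model=if(Application Family="Enterprise",if(Environment="Production","Premium","Standard"),"Basic")
NEW:
Enterprise Support Model=if(Environment="Production","Premium","Standard")
Support Model=if(Application Family="Enterprise",Enterprise Support Model,"Basic")
Each resulting formula is simpler, and the intermediate column can be reused by other formulas.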
Example 2: Reduce the number of lookups.
- Calculation time can be reduced by using the minimum number of lookups.
- In the example below, we are pulling {BRM} and {BU Owner} from the 'GC to BRM Mapping' file using two lookups based on {Business Unit}.
- Instead, we can concatenate these columns in the 'GC to BRM Mapping' file, use one lookup, and split the result in the designated dataset.
OLD:
BRM LKP =Lookup(Business Unit, GC to BRM Mapping, Actual Group Companies, BRM)
BU Owner LKP =Lookup(Business Unit, GC to BRM Mapping, Actual Group Companies, BU Owner)
NEW:
'GC to BRM Mapping' file:
BRM and BU Owner = BRM && "-" && BU Owner
Designated file:
BRM and BU Owner = Lookup(Business Unit, GC to BRM Mapping, Actual Group Companies, BRM and BU Owner)
BRM LKP = Split(BRM and BU Owner, 1, "-")
BU Owner LKP = Split(BRM and BU Owner, 2, "-")
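- Note: this assumes the delimiter used in the concatenation (here "-") never appears inside the BRM or BU Owner values; otherwise, choose a delimiter that does not occur in the data.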
Example 3: Reviewing data relationships
- Identified the allocation lines and modified the allocations that had more granularity than expected.
- Optimized the object identifier by removing unwanted columns from it.
- Verified all reports built on the model after reducing the granularity.
- Reviewed the data relationship key in each allocation to make sure the cost flow is accurate.
Example 4: Deactivated unused reports
- Provided the business with the list of custom reports along with their calc times from the build breakdown analysis.
- The business responded with the list of active reports.
- Deactivated the reports that are not required by following the steps below:
- Project Explorer → Reports → check out the report → Report tab → clear the ACTIVE check box to exclude the report from calculation.
Problem statement 2:
The business needed a custom report that shows the Cost and Fallout numbers in a SINGLE report.
Procedure 2:
The customer was looking for the Cost and Fallout metrics in a single report view; when we tried creating a custom report, it showed the common dollars between the two objects, Cost Source and Fallout.
To overcome this,
- We created a custom report called "Report A" with two tables: one with attributes from the Cost Source modeled object and another with attributes from the Fallout modeled object.
- Got the API URL of this report and loaded its data into a custom model called "Custom Model A" via the copy table data connector type.
- Created a report called "Report B" using attributes from "Custom Model A".
- Scheduled the connector to refresh the report on a monthly cadence.
Problem statement 3:
The business needed a comparative case study of Datalink Classic vs. Datalink, to understand the reasons for using Datalink over Classic.
Procedure 3:
Performed the migration from Datalink Classic to Datalink as part of business process improvement, since Datalink offers the following benefits:
- Multi-tenant: Bug fixes and new features can be auto deployed.
- Enhanced UX: Modern UX that is integrated with Apptio's Apex shell
- Rich feature set: Platform improvements such as bulk edit functionality, connector chaining & improved error messaging
- Growing connector library: Support for complex ERPs such as SAP and Workday, as well as certified integration with SNOW
- Improved performance: Run more connectors in parallel and speed up data uploads
- Improved monitoring: Any Apptio default agent outage is actively monitored and taken up on highest priority.
Lessons Learned:
- Ensure model performance prior to implementation when configuring complex transform steps in modeled objects.
- Learned how to break a complicated formula down into simpler parts and how to define the object identifier in a model without performing extensive calculations.
- It is always advisable to use map columns in the master data and use append in the pre-master.
- Always keep the month-end close process in mind while constructing the model, since the timeline and user experience will be impacted by the calculation time.
- Group the datasets (e.g., 01-Uploads, 02-Transforms, 03-Mapping, 04-Premaster) for ease of understanding by other TBMAs. Instead of deleting a dataset, it is better to group it under a category called DO NOT USE.
Thanks
Prasanna Ravindran
#CostingStandard(CT-Foundation)