Apptio for All

A place for Apptio product users to learn, connect, share and grow together.
  • 1.  Model Allocation Capabilities

    Posted Fri July 15, 2022 09:29 AM
    Hey everyone - Apologies for the double post, but I didn't realize you can't respond when you post in the questions section! 

    I'm doing some strategy work around optimal allocation strategies and was wondering what your experiences are.  

    What is an acceptable row count to have pushed through the model (pre/post) that still offers high performance? Other than reducing granularity, what have you been able to do to improve model performance?  


    Thank you @Jenny Franklin for the quick responses on the other thread; I've posted them below for everyone to see.
    _____________________________________________________________________________________________________
    Hi @Mac Jones, what I usually do is go to the Cost metric, and then select 'Allocation Strategies', which you probably have already done - just adding additional info in case someone needs it...


    Once you get there, sort the Row Count column in descending order, and then target those areas with > 100k rows.  
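
    If you'd rather do that triage outside the UI, here's a minimal Python sketch of the same idea - this assumes you've exported the Allocation Strategies view to a CSV, and the file and column names are just placeholders, not an actual Apptio export format:

      import pandas as pd

      # Hypothetical export of the Allocation Strategies view
      # (columns assumed: "Allocation", "Row Count")
      allocations = pd.read_csv("allocation_strategies.csv")

      # Sort by row count descending and flag the > 100k offenders
      allocations = allocations.sort_values("Row Count", ascending=False)
      heavy = allocations[allocations["Row Count"] > 100_000]
      print(heavy[["Allocation", "Row Count"]])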


    @Mac Jones as far as additional tips to improve performance, here's a list of things to check:

    • Number of Allocations (I like to keep it to OOTB as much as possible)
    • Allocation Rowcount (keep it under 100k rows)
    • Blank Data Relationships (ensure you have proper data relationships for your allocations)
    • LookupEx / Tablematch / SplitEx Formulas (keep these in separate Formula steps - one per)
    • Missing Identifiers (ensure all objects have object identifiers - even where a Model step was added for reporting purposes)

    #ApptioforAll


  • 2.  RE: Model Allocation Capabilities

    Posted Fri July 15, 2022 09:31 AM
    Hi @Mac Jones, you are most welcome.  I have made updates to the post, so posting the link here for reference.  Thanks! 😊


  • 3.  RE: Model Allocation Capabilities

    Posted Fri July 15, 2022 09:33 AM
    @Jenny Franklin - The allocation rowcount piece is most interesting to me and something I've had trouble finding good information on.

    Let's say you are at 100k rows exactly - what do you think performance will be like within the engine?  How about 200k or even 500k rows?  Would it be usable?

    I'm asking because I'm currently working with some very detailed allocations and would like to say: if we got to 100k, 200k, or 500k rows, this is what the performance would be like.  I know there are many factors here, so estimates are fine with me. 

    We're working on a new allocation strategy to get away from the current granularity in the model, but I'd love to be able to give good estimates on performance against granularity.


  • 4.  RE: Model Allocation Capabilities

    Posted Fri July 15, 2022 11:49 AM

    @Mac Jones, let me reach out to see if we can find someone who has more information on that. 

    Unfortunately, I don't have an exact (or even rough) estimate of how much it would improve things - every model is different. 

    I can definitely say, though, that in one super-complex model I was working in, once we got the row count down from a whopping 5m rows to <100k, it was very zippy. ;)  That's an extreme example, of course. 

    Hoping someone can pop on and give us some general, ballpark estimates, but given how everyone's model is different, I'm not sure that's doable.  My guess is there won't be a lot of improvement going from 500k down to 100k - maybe a slight one. 

    Where you really see it is with crazy row counts like that 5m I mentioned.  Definitely look into the other stuff, though.  What I failed to mention is that the list is ranked - the top items are the most critical.  Pasting it in again since it's referenced on a different page.

    • Number of Allocations (I like to keep it to OOTB as much as possible)
    • Allocation Rowcount (keep it under 100k rows)
    • Blank Data Relationships (ensure you have proper data relationships for your allocations)
    • LookupEx / Tablematch / SplitEx Formulas (keep these in separate Formula steps - one per)
    • Missing Identifiers (ensure all objects have object identifiers - even where a Model step was added for reporting purposes)
    • Object RowCount Reduction



  • 5.  RE: Model Allocation Capabilities
    Best Answer

    Posted Fri July 15, 2022 03:02 PM

    @Mac Jones There is a very long talk track on this, but the short version is: as small as possible to accomplish your goal.  Every row in the allocation represents the concepts that exist at the two objects, and the value on that row is the number of units.

    With that in mind, I tend to ask customers to think of the units (dollars) being moved along that allocation, and whether the granularity is necessary for what is conceptually occurring.  For example, if you have a 100,000 row allocation and are moving $10 of cost, each row represents 1/100 of a cent.  Does this make sense for what is actually occurring on the allocation?  Is this going to result in anything actionable?  Likewise, if you have a 100,000 row allocation and are moving $100,000,000, each row is $1,000 and represents an actionable amount of spend.
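
    To make that arithmetic concrete, here's a quick back-of-the-envelope check in Python (purely illustrative - the numbers are the ones from the example above):

      # Dollars moved per allocation row: spend / row count
      def dollars_per_row(spend: float, rows: int) -> float:
          return spend / rows

      print(dollars_per_row(10, 100_000))            # 0.0001 -> 1/100 of a cent per row
      print(dollars_per_row(100_000_000, 100_000))   # 1000.0 -> $1,000 per row, actionable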

    Essentially: is the level of granularity reasonable for the decisions you can make with that information and the spend being moved across it?

    Lastly, the type of allocation matters: a 100,000 row All to All (Many to Many, non-databased) allocation is going to have much larger ramifications for overall performance than a 100,000 row allocation that is a 1:1 or Many:1 databased allocation.  This is because the All to All allocation causes everything on both sides of that allocation to be related to each other going up the rest of the model.
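
    To illustrate the difference (my own sketch of the row-count math, not Apptio internals): an All to All allocation relates every source line to every destination line, so the relationship count is the product of the two sides, while a databased 1:1 allocation stays linear in the source.

      # Illustrative relationship counts for the two allocation shapes
      source_rows, dest_rows = 1_000, 100

      all_to_all = source_rows * dest_rows  # 100,000: every source/destination pair is related
      databased_1to1 = source_rows          # 1,000: each source row maps to one destination

      print(all_to_all, databased_1to1)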

    Anecdotally, 100,000 rows is not a large allocation by any means, and by itself it shouldn't cause any issues - but that doesn't mean all allocations should be 100,000 rows, or that you should strive to never go over 100,000 rows.

    The number of allocations, the data relationships between all of them, and the size of the row counts on the objects are, in most cases, more important than the actual size of the allocation itself.

    In regards to what you can do, there are really only three things: reduce the source granularity, reduce the destination granularity, or increase data relatedness (more 1:1).
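
    As a sketch of what "reduce the source granularity" can look like (shown here in Python on a made-up export - inside Apptio this would typically be a grouping/transform step on the source table):

      import pandas as pd

      # Hypothetical ticket-level source: one row per ticket (columns invented)
      tickets = pd.read_csv("tickets.csv")  # columns assumed: "App", "Tower", "Cost"

      # Collapse to one row per App/Tower pair before allocating -
      # e.g. 500k ticket rows might become a few thousand allocation rows
      by_app = tickets.groupby(["App", "Tower"], as_index=False)["Cost"].sum()
      print(len(tickets), "->", len(by_app))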

    Allocations/Model Object Identifiers should only contain what you are:

    • Reporting on (Grouping By)
    • Trending on
    • Allocating by
    • Slicing by (Filtering by)

    If you have granularity in your object that you're not using for one of those things, you should remove it.
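
    As a rough illustration of that pruning rule (again a Python sketch on a hypothetical export - the column names are invented):

      import pandas as pd

      servers = pd.read_csv("servers.csv")

      # Keep only the fields you report on, trend on, allocate by, or slice by;
      # everything else is granularity the model pays for but never uses
      keep = ["Server Name", "Tower", "Application", "Cost Center"]
      servers = servers[keep]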

    Let me know if you have any other questions on this.