A few companies have asked about unit cost changes observed across the 2021, 2022, and 2023 infrastructure benchmark data, specifically in the Storage > Online and Compute > Servers areas. To answer this question, I will first provide an overview of the benchmarking process and then address the specific comments for Compute and Storage.
Benchmarking Process
IT infrastructure benchmarks, such as cost per server, are determined through a combination of industry research, company-specific data, and standardized models. These benchmarks provide organizations with a way to measure their IT spending and efficiency against peers and industry standards. Here's an overview of how these benchmarks are typically determined:
Data Collection: The first step involves collecting data from a variety of sources. This data can come from internal company records, industry surveys, and reports from market research firms. For cost benchmarks, data might include total cost of ownership (TCO), operational expenses, hardware, software, maintenance costs, and energy consumption for servers.
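To make the arithmetic concrete, here is a minimal sketch of how collected cost components might roll up into a cost-per-server figure. The category names, dollar amounts, and server count are hypothetical illustrations, not actual benchmark inputs.

```python
# Minimal sketch: aggregating collected cost components into a cost-per-server
# figure. Field names and dollar amounts are hypothetical illustrations only.

collected = {
    "hardware_acquisition": 1_200_000,  # annualized purchase cost
    "software_licenses":      450_000,
    "maintenance":            300_000,
    "energy":                 150_000,
    "data_center_space":      200_000,  # indirect: floor space and cooling
}
server_count = 800  # instances in scope, per the standard definition

tco = sum(collected.values())
cost_per_server = tco / server_count
print(f"TCO: ${tco:,.0f}  Cost per server: ${cost_per_server:,.0f}")
```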
Normalization: Since IT environments and business operations can vary significantly from one organization to another, it's crucial to normalize the data. This process involves adjusting the data to account for differences in company size, industry sector, geographic location, and other relevant factors. Normalization ensures that the benchmarks are meaningful and comparable across different contexts.
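As a rough sketch of what normalization can look like, the example below divides a raw unit cost by hypothetical geography and company-size factors so that participants in different contexts become directly comparable. The factor values are illustrative assumptions only, not real benchmark inputs.

```python
# Minimal sketch of normalization: adjust each participant's raw unit cost by
# hypothetical factors for geography and company size before peer comparison.

GEO_FACTOR = {"north_america": 1.00, "western_europe": 1.08, "apac": 0.94}
SIZE_FACTOR = {"small": 1.10, "mid": 1.00, "large": 0.92}  # scale economies

def normalize(unit_cost: float, geo: str, size: str) -> float:
    """Express a raw unit cost on a common, comparable basis."""
    return unit_cost / (GEO_FACTOR[geo] * SIZE_FACTOR[size])

# Two companies with different contexts become directly comparable:
print(normalize(2600, "western_europe", "small"))  # ~2188
print(normalize(2300, "north_america", "large"))   # ~2500
```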
Standardization: To ensure consistency, benchmarking organizations often use standardized models and definitions. For example, the cost per server benchmark would be based on a standard definition of what constitutes a "server" and what costs are included (e.g., acquisition, maintenance, software licenses, and indirect costs such as data center space and cooling).
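A standardized definition can be captured as a simple, shareable structure. The sketch below is a hypothetical example of what a "cost per server" definition might record; it is not the actual taxonomy used by Apptio or any benchmark provider.

```python
# Illustrative sketch of a standardized benchmark definition: what counts as a
# "server" and which cost categories are in scope. Names are hypothetical.

COST_PER_SERVER_DEFINITION = {
    "unit": "physical or virtual OS instance",
    "included_costs": [
        "acquisition (depreciated)",
        "maintenance and support",
        "software licenses (OS, virtualization)",
        "indirect: data center space, power, cooling",
    ],
    "excluded_costs": ["application software", "end-user devices"],
}

print(", ".join(COST_PER_SERVER_DEFINITION["included_costs"]))
```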
Analysis: The collected and normalized data is then analyzed to establish benchmarks. This can involve statistical analysis to determine average costs, median values, ranges, and percentile rankings. The analysis might also identify patterns and trends that can inform benchmarking metrics.
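For illustration, the sketch below derives the typical benchmark statistics (mean, median, and percentile cut points) from a hypothetical set of normalized peer unit costs using Python's standard library.

```python
# Minimal sketch of the analysis step: deriving average, median, and
# percentile benchmarks from normalized peer unit costs (hypothetical values).
import statistics

unit_costs = [2150, 2300, 2425, 2500, 2650, 2700, 2875, 3050, 3200, 3400]

mean = statistics.mean(unit_costs)
median = statistics.median(unit_costs)
# quantiles(n=4) returns the 25th/50th/75th percentile cut points
q1, _, q3 = statistics.quantiles(unit_costs, n=4)

print(f"mean=${mean:,.0f} median=${median:,.0f} p25=${q1:,.0f} p75=${q3:,.0f}")
```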
Adjustment and Iteration: Benchmarks are not static and are regularly updated to reflect changes in technology costs, adoption of new technologies (e.g., cloud computing, virtualization), and evolving business practices. Continuous data collection and analysis are necessary to keep benchmarks relevant and useful. Reviewing the benchmark trends over time is also used to make final adjustments and eliminate outliers. Apptio works with our benchmarking providers to review new benchmarks, raise questions about outliers, and provide input to the final adjustments.
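As a simplified sketch of that outlier review, the example below flags year-over-year unit rate changes that fall outside an expected band so they can be questioned before release. The band and rates are hypothetical, not the actual review criteria.

```python
# Minimal sketch of the review step: flag year-over-year unit rate changes
# that fall outside an expected band so they can be questioned before release.

EXPECTED_YOY_CHANGE = (-0.25, 0.05)  # -25% to +5% treated as "normal" here

rates = {2020: 3400, 2021: 3100, 2022: 1900}  # hypothetical unit rates by year

prev = None
for year, rate in sorted(rates.items()):
    if prev is not None:
        change = rate / prev - 1
        ok = EXPECTED_YOY_CHANGE[0] <= change <= EXPECTED_YOY_CHANGE[1]
        print(f"{year}: {change:+.1%}{'' if ok else '  <-- outlier, review'}")
    prev = rate
```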
2023 Benchmark Outliers
Compute > Servers - Multiple factors contributed to the large 2023 unit rate reduction (based on data collected through the end of 2022); a short illustrative calculation follows this list.
- The average server size decreased by ~20% in 2022, which materially impacts unit costs,
- Prior years saw annual unit rate reductions of ~7-10%, and
- The 2022 unit rate decrease is likely overstated.
- Upon further review, Apptio and our ISG benchmark provider adjusted the 2022 unit rate to ~$2,400 (August 22, 2023 release), which still represents a considerable decrease but is more in line with expected market ranges.
- Some of this decrease can be attributed to the use of smaller server sizes and a "catch-up" in unit costs for prior years.
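The illustrative calculation below shows how a ~20% reduction in average server size compounds with a normal ~7-10% annual decline to produce an unusually large one-year decrease. The baseline rate is a hypothetical number, not the published benchmark.

```python
# Illustrative arithmetic only (hypothetical baseline; not the published rates):
# a ~20% drop in average server size compounds with the normal ~7-10% annual
# unit rate decline to produce an unusually large one-year decrease.

baseline_rate = 3000          # hypothetical prior-year unit rate ($/server)
normal_decline = 0.085        # midpoint of the ~7-10% annual trend
size_effect = 0.20            # ~20% smaller average server size

new_rate = baseline_rate * (1 - normal_decline) * (1 - size_effect)
print(f"implied unit rate: ${new_rate:,.0f} ({new_rate/baseline_rate - 1:+.1%})")
# -> roughly a 27% one-year decrease, well beyond the normal trend
```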
Storage > Online - Storage unit costs typically decline 15-25% annually, a consistent trend driven by continual advances in technology and scale. The reductions for 2022 exceeded this normal annual trend. While ISG expects storage costs to continue to decline, the decrease observed in 2022 was likely overstated, so we applied a fix that limits the 2022 cost decrease and keeps the multi-year trend aligned with the 15-25% annual reduction.
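A minimal sketch of that kind of fix, assuming a simple cap on the year-over-year decrease (the actual adjustment methodology is ISG's, and the rates shown are hypothetical):

```python
# Minimal sketch of the fix described above: cap the one-year decrease so the
# multi-year trend stays within the expected 15-25% annual reduction band.

MAX_ANNUAL_DECREASE = 0.25  # upper bound of the normal 15-25% band

def cap_decrease(prior_rate: float, observed_rate: float) -> float:
    """Limit a year-over-year unit cost decrease to MAX_ANNUAL_DECREASE."""
    floor = prior_rate * (1 - MAX_ANNUAL_DECREASE)
    return max(observed_rate, floor)

prior, observed = 100.0, 55.0         # observed -45% looks overstated
print(cap_decrease(prior, observed))  # -> 75.0, i.e. a capped -25%
```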
Compute > Servers benchmark screenshot

Storage > Online benchmark screenshot

#Benchmarking(ITBenchmarking)