At #txc2025, I gave a tech talk titled Beyond the Tool: Planning Analytics as a Central Data Hub [1860], where I shared our journey from a traditional data warehouse architecture to a decentralized Data Mesh—centered around Planning Analytics.
I wanted to expand on the session here in a blog post, as I believe the experiences, challenges, and lessons learned could be valuable for others navigating similar transitions. And who knows—maybe someone out there has thoughts or feedback that could help refine the approach even further.
My Journey from Data Warehousing to Planning Analytics
I’d like to begin by sharing a bit about my background, as it’s highly relevant to the story I want to tell in this post.
I hold a degree in Economics but have worked in IT since the last century. My career began at a small ERP company that made a strategic decision early on: instead of building its own reporting solution, it chose to use third-party tools. That’s how I first got involved with Cognos, creating cubes and reports for ERP customers.
As that ERP system declined in the market, we expanded our reporting services to support other systems. In 2001, I built my first data warehouse, based on JD Edwards, and that marked the beginning of a long journey combining business data with advanced reporting solutions.
Later, we added Cognos Enterprise Planning to our stack, again for the same reason: the built-in ERP budgeting tools were insufficient. My role focused on delivering actuals and dimension data from the data warehouse, and integrating this with budgeting inputs to create complete Cognos reports.
Over time, the landscape evolved—Cognos acquired TM1, and was itself eventually acquired by IBM. As planning solutions grew in complexity, governance became a more critical issue. I used my data warehouse background and methodology to ensure more robust and stable planning solutions. Our TM1 developers often had strong finance knowledge, but lacked best practices in code and model management. I helped simplify and clean up Planning Analytics models by reducing unnecessary rules and processes.
Interestingly, it’s only within the last year that I’ve built solutions within Planning Analytics itself. The reason? The platform is on a new journey—moving toward becoming a standalone solution, rather than just an extension of a centralized data warehouse. And that shift is exactly what this story is about.
Traditional Planning Analytics project with a data warehouse
As illustrated in the figure, Planning Analytics was introduced as an add-on to incorporate budgeting and forecasting capabilities. The process was closely aligned with the data warehouse structure, and the resulting reports were typically distributed via Cognos, and later through Power BI.
Within this architecture, the data warehouse served multiple key functions, including historical data storage, source system management, data aggregation, and dimension modeling.
As the planning solution grew in complexity, the need for stable and traceable logic became increasingly important. To address this, I leveraged the database layer to externalize the core logic of Planning Analytics. Rather than embedding complex SQL and processing logic within TI processes, I transitioned this functionality into views and tables within a dedicated database schema tied to the Planning Analytics solution.
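To sketch what this pattern can look like (all schema, table, and view names here are hypothetical, not taken from the actual solution), the join and aggregation logic that would otherwise sit inside a TI process is instead defined as a view in a schema dedicated to the Planning Analytics solution:

```sql
-- Hypothetical dedicated schema for the Planning Analytics solution.
-- Keeping PA-specific objects here separates them from the core
-- data warehouse objects (assumed to live in a "dw" schema).
CREATE SCHEMA pa;

-- The view encapsulates the join and aggregation logic, so the TI
-- process only needs a trivial query against it, e.g.:
--   SELECT * FROM pa.v_actuals_by_account
CREATE VIEW pa.v_actuals_by_account AS
SELECT
    d.account_code,
    d.cost_center,
    f.fiscal_period,
    SUM(f.amount) AS amount
FROM dw.fact_gl AS f
JOIN dw.dim_account AS d
    ON d.account_key = f.account_key
GROUP BY
    d.account_code,
    d.cost_center,
    f.fiscal_period;
```

With the logic in the database, it can be versioned, tested, and traced like any other database object, while the TI process stays a thin loader.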
This approach mirrors how TM1 operated within the data warehouse context. It was essential to avoid implementing custom solutions for Planning Analytics directly in the data warehouse. However, maintaining both within the same database instance proved highly efficient, as it allowed for seamless access control to shared facts and dimensions.