Planning Analytics Unleashed: The Journey from Data Warehouse to Data Mesh

By Pål Risa Zachariassen

At #txc2025, I gave a tech talk titled Beyond the Tool: Planning Analytics as a Central Data Hub [1860], where I shared our journey from a traditional data warehouse architecture to a decentralized Data Mesh centered on Planning Analytics.

I wanted to expand on the session here in a blog post, as I believe the experiences, challenges, and lessons learned could be valuable for others navigating similar transitions. And who knows—maybe someone out there has thoughts or feedback that could help refine the approach even further.

My Journey from Data Warehousing to Planning Analytics

I’d like to begin by sharing a bit about my background, as it’s highly relevant to the story I want to tell in this post.

I hold a degree in Economics but have worked in IT since the last century. My career began at a small ERP company that made a strategic decision early on: instead of building its own reporting solution, it chose to use third-party tools. That’s how I first got involved with Cognos, creating cubes and reports for ERP customers.

As that ERP system declined in the market, we expanded our reporting services to support other systems. In 2001, I built my first data warehouse, based on JD Edwards, and that marked the beginning of a long journey combining business data with advanced reporting solutions.

Later, we added Cognos Enterprise Planning to our stack, again for the same reason: the built-in ERP budgeting tools were insufficient. My role focused on delivering actuals and dimension data from the data warehouse, and integrating this with budgeting inputs to create complete Cognos reports.

Over time, the landscape evolved: Cognos acquired TM1, and was itself later acquired by IBM. As planning solutions grew in complexity, governance became a more critical issue. I drew on my data warehouse background and methodology to make planning solutions more robust and stable. Our TM1 developers often had strong finance knowledge but lacked best practices in code and model management, so I helped simplify and clean up Planning Analytics models by reducing unnecessary rules and processes.

Interestingly, it’s only within the last year that I’ve built solutions within Planning Analytics itself. The reason? The platform is on a new journey—moving toward becoming a standalone solution, rather than just an extension of a centralized data warehouse. And that shift is exactly what this story is about.

Traditional Planning Analytics project with a data warehouse


As illustrated in the figure, Planning Analytics was introduced as an add-on to incorporate budgeting and forecasting capabilities. The process was closely aligned with the data warehouse structure, and the resulting reports were typically distributed via Cognos, and later through Power BI.

Within this architecture, the data warehouse served multiple key functions, including historical data storage, source system management, data aggregation, and dimension modeling.

As the planning solution grew in complexity, the need for stable and traceable logic became increasingly important. To address this, I leveraged the database layer to externalize the core logic of Planning Analytics. Rather than embedding complex SQL and processing logic within TI processes, I transitioned this functionality into views and tables within a dedicated database schema tied to the Planning Analytics solution.

This approach mirrors how TM1 operated within the data warehouse context. It was essential to avoid implementing custom solutions for Planning Analytics directly in the data warehouse. However, maintaining both within the same database instance proved highly efficient, as it allowed for seamless access control to shared facts and dimensions.

Dataflow with a data warehouse

Customers want to phase out the data warehouse and replace it with data products

With the introduction of Data Mesh and decentralized solutions, a strategic decision was made to phase out the centralized data warehouse. It would be replaced by domain-owned data products, with clear ownership anchored in the business units.

To support this shift, the IT department initiated the development of a new data platform to serve as the infrastructure for these data products. Planning Analytics was retained as a key capability, but would now operate independently of the data warehouse. This marked the beginning of the process to decouple Planning Analytics from the warehouse architecture.

Dataflow with a data warehouse: first step

The first step was to clearly establish the TM1 database schema as an integral part of the Planning Analytics solution. Ownership of the logic was therefore transferred to Planning Analytics, ensuring clean and consistent integrations with the data warehouse—limited strictly to facts and dimensions.

In summary, Planning Analytics assumed full ownership of the TM1 schema, and Oracle was incorporated into the solution to enable advanced capabilities such as stored procedures and materialized views.
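As a rough illustration of how those Oracle capabilities slot into the TM1 schema, a materialized view plus a refresh procedure might look like the following. This is a sketch under assumed names: the schema name, object names, and columns are all hypothetical, not the production code.

```sql
-- Hypothetical sketch; all object names are illustrative.
-- A materialized view pre-aggregates actuals for fast TI loads:
CREATE MATERIALIZED VIEW tm1.mv_actuals_by_month
  BUILD IMMEDIATE REFRESH COMPLETE ON DEMAND AS
SELECT account_group, period, SUM(amount) AS amount
FROM tm1.v_fact_gl
GROUP BY account_group, period;

-- A stored procedure refreshes it during the nightly load, so the
-- TI chore only needs one call into the database:
CREATE OR REPLACE PROCEDURE tm1.refresh_actuals AS
BEGIN
  DBMS_MVIEW.REFRESH('tm1.mv_actuals_by_month', method => 'C');
END;
/
```

The benefit of this split is that heavy aggregation runs inside Oracle, where it can be scheduled, monitored, and tuned independently of Planning Analytics.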

This architectural separation made it possible to gradually migrate data products, allowing each to be transitioned when ready. As part of this initiative, we have begun moving toward a cloud-based data product by retrieving data using Python and TM1py.
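A minimal sketch of that kind of extraction with TM1py might look like the following. The cube name, dimension names, and connection parameters are placeholders, not the actual solution's configuration.

```python
def build_mdx(cube: str, measure: str, period: str) -> str:
    """Build a minimal MDX query for one measure/period slice of a cube.
    The dimension names [Period], [Account], [Measure] are placeholders."""
    return (
        f"SELECT {{[Period].[{period}]}} ON COLUMNS, "
        f"NON EMPTY {{[Account].Members}} ON ROWS "
        f"FROM [{cube}] WHERE ([Measure].[{measure}])"
    )


def extract_to_rows(params: dict, cube: str, measure: str, period: str):
    """Connect to TM1, run the MDX, and return (coordinates, value) rows
    ready to be handed to a downstream data product."""
    from TM1py import TM1Service  # third-party: pip install tm1py

    with TM1Service(**params) as tm1:
        cellset = tm1.cells.execute_mdx(build_mdx(cube, measure, period))
        return [(coords, cell["Value"]) for coords, cell in cellset.items()]


# Example connection parameters (placeholders only):
params = {"address": "tm1.example.com", "port": 12354,
          "user": "reader", "password": "secret", "ssl": True}
# rows = extract_to_rows(params, "Sales", "Amount", "2024-01")
```

Because the extraction is plain Python, the same script can run wherever the cloud data product lives, without any dependency on the old warehouse infrastructure.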

Target architecture

Dataflow with data products

The objective is to ensure that all data flowing in and out of Planning Analytics is defined as data products with clear ownership anchored in the business. In this architecture, Planning Analytics is no longer a standalone solution—it now relies on supporting components such as Oracle. However, we are evaluating alternatives like Databricks or Data Fabric, which better align with the customer’s selected data platform.

Planning Analytics has been decoupled from the centralized data warehouse and, together with its supporting tools, now operates as an independent system.

As the data warehouse is phased out, it remains crucial to preserve its valuable capabilities—such as staged history, version control, and snapshots—which are essential for maintaining data integrity and traceability.
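One way to retain that snapshot-style history outside the warehouse is an append-only table keyed by load timestamp, so earlier states can always be reconstructed. A minimal sketch, with illustrative names and SQLite standing in for the actual database:

```python
import sqlite3
from datetime import datetime, timezone

# Append-only snapshot pattern: every load is inserted with a timestamp
# instead of overwriting, preserving history and traceability.
# Table and column names are illustrative placeholders.
con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE budget_snapshot (
    snapshot_ts TEXT, account TEXT, period TEXT, amount REAL)""")

def take_snapshot(rows):
    ts = datetime.now(timezone.utc).isoformat()
    con.executemany(
        "INSERT INTO budget_snapshot VALUES (?, ?, ?, ?)",
        [(ts, a, p, v) for a, p, v in rows])
    return ts

ts1 = take_snapshot([("4000", "2024-01", 100.0)])
ts2 = take_snapshot([("4000", "2024-01", 120.0)])  # revised figure

# Reconstruct the state as of the first snapshot:
as_of = con.execute(
    "SELECT amount FROM budget_snapshot WHERE snapshot_ts = ?",
    (ts1,)).fetchone()
print(as_of)  # (100.0,)
```

The same idea scales up to versioned tables or time-travel features in platforms like Databricks, which is one reason those alternatives are under evaluation.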

To avoid unnecessary complexity, we continue to externalize as much logic as possible from Planning Analytics, leveraging support tools to maintain a clean and scalable solution architecture.
