Creating an optimised Data Architecture using Data Modules without needing a Data Warehouse

By Andrew Copeland posted Thu October 13, 2022 02:40 PM

  
Question: How can I get a fast BI reporting and dashboarding environment for smaller enterprises without the budget for a Data Warehouse / Data Lake?

I have been asked this question a few times over the past couple of years.

In a perfect world, you would have your perfectly formed, Kimball-certified Data Warehouse which performs well and gives you everything that you need. However, in this day and age we have to develop FASTER and CHEAPER than before to DELIVER RESULTS. Additionally, you might not have the skillset to develop these structures yourself, or the budget to get someone to do it for you.

So, in this blog I will show you a methodology to do all of this in IBM Cognos Analytics with Watson today.

What makes this possible? Well, thanks to IBM's innovations since Cognos Analytics was released, we now have all the tools we need, namely:

  • Data Modules
  • Data Sets
  • Data Module Linkage
  • Calendar Manipulation and Special Time Categorisation

Architecturally, the environment would look like this:

[Architecture diagram: data sources feed a Foundation Data Module, which feeds Data Sets, which in turn feed subject-specific Reporting Data Modules, with shared Reference Data Modules linked in]

In order to explain my thinking, let's start at the bottom.

As an enterprise, we have a number of data sources that we use for reporting. To bring them together into a staging environment, we use what I have termed a FOUNDATION DATA MODULE. The Foundation Data Module is simply a collection point for all the data sources and the tables/views drawn from them, much like the staging layer of a traditional Data Warehouse. This allows us to collect everything in one place for now. No end users would have access to this data module.

From that Foundation Data Module, we build a series of IBM Cognos Data Sets in which we summarise and transform our data to start making it ready for end-user reporting. Creating Data Sets is really easy, and if you keep to the best practice of under 8 million rows per Data Set you will achieve great query performance. This is the equivalent of the transformation layer in a Data Warehouse.
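As a rough analogy for what this Data Set layer does: Data Sets are built in the Cognos UI, not in code, but the transformation they perform is essentially an aggregation from transactional grain to a summarised grain. The plain-Python sketch below illustrates that step; all table and column names here are made up, and the 8-million-row check simply mirrors the best-practice guideline mentioned above.

```python
# Conceptual sketch only - Cognos Data Sets are created in the UI.
# Hypothetical names throughout; this just shows the shape of the
# "summarise transactional data for reporting" step.
from collections import defaultdict

def build_sales_dataset(transactions, max_rows=8_000_000):
    """Summarise transactional rows to (month, product) grain,
    mimicking a Data Set built on top of a Foundation Data Module."""
    summary = defaultdict(float)
    for t in transactions:
        key = (t["date"][:7], t["product"])   # roll up to YYYY-MM grain
        summary[key] += t["amount"]
    rows = [{"month": m, "product": p, "amount": a}
            for (m, p), a in sorted(summary.items())]
    if len(rows) > max_rows:                  # stay within the ~8M row guideline
        raise ValueError("Data Set exceeds the best-practice row limit")
    return rows

txns = [
    {"date": "2022-01-05", "product": "A", "amount": 100.0},
    {"date": "2022-01-19", "product": "A", "amount": 50.0},
    {"date": "2022-02-02", "product": "B", "amount": 75.0},
]
dataset = build_sales_dataset(txns)
# dataset -> two summarised rows instead of three transactional ones
```

The point of the sketch is the change of grain: three transactional rows become two summarised rows, which is what keeps the downstream query workload small.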

Now that we have our summarised data in Data Sets, we can create subject-specific REPORTING DATA MODULES; you can decide how much or how little information you want to put into them. This layer gives you the ability to add relationships, grouping, context, calculations and so on to your metadata. These modules are also the point where your end users connect in order to construct their Dashboards and Reports.

One of the key capabilities that makes this architecture possible is the ability to link Data Modules together. To save development time and effort, we create REFERENCE DATA MODULES which hold our standard dimensional structures. You can link any of the Reporting Data Modules to these Reference Data Modules and instantly have all of their functions and features. It also means that an annoying change to one dimension only has to be made once, and it is instantly available to all.
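The "change once, available everywhere" behaviour of a linked Reference Data Module can be pictured like this. Again, this is not Cognos code (module linking is done in the UI); it is a plain-Python sketch with hypothetical dimension and fact names, showing two reporting views joined to one shared dimension.

```python
# Conceptual sketch only - hypothetical names; illustrates how a change
# in a shared Reference Data Module is instantly visible to every
# Reporting Data Module linked to it.
product_dim = {"A": {"category": "Widgets"}, "B": {"category": "Gadgets"}}

def reporting_view(fact_rows, dim):
    # Each reporting module joins its own facts to the *shared* dimension.
    return [{**row, **dim[row["product"]]} for row in fact_rows]

sales = [{"product": "A", "amount": 150.0}]
returns = [{"product": "A", "qty": 2}]

# One change in the reference dimension...
product_dim["A"]["category"] = "Premium Widgets"

# ...is immediately reflected in every module that links to it.
sales_view = reporting_view(sales, product_dim)
returns_view = reporting_view(returns, product_dim)
```

Because both views hold a reference to the same dimension rather than a copy, the edit is made once and surfaces everywhere, which is exactly the maintenance saving described above.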

With the release of Special Time Category support from version 11.1.4 onwards, we have had the ability to create MTD, QTD, YTD and Prior Year reporting literally out of the box. Using this feature, I have taken the company calendar, put it into a Reference Data Module, coded the Special Time Categories, and then referenced that module from Reporting Data Modules very quickly.
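In Cognos these categories are configured declaratively on a calendar column in the Data Module rather than coded by hand, but it can help to see what the category membership actually means. The sketch below is a plain-Python illustration of MTD/QTD/YTD/Prior Year bucketing relative to an as-of date; it is not the Cognos implementation.

```python
# Conceptual sketch only - shows what MTD / QTD / YTD / Prior Year
# membership means relative to an as-of date. In Cognos this is
# configured on a calendar column, not written as code.
from datetime import date

def time_categories(d, as_of):
    """Return the relative-date buckets a calendar date falls into."""
    cats = []
    same_year = d.year == as_of.year
    if same_year and d.month == as_of.month and d <= as_of:
        cats.append("MTD")
    # Quarters: months 1-3 -> Q1, 4-6 -> Q2, etc.
    if same_year and (d.month - 1) // 3 == (as_of.month - 1) // 3 and d <= as_of:
        cats.append("QTD")
    if same_year and d <= as_of:
        cats.append("YTD")
    if d.year == as_of.year - 1:
        cats.append("Prior Year")
    return cats

as_of = date(2022, 10, 13)
mtd = time_categories(date(2022, 10, 1), as_of)       # ['MTD', 'QTD', 'YTD']
ytd_only = time_categories(date(2022, 8, 15), as_of)  # ['YTD']
prior = time_categories(date(2021, 10, 5), as_of)     # ['Prior Year']
```

Once the calendar with these categories lives in a Reference Data Module, every Reporting Data Module that links to it inherits the same relative-date behaviour for free.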

Benefits and Challenges
There are a few benefits to this approach, namely:
  1. Speed of development: you can have something up and running, even as a POC, very quickly.
  2. You don't need expert ETL or database-management skills, or colleagues who have them.
  3. Lower operating costs, as you don't need to maintain and run a Data Warehouse.
  4. You are in total control of the design and build.
  5. Data Sets can be provided to people via Cognos Reporting as data subscriptions.
However, it has to be said that there are a few challenges as well, namely:
  1. You can't surface the data to any other BI tool.
  2. Data governance, cleansing and preparation are not performed in the industry best-practice way.
  3. The volumes of data you have may preclude this approach.

Finally, as a disclaimer: I confess to being an absolute fan of Data Modules and the flexibility that IBM has developed, and I make no apology for that. I have been developing them with various clients over the past two years with great success, and many of those clients have now taken on the development baton and continue to build their own. I realise that a lot of folks out there are 'traditionalists' who will insist that a Data Warehouse should always be in place and central to everything; however, this approach has been developed to give you an alternative that delivers a really quick return on your investment.

It would be great to hear your thoughts. Thanks for taking the time to read this article.


#CognosAnalyticswithWatson

Comments

Fri October 14, 2022 04:50 AM

Hi Philipp, 
Agree with you on that. Provided the data volume at transactional level is not too high (under 8 million rows), it's a possibility.

Previously, I have created aggregated data sets to directly support my Dashboards and summarised reporting to ensure they are delivered quickly and efficiently. 

Thanks for responding to my blog.

Regards
Andy

Fri October 14, 2022 04:43 AM

Hi Andrew, thanks for giving an overview of how to use Cognos Analytics as a mini ETL tool. Extending that, you could divide the data set layer into two layers: a persistent staging layer holding raw data from the data sources to minimise query load on them (only possible if you don't get too many rows), and a second transformation layer to aggregate and calculate persistently for good analytics query performance.