Hi,
I'd like to throw in Dynamic Cubes, as they can address all three of your questions. Depending on what you want to compare and which database vendor you use, I can really recommend Dynamic Cubes, provided your data source is a regular, supported database. See
https://www.ibm.com/docs/en/cognos-analytics/11.1.0?topic=cubes-verify-system-requirements-cognos-dynamic
- Dynamic Cubes allow incremental data loads by default via a transaction column.
- Dynamic Cubes are designed to handle very large data volumes: they have five cache levels for fast response times, plus the ability to use pre-aggregated data in the database.
- Hardware requirements can be calculated in Dynamic Cube Designer itself, but they depend on the number of data records and the structure of your dimensions.
As a positive "side effect": users need a little training on how to get data out of a cube instead of querying flat tables with columns. In my opinion that's a plus, as cube hierarchies are much easier to use than flat tables.
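To illustrate the incremental-load idea mentioned above: a delta load driven by a transaction column only fetches rows newer than the last loaded transaction id on each refresh. This is a minimal, hypothetical sketch of that pattern against SQLite; the table and column names (sales, txn_id) are illustrative and not part of any Cognos API.

```python
import sqlite3

# Illustrative source table with a monotonically increasing transaction id.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (txn_id INTEGER, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [(1, 10.0), (2, 20.0), (3, 30.0)])

def incremental_load(conn, last_txn_id):
    """Fetch only rows added since the last load (delta load) and
    return them together with the new watermark."""
    rows = conn.execute(
        "SELECT txn_id, amount FROM sales WHERE txn_id > ? ORDER BY txn_id",
        (last_txn_id,),
    ).fetchall()
    new_watermark = rows[-1][0] if rows else last_txn_id
    return rows, new_watermark

# First load picks up everything; later loads only see the delta.
rows, watermark = incremental_load(conn, 0)           # all 3 rows, watermark 3
conn.execute("INSERT INTO sales VALUES (4, 40.0)")
delta, watermark = incremental_load(conn, watermark)  # only the new row
```

The key design point is the watermark: as long as the transaction column only ever grows, each refresh is a cheap range query instead of a full reload.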
If you'd like to know more, just ask.
------------------------------
Robert Dostal
Team Leader BI
GEMÜ
Ingelfingen
------------------------------
Original Message:
Sent: Fri March 04, 2022 05:44 AM
From: Philipp Hornung
Subject: Cognos Dataset Evaluation
Hi, I have strong doubts that Data Sets are suitable for a billion-plus non-aggregated records (though I can't remember where I read official information about this). I recommend considering direct database access for this purpose (whether via Framework Manager or via Data Modules). Since Data Sets are more or less equivalent to report queries, I can't imagine that delta updates are on the IBM roadmap.
------------------------------
Philipp Hornung
Original Message:
Sent: Thu March 03, 2022 06:02 PM
From: Enigmatic Forever
Subject: Cognos Dataset Evaluation
Hi
For an upcoming project we are evaluating a few BI tools on the market, and the plan is to pick the tool with the highest merits compared to the others. I'm writing up a few points comparing Cognos Data Sets to their competitors and need assistance with the points below:
- Is there a plan to add an incremental load option (with insert, update and delete) to Cognos Data Sets, and if so, is there a timeframe for when it will be added?
- Any thoughts or experience with Cognos Data Sets handling large data volumes (a billion-plus records) of non-aggregated data, and the performance impact?
- The backend Cognos Data Set file (an Apache Parquet file) sits on the Content Store/Cognos server; can this be remapped to another Windows file directory to reduce the footprint on the Cognos server?
Thanks
------------------------------
Enigmatic Forever
------------------------------
#CognosAnalyticswithWatson