Cognos Analytics

  • 1.  Interactive Report Execution Time Limit

    Posted Thu October 09, 2025 09:03 AM
    I'd like to ask the community whether anyone has come up with a solution for users executing detail reports that cover excessive timeframes (such as an entire month of detail data rather than summary data), which results in millions of rows being returned. Ideally there would be a way to break a large result set into smaller chunks that are executed sequentially, and I'd like to do this as transparently to the user as possible.
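    For illustration only, and outside Cognos itself: the chunking idea, splitting one large extract into smaller result sets executed sequentially, can be sketched with keyset pagination. Everything below (the table, columns, and chunk size) is a hypothetical stand-in for the real data source, shown against an in-memory SQLite database.

```python
import sqlite3

# Hypothetical stand-in for a large detail table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE detail (id INTEGER PRIMARY KEY, amount REAL)")
conn.executemany("INSERT INTO detail (id, amount) VALUES (?, ?)",
                 [(i, i * 0.5) for i in range(1, 10_001)])

def fetch_in_chunks(conn, chunk_size=2500):
    """Fetch the full result set in fixed-size chunks, sequentially,
    using keyset pagination on the primary key (no OFFSET scans)."""
    last_id = 0
    while True:
        rows = conn.execute(
            "SELECT id, amount FROM detail WHERE id > ? ORDER BY id LIMIT ?",
            (last_id, chunk_size),
        ).fetchall()
        if not rows:
            break
        yield rows
        last_id = rows[-1][0]  # resume after the last key seen

total = sum(len(chunk) for chunk in fetch_in_chunks(conn))
print(total)  # 10000
```

    The caller sees one continuous extract, but each chunk is a separate bounded query, which is what keeps any single execution from running away.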

    --

    Dax Lawless

    Cslt III Spec-PS Consulting
    Consulting & Advisory Services
    Verizon Business Group

    M 3195382382



  • 2.  RE: Interactive Report Execution Time Limit

    Posted Thu October 09, 2025 09:12 AM

    In general, building reports that show hundreds of thousands of rows of data is never a good idea. It's much better to show that data grouped or in a graph. That's my quick thought on the matter.



    ------------------------------
    Joeri Willems
    ------------------------------



  • 3.  RE: Interactive Report Execution Time Limit

    Posted Fri October 10, 2025 04:24 AM
    Edited by jonathan chesterton Fri October 10, 2025 04:25 AM

    I think a lot will also depend on what your users are expecting to use the data for and their reason for doing it. Are they just grabbing large volumes of data so they can put it into Excel? If so, then maybe your BI solution isn't providing the right answers, or your users don't know how to use it!

    I always push for "less is best" when using a BI tool to provide a solution. The idea is usually to provide an answer or to highlight/alert on something.

    Filter those reports, filter those dashboards, and provide high-level summary reports and dashboards that let your users filter for the answers they want. Then, if needed, they can drill through to detail that has already been constrained by the earlier reports.

    Running large interactive reports is never a great idea, and if they must be run, they are best run as a background request.



    ------------------------------
    Jonathan Chesterton
    NHS Supply Chain
    ------------------------------



  • 4.  RE: Interactive Report Execution Time Limit

    Posted Fri October 10, 2025 02:23 AM

    Let's say you have had the discussion with the users and, after hearing all the reasons not to do this, they still insist on it as a requirement: they must have a report with a million rows in it.

    There is obviously no point in running the report interactively: what are they going to do with a million rows of data in a browser? By the time they have finished reading a million rows, the data will be out of date. I would argue the same for PDF: what would you do with a million rows of data in a PDF?

    That sort of leaves either Excel or one of the other data formats, such as CSV.

    Within a single report you could run multiple queries in parallel, perhaps split up by master-detail. But this only works in Excel format; the other formats run just a single query in the report.

    And after the report has finished running the queries in parallel, there is the overhead of assembling the extracted data into a single Excel file, which cannot be done in parallel.
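    The parallel-query / sequential-assembly pattern described above can be sketched as follows; the partitioned "query" function, the partition boundaries, and the output format are all hypothetical stand-ins, not actual Cognos behavior.

```python
import csv
import io
from concurrent.futures import ThreadPoolExecutor

def run_partition(day_range):
    """Stand-in for one of the parallel queries: returns the detail
    rows for a sub-range of days."""
    start, end = day_range
    return [(day, f"detail-{day}") for day in range(start, end)]

partitions = [(1, 11), (11, 21), (21, 31)]  # e.g. a month split in thirds

# Run the partitioned queries in parallel.
with ThreadPoolExecutor(max_workers=3) as pool:
    results = list(pool.map(run_partition, partitions))  # order preserved

# Assembly into a single file is inherently sequential.
buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(["day", "detail"])
for chunk in results:
    writer.writerows(chunk)

n_rows = buf.getvalue().count("\n") - 1  # data rows written
```

    The fetch phase scales with the number of workers; the assembly phase stays single-threaded, which matches the overhead described in the post.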

    Whilst Cognos can meet this requirement, I would say go back to basics: what exactly is the requirement? Is it a report or a data extraction? If the latter, then look at running extracts directly against the database.



    ------------------------------
    Marc Reed
    Reporting Lead
    ------------------------------



  • 5.  RE: Interactive Report Execution Time Limit

    Posted Fri October 10, 2025 04:34 AM

    I can only give kudos to @Marc Reed and @jonathan chesterton. That's the same approach I use for those requirements in my CA environment.

    Getting your users to understand this is critical. Extracting millions of records from a system is never a good idea, and it leads me to the question: do the users trust the data?

    I've experienced questionable data exports a lot, and usually the user wanted it that way because they needed to check the data manually, which shouldn't be the task if the goal is to make decisions based on data. Therefore I build all detail reports with conditional blocks. There is a variable in place that counts the records in the query's result set. If that count exceeds a certain number (say 65,000-100,000), the block displays only a message: no data for you today, please set more filters.
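    The conditional-block guard described above can be sketched outside Cognos as a count-first check; the table, threshold, and helper function below are hypothetical stand-ins, shown against an in-memory SQLite database.

```python
import sqlite3

MAX_ROWS = 65_000  # threshold in the spirit of the post (~65,000-100,000)

# Hypothetical stand-in for a large detail table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE detail (id INTEGER PRIMARY KEY, val TEXT)")
conn.executemany("INSERT INTO detail (id, val) VALUES (?, ?)",
                 [(i, "x") for i in range(1, 100_001)])

def guarded_fetch(conn, where_clause="1=1", max_rows=MAX_ROWS):
    """Count the result set first; only fetch the detail rows if the
    count is under the threshold, otherwise return a message instead."""
    (count,) = conn.execute(
        f"SELECT COUNT(*) FROM detail WHERE {where_clause}").fetchone()
    if count > max_rows:
        return None, f"{count} rows match; please set more filters."
    rows = conn.execute(
        f"SELECT id, val FROM detail WHERE {where_clause}").fetchall()
    return rows, None

rows, msg = guarded_fetch(conn)                  # too broad: refused
rows2, msg2 = guarded_fetch(conn, "id <= 1000")  # filtered: allowed
```

    The COUNT(*) probe is cheap relative to materializing the full result set, which is what makes this guard affordable to run on every execution.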

    Detail reports have their place in a BI tool, but only if a low number of detail records is displayed. If possible, link summary reports to detail reports with drill-throughs: the user selects a data point in the summary report and jumps to the detail report, which is automatically filtered with the same criteria as the summary report. That should shrink the data volume.



    ------------------------------
    Robert Dostal
    Principal Expert BI
    GEMÜ Gebr. Müller Apparatebau GmbH & Co. KG
    Kupferzell
    ------------------------------



  • 6.  RE: Interactive Report Execution Time Limit

    Posted Mon October 13, 2025 09:11 AM

    Unfortunately, while having "technical limitation" conversations with individual users might work within an internal organization, there is still no solution that would prevent any particular user from breaking the system, since ours is intended as a multi-user/multi-organizational system for external users. Documenting "best practices" doesn't prevent bad user behavior.

    @Marc Reed:

    In this particular situation, the report in question is intended to run on a schedule as a batch report with relatively fixed parameters, producing a CSV/Excel file that is delivered on the back end via MFT.

    However, this particular user wanted the same detail data on demand, which unfortunately resulted in a massive data dump because of the scope of the prompts available.

    This impacted disk read/write performance (due to file caching) and overall system memory usage, which in turn degraded the experience of other users.

    As far as solutions are concerned, I was looking for a technical solution that would cancel the report run if it exceeded a certain runtime. Normally this is set via the tuning setting "Queue time limit of report service (seconds)". However, that setting apparently doesn't work as described: this report ran interactively for several hours even though the setting is at roughly 240 seconds.

    If there are specifics to the tuning setting that aren't generally available, I'd like to know.

    Unfortunately, our system has hundreds of what amount to unique reports, which makes redevelopment time-consuming and problematic.




    ------------------------------
    Dax Lawless
    ------------------------------



  • 7.  RE: Interactive Report Execution Time Limit

    Posted Mon October 13, 2025 09:18 AM

    @Dax Lawless,
    if you're looking for a setting to avoid long queries, have a look at "qs.queryExecution.queryTimeout", which I have set in my environment. Maybe this helps.

    Query service advanced settings - IBM Documentation

    Cheers
    Robert



    ------------------------------
    Robert Dostal
    Principal Expert BI
    GEMÜ Gebr. Müller Apparatebau GmbH & Co. KG
    Kupferzell
    ------------------------------



  • 8.  RE: Interactive Report Execution Time Limit

    Posted Tue October 14, 2025 03:19 PM

    Dax, I have had to do something similar in the past. My solution was to implement query row and/or execution-time limits in Cognos, but the better option was to have the DBA implement limits at the data source. The Cognos option takes effect when the number of rows processed or the given timeout is reached; the data-source option kicks in sooner, because the database makes the determination before returning any data to Cognos. The Cognos option is more user-friendly, though, as the user is presented with a message indicating that the row/time threshold was reached (although there may be some frustration in waiting for results only to be stopped).
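    As a rough analogue of an execution-time limit (not the actual Cognos or vendor-database mechanism), here is a sketch using SQLite's progress handler to abort a query once a wall-clock budget is exceeded; real databases expose native equivalents, such as statement timeouts, which is the data-source-side option described above.

```python
import sqlite3
import time

def run_with_timeout(conn, sql, seconds):
    """Abort the query if it runs longer than the given budget."""
    deadline = time.monotonic() + seconds
    # Called every 1000 SQLite VM ops; a nonzero return aborts the query
    # with sqlite3.OperationalError.
    conn.set_progress_handler(
        lambda: 1 if time.monotonic() > deadline else 0, 1000)
    try:
        return conn.execute(sql).fetchall()
    finally:
        conn.set_progress_handler(None, 0)

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (n INTEGER)")
conn.executemany("INSERT INTO t VALUES (?)", [(i,) for i in range(500)])

# A cheap query finishes well within its budget:
rows = run_with_timeout(conn, "SELECT COUNT(*) FROM t", 5.0)

# An expensive cross join exceeds a zero budget and is cancelled:
try:
    run_with_timeout(conn, "SELECT COUNT(*) FROM t a, t b, t c", 0.0)
    timed_out = False
except sqlite3.OperationalError:
    timed_out = True
```

    The trade-off mirrors the one in the post: a client-side (here, Cognos-side) limit can surface a friendly message, while a source-side limit stops the work earliest.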

    I concur with the others: what is the use case? No analyst is going to sift through millions of records; they are likely using the output as a source for something else. Determine what they are trying to accomplish and help them from a BI perspective, and they will appreciate it in the end. Otherwise, have them make a data-dump request of the DBAs to export the data in CSV or some other text format.



    ------------------------------
    Dion Paul
    Cognos Administrator
    Ascend SC
    Daytona Beach FL
    dpaul@scgts.com
    ------------------------------