Cognos Analytics

Connect, learn, and share with thousands of IBM Cognos Analytics users!

Managing Multiple Cognos Environments

  • 1.  Managing Multiple Cognos Environments

    Posted Tue November 16, 2021 07:23 PM
    Hi there Community,

    I'm really interested to hear how you're managing your multi-environment Cognos setups.

    Cognos has progressed rapidly over the years, becoming much more self-service oriented, which IMO makes it more and more difficult to maintain the traditional dev, test, and prod environments. Back in the day when I worked for the big banks in London, we had a very rigorous development and prod promotion regime. These days, however, in order for Cognos to play catchup with the competitors, IBM has introduced more and more features enabling users to upload their own data, create datasets, and pretty much do their own modeling.

    I'm keen to understand how you're keeping your separate environments in sync and controlling what's published in prod.

    Cheers....

    ------------------------------
    Chris Adams
    ------------------------------

    #CognosAnalyticswithWatson


  • 2.  RE: Managing Multiple Cognos Environments

    Posted Wed November 17, 2021 02:37 AM
    Hi Chris,
    from my perspective the answer is really easy: we kept the strict publish path from dev to prod. Uploading user-specific data into Cognos Analytics is not allowed. Every single data source has to be integrated into our Data Warehouse. If someone wants to analyze Excel-based data, they usually do that in Excel itself. Almost all data that's needed for analytics is stored in the DWH anyway.
    I personally don't see any need for uploaded data, as it would break the "single point of truth" design of the whole system. It's still a constant battle with some colleagues who have MS Access and create their own data and measures. IMHO the worst case is when people in meetings have to discuss the right data definition and its quality instead of talking about the data values themselves. So as long as I'm in charge of the analytics landscape, I won't allow uploaded data ;).

    ------------------------------
    Robert Dostal
    Team Leader BI
    GEMÜ
    Ingelfingen
    ------------------------------



  • 3.  RE: Managing Multiple Cognos Environments

    Posted Wed November 17, 2021 08:42 AM
    Ha! Great post. Once our users saw what they could do, the horse was out of the barn and long gone. We have some IT-generated datasets and file uploads that go through the traditional dev, test, prod protocol. User-generated datasets have exploded and quickly overran the capacity of IT change control. But then, the intent is to circumvent IT change control. Much of Cognos Analytics work doesn't fit nicely into the dev, test, prod framework, because analytics is about learning, and that typically involves a lot of trial and error before eventually succeeding.

    We have reached something of a compromise: users do most of their dataset work in the test environment and then submit a change request to move to production. We don't keep the dev environment in sync with test for user-generated data sets.

    ------------------------------
    Andrew Newell
    ------------------------------



  • 4.  RE: Managing Multiple Cognos Environments

    Posted Wed November 17, 2021 09:46 AM
    Users always compare Cognos with the other BI tools, so we allow Excel sheets to create a dataset in the test and QA environments. We recommend moving to an FM model if it is an extensive dataset or one that updates every day or once a week. Users love to use small datasets for their data requirements. We move the small-dataset reports to production and monitor them for six months. If they don't get used, we purge these reports and datasets.

    Thanks,
    Ram

    ------------------------------
    Ramanujam Rajagopal
    ------------------------------



  • 5.  RE: Managing Multiple Cognos Environments

    Posted Thu November 18, 2021 03:29 AM
    :-D Ramanujam, How did you communicate that you delete unused stuff after six months?

    ------------------------------
    Philipp Hornung
    ------------------------------



  • 6.  RE: Managing Multiple Cognos Environments

    Posted Thu November 18, 2021 09:35 AM
    We are in the process of switching from FM to modules and data sets. We haven't done much with data uploads except for budgets. Users create budget files, then upload them for actuals-vs-budget reporting. We considered creating budget tables in the warehouse, but uploading straight into Cognos avoided the significant warehouse overhead.

    We use data sets extensively. Performance for dashboard reporting is far superior to running queries against the warehouse. We have a few data sets created by IT. One is a cube replacement; it's much easier to integrate with other data sources than a cube. We can also integrate with our warehouse security tables for data security, which gives us unified data-level security. A couple of data sets are approaching 10 million rows. They summarize the warehouse data and significantly improve report query performance. We considered summarizing in the warehouse, but there is so much less overhead with data sets.

    Users create the majority of our data sets. They pull from the warehouse to build client focused data sets for dashboard reporting that would be impossible against the warehouse without a lot of ETL. 

    The ease of creating data sets and significant report/dashboard query performance gains have mitigated the one version of the truth concerns. Data sets expand how we think about reporting/analytics.

    ------------------------------
    Andrew Newell
    ------------------------------



  • 7.  RE: Managing Multiple Cognos Environments

    Posted Fri November 19, 2021 06:10 AM
    Hi Andrew,

    Just curious, but can you explain a bit more how you 'integrate with our warehouse security tables for data security. That gives us unified data level security'?
    Do users use reports/dashboards with a dataset-only approach, or are datasets joined/linked to an enterprise IT data module and/or FM package that contains the security?

    ------------------------------
    Thomas van der Meer
    ------------------------------



  • 8.  RE: Managing Multiple Cognos Environments

    Posted Tue November 23, 2021 09:01 AM
    Hi Thomas,

    We have an ETL process that creates a table of the user IDs that have access to particular client IDs. We grab the user ID when the user runs a report and determine which clients the user can see. We use this result set to filter the main fact query. The fact query can come either from database tables or from a data set.
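
    Andrew's pattern can be sketched outside Cognos. A minimal SQLite sketch, with hypothetical table and column names (user_client_access, sales_fact); in the real report the user ID would come from the session, e.g. via the #sq($account.personalInfo.userName)# macro, rather than a bound parameter:

    ```python
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.executescript("""
    -- ETL-maintained mapping of user IDs to the client IDs they may see
    CREATE TABLE user_client_access (user_id TEXT, client_id INTEGER);
    -- main fact table (or data set) being secured
    CREATE TABLE sales_fact (client_id INTEGER, amount REAL);

    INSERT INTO user_client_access VALUES ('andy', 1), ('andy', 2), ('eby', 3);
    INSERT INTO sales_fact VALUES (1, 100.0), (2, 250.0), (3, 75.0);
    """)

    # The session user's allowed clients filter the main fact query.
    rows = conn.execute("""
    SELECT f.client_id, f.amount
    FROM sales_fact f
    WHERE f.client_id IN (
        SELECT a.client_id FROM user_client_access a WHERE a.user_id = ?
    )
    ORDER BY f.client_id
    """, ("andy",)).fetchall()

    print(rows)  # only clients 1 and 2 are visible to 'andy'
    ```

    The same subquery works whether the fact source is a warehouse table, a data module table, or a data set, which is why the filter can live on the module table or FM query subject rather than in each report.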

    Andy

    ------------------------------
    Andrew Newell
    ------------------------------



  • 9.  RE: Managing Multiple Cognos Environments

    Posted Tue November 23, 2021 09:55 AM

    Hi Andrew,

    Are you using the security filter in the data module for this, or are you creating this filter in every report you create? And what about dashboards?
    And what does that filter look like in the report? Something like this in every report?
    Query1: select distinct clientID from tablewithUserIDs where userID contains #sq($account.personalInfo.userName)#
    Filter in main query: fact.clientID in ( Query1.clientID)



    ------------------------------
    Thomas van der Meer
    ------------------------------



  • 10.  RE: Managing Multiple Cognos Environments

    Posted Tue November 23, 2021 01:28 PM
    The filter is on the module table, data set or the FM query subject.

    ------------------------------
    Andrew Newell
    ------------------------------



  • 11.  RE: Managing Multiple Cognos Environments

    Posted Mon November 22, 2021 06:39 AM
    I don't understand your statement, "The ease of creating data sets and significant report/dashboard query performance gains have mitigated the one version of the truth concerns." I would think an organization would want that "single version of the truth" in all their reporting, and placing the data from the data sets into the warehouse would provide that.

    ------------------------------
    Scott Gaffney
    ------------------------------



  • 12.  RE: Managing Multiple Cognos Environments

    Posted Tue November 23, 2021 09:23 AM

    Hi Scott,

    We rely heavily on client-facing dashboards. In order to achieve the performance expected of dashboards, we have close to 100 data sets created from queries that filter the main warehouse fact tables for clients or client groups. I suppose we could have created these objects in the database, but even then the performance from data sets would be much better. For the most part there's not a lot of logic in the data sets, so we're not straying from the one version of the truth. It's a small group of power users that create the data sets, and we avoid the change control process in the warehouse.

    Andy



    ------------------------------
    Andrew Newell
    ------------------------------



  • 13.  RE: Managing Multiple Cognos Environments

    Posted Tue November 23, 2021 06:12 AM

    Thanks Andrew. We have been on Cognos Analytics for at least 2 years now, and we are currently on CA 11.2. We do have some data modules and data sets, but we shied away from them when we found out we could not implement row-level security on the data while using data sets and data modules. So I'm curious to know how you have implemented that.





    ------------------------------
    Eby Kuriakose
    ------------------------------



  • 14.  RE: Managing Multiple Cognos Environments

    Posted Thu November 25, 2021 07:55 AM
    Hi Eby, the row-level security options are more limited compared to Framework Manager, but I wouldn't say there are none. What about the following approach: give separate user groups separate data source connections (and logins) in Cognos Administration. Each data source connection (login) has its own database user. Filter that database user against a corresponding column in the data module ({current_user} = [table].[user group]).
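
    Philipp's suggestion can be sketched like this, with hypothetical names (region_map, user_group): each group's data source connection signs on with its own database user, and the data-module filter compares that login, which {current_user} resolves to in Cognos, against a group column. In plain Python/SQLite, with the database login passed in explicitly:

    ```python
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.executescript("""
    -- hypothetical mapping of database users (one per group's connection)
    -- to the rows that group is allowed to see
    CREATE TABLE region_map (user_group TEXT, region TEXT);
    INSERT INTO region_map VALUES ('grp_emea', 'EMEA'), ('grp_amer', 'AMER');
    """)

    def visible_regions(db_login):
        # db_login stands in for {current_user}: the database user of the
        # data source connection the group was assigned
        return [region for (region,) in conn.execute(
            "SELECT region FROM region_map WHERE user_group = ?", (db_login,))]

    print(visible_regions("grp_emea"))  # ['EMEA']
    ```

    The trade-off is administrative: one database user and connection per group to maintain, versus the per-user mapping table in Andrew's approach.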

    ------------------------------
    Philipp Hornung
    ------------------------------



  • 15.  RE: Managing Multiple Cognos Environments

    Posted Thu November 25, 2021 03:23 PM
    That should not be necessary. There has been data security, aka row-level security, since 11.1.

    You access it via the sources slide-out.


    You can define filters based on a user, group, or role.

    You can read more about it here:


    https://www.ibm.com/docs/en/cognos-analytics/11.2.0?topic=security-adding-data

    https://www.ibm.com/docs/en/cognos-analytics/11.2.0?topic=sources-securing-data
    (Screenshots: sources slide-out, data security UI, expression editor data security filter, security filter dialog.)


    ------------------------------
    IAN HENDERSON
    ------------------------------



  • 16.  RE: Managing Multiple Cognos Environments

    Posted Tue November 23, 2021 11:20 AM
    Hi Philipp,

    Sorry for my delayed response.
    I send an audit report to the users every six months, showing which of their models used the dataset.

    The audit report SQL, from the audit package:

    SELECT
    "COGIPF_RUNREPORT"."COGIPF_PACKAGE",
    INSTR("COGIPF_RUNREPORT"."COGIPF_PACKAGE", '/package['),
    SUBSTR("COGIPF_RUNREPORT"."COGIPF_PACKAGE", INSTR("COGIPF_RUNREPORT"."COGIPF_PACKAGE", '/package[') + 16, INSTR(SUBSTR("COGIPF_RUNREPORT"."COGIPF_PACKAGE", INSTR("COGIPF_RUNREPORT"."COGIPF_PACKAGE", '/package[') + 16), ']') - 2),
    INSTR("COGIPF_RUNREPORT"."COGIPF_PACKAGE", '/module['),
    SUBSTR("COGIPF_RUNREPORT"."COGIPF_PACKAGE", INSTR("COGIPF_RUNREPORT"."COGIPF_PACKAGE", '/module[') + 15, INSTR(SUBSTR("COGIPF_RUNREPORT"."COGIPF_PACKAGE", INSTR("COGIPF_RUNREPORT"."COGIPF_PACKAGE", '/module[') + 15), ']') - 2),
    INSTR("COGIPF_RUNREPORT"."COGIPF_PACKAGE", '/uploadedFile['),
    SUBSTR("COGIPF_RUNREPORT"."COGIPF_PACKAGE", INSTR("COGIPF_RUNREPORT"."COGIPF_PACKAGE", '/uploadedFile[') + 21, INSTR(SUBSTR("COGIPF_RUNREPORT"."COGIPF_PACKAGE", INSTR("COGIPF_RUNREPORT"."COGIPF_PACKAGE", '/uploadedFile[') + 21), ']') - 2),
    INSTR("COGIPF_RUNREPORT"."COGIPF_PACKAGE", '/dataSet2['),
    SUBSTR("COGIPF_RUNREPORT"."COGIPF_PACKAGE", INSTR("COGIPF_RUNREPORT"."COGIPF_PACKAGE", '/dataSet2[') + 17, INSTR(SUBSTR("COGIPF_RUNREPORT"."COGIPF_PACKAGE", INSTR("COGIPF_RUNREPORT"."COGIPF_PACKAGE", '/dataSet2[') + 17), ']') - 2),
    SUBSTR("COGIPF_RUNREPORT"."COGIPF_PACKAGE", 1, LENGTH("COGIPF_RUNREPORT"."COGIPF_PACKAGE")),
    "COGIPF_USERLOGON0"."COGIPF_USERNAME",
    "COGIPF_RUNREPORT"."COGIPF_REPORTNAME",
    SUBSTR("COGIPF_RUNREPORT"."COGIPF_REPORTNAME", 1, 11),
    SUBSTR("COGIPF_RUNREPORT"."COGIPF_REPORTNAME", 19, LENGTH("COGIPF_RUNREPORT"."COGIPF_REPORTNAME") - 20),
    SUBSTR("COGIPF_RUNREPORT"."COGIPF_REPORTNAME", 1, 7),
    SUBSTR("COGIPF_RUNREPORT"."COGIPF_REPORTNAME", 15, LENGTH("COGIPF_RUNREPORT"."COGIPF_REPORTNAME") - 16),
    SUBSTR("COGIPF_RUNREPORT"."COGIPF_REPORTNAME", 1, 18),
    SUBSTR("COGIPF_RUNREPORT"."COGIPF_REPORTNAME", 26, LENGTH("COGIPF_RUNREPORT"."COGIPF_REPORTNAME") - 27),
    SUBSTR("COGIPF_RUNREPORT"."COGIPF_REPORTNAME", 1, 12),
    SUBSTR("COGIPF_RUNREPORT"."COGIPF_REPORTNAME", 20, LENGTH("COGIPF_RUNREPORT"."COGIPF_REPORTNAME") - 21),
    SUBSTR("COGIPF_RUNREPORT"."COGIPF_REPORTNAME", 1, 6),
    SUBSTR("COGIPF_RUNREPORT"."COGIPF_REPORTNAME", 14, LENGTH("COGIPF_RUNREPORT"."COGIPF_REPORTNAME") - 15),
    "COGIPF_RUNREPORT"."COGIPF_REPORTPATH",
    to_char("COGIPF_RUNREPORT"."COGIPF_LOCALTIMESTAMP", 'mm-dd-yyyy'),
    CASE
    WHEN
    UPPER(SUBSTR("COGIPF_RUNREPORT"."COGIPF_TARGET_TYPE", 1, 1)) IS NULL OR
    LOWER(SUBSTR("COGIPF_RUNREPORT"."COGIPF_TARGET_TYPE", 2, LENGTH("COGIPF_RUNREPORT"."COGIPF_TARGET_TYPE") - 1)) IS NULL
    THEN
    NULL
    ELSE UPPER(SUBSTR("COGIPF_RUNREPORT"."COGIPF_TARGET_TYPE", 1, 1)) || LOWER(SUBSTR("COGIPF_RUNREPORT"."COGIPF_TARGET_TYPE", 2, LENGTH("COGIPF_RUNREPORT"."COGIPF_TARGET_TYPE") - 1))
    END
    FROM
    (
    SELECT
    "COGIPF_USERLOGON"."COGIPF_USERNAME",
    "COGIPF_USERLOGON"."COGIPF_USERID",
    "COGIPF_USERLOGON"."COGIPF_LOG_LEVEL",
    "COGIPF_USERLOGON"."COGIPF_STATUS",
    "COGIPF_USERLOGON"."COGIPF_ERRORDETAILS",
    "COGIPF_USERLOGON"."COGIPF_LOCALTIMESTAMP",
    "COGIPF_USERLOGON"."COGIPF_REQUESTID",
    "COGIPF_USERLOGON"."COGIPF_TIMEZONE_OFFSET",
    "COGIPF_USERLOGON"."COGIPF_SESSIONID",
    "COGIPF_USERLOGON"."COGIPF_LOGON_OPERATION",
    "COGIPF_USERLOGON"."COGIPF_TENANTID"
    FROM
    "COGIPF_USERLOGON"
    WHERE
    "COGIPF_USERLOGON"."COGIPF_LOGON_OPERATION" = 'Logon' AND
    "COGIPF_USERLOGON"."COGIPF_USERNAME" <> 'EMPTY logData'
    ) "COGIPF_USERLOGON0"
    LEFT OUTER JOIN "COGIPF_RUNREPORT" "COGIPF_RUNREPORT"
    ON "COGIPF_USERLOGON0"."COGIPF_SESSIONID" = "COGIPF_RUNREPORT"."COGIPF_SESSIONID"
    WHERE
    "COGIPF_RUNREPORT"."COGIPF_LOCALTIMESTAMP" >=
    CASE
    WHEN EXTRACT( DAY FROM TRUNC(CURRENT_DATE) - NUMTODSINTERVAL( EXTRACT( DAY FROM TRUNC(CURRENT_DATE) ), 'DAY' ) + INTERVAL '1' DAY + NUMTOYMINTERVAL( -6, 'MONTH' ) + INTERVAL '1' MONTH - INTERVAL '1' DAY ) < EXTRACT( DAY FROM TRUNC(CURRENT_DATE) )
    THEN TRUNC(CURRENT_DATE) - NUMTODSINTERVAL( EXTRACT( DAY FROM TRUNC(CURRENT_DATE) ), 'DAY' ) + INTERVAL '1' DAY + NUMTOYMINTERVAL( -6, 'MONTH' ) + INTERVAL '1' MONTH - INTERVAL '1' DAY
    ELSE TRUNC(CURRENT_DATE) + NUMTOYMINTERVAL( -6, 'MONTH' )
    END
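
    For anyone reading along: the CASE in that final WHERE clause computes a cutoff six calendar months back, clamping to the last day of the month when the target month is shorter (run on August 31, it falls back to February 28, since February 31 doesn't exist). A sketch of the same logic in Python, stdlib only:

    ```python
    from datetime import date
    import calendar

    def six_month_cutoff(today):
        # go back six calendar months
        month = today.month - 6
        year = today.year
        if month <= 0:
            month += 12
            year -= 1
        # clamp the day-of-month to the last day of the target month,
        # mirroring the THEN branch of the CASE in the audit SQL
        last_day = calendar.monthrange(year, month)[1]
        return date(year, month, min(today.day, last_day))

    print(six_month_cutoff(date(2021, 8, 31)))   # 2021-02-28
    print(six_month_cutoff(date(2021, 11, 23)))  # 2021-05-23
    ```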

    ------------------------------
    Ramanujam Rajagopal
    ------------------------------