Upgrading to Cognos Analytics
Every process can be broken down into a methodology. After having helped many customers upgrade to Cognos Analytics, I have established a repeatable process. Anecdotal as it may sound, I will include some specific facts that may help you in your journey. The steps include:
- Planning the Upgrade
- Environment Setup
- Establishing a Baseline
- Content Migration
- Validation and Remediation
- Go Live
This session will cover the Baseline phase.
The success of an upgrade can be measured by the execution of a test plan. Building a test plan consists of answering two critical questions.
- What reports need testing
- What scenarios need testing
Determining which reports need to be tested should be based on an agreement with the user community: you need their confidence that sufficient testing confirms their reports will work in the new environment. Does that mean 100% of their reports need testing, or would a sample of reports suffice?
When you collect an inventory of reports you need to make a few decisions.
- Are there duplicates that can be eliminated?
- Are there reports that can be optimized or modernized? Perhaps these are opportunities for another project. It is recommended to keep the upgrade simple and leave this type of effort for a different project.
- Are all these reports still needed? Consider archiving reports that are not needed; if they are requested later, you can retrieve them from the archive.
- Do all these reports still work?
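One practical way to act on the duplicate question above is to scan the inventory for reports that share a name across folders. The sketch below is purely illustrative (it is not a Cognos API); it assumes you have already exported the inventory as a list of path/name pairs, for example from a content store report.

```python
# Illustrative sketch: flag duplicate report names in an inventory export.
# Assumes the inventory was exported as (path, name) pairs; not a Cognos API.
from collections import defaultdict

def find_duplicates(inventory):
    """Group report paths by normalized report name; keep names seen more than once."""
    by_name = defaultdict(list)
    for path, name in inventory:
        by_name[name.strip().lower()].append(path)
    return {name: paths for name, paths in by_name.items() if len(paths) > 1}

inventory = [
    ("/Public/Sales/Q1 Revenue", "Q1 Revenue"),
    ("/Public/Archive/Q1 Revenue", "Q1 Revenue"),
    ("/Public/Sales/Pipeline", "Pipeline"),
]
print(find_duplicates(inventory))
```

A name collision is only a candidate for elimination; the owners still need to confirm whether the copies are truly redundant or intentionally different.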
Some discussion needs to occur about the contents of the reports in the “My Folder” as opposed to those in the “Public Folder”. An argument could be made that anything in the “My Folder” should remain the responsibility of the owner, while anything placed in a shared folder could be considered for the overall test plan. In the end, there are restrictions on accessing reports in the “My Folder” which could dictate their exclusion from the overall test plan.
Determining how the reports should be tested includes a few more decisions.
- Do the reports have multiple use cases that require different tests?
- Do you test against production data or is there a testing/staging database that could be used?
- Do you test on your production Cognos environment or should you stand up a sandbox environment for testing?
- Is the data frozen, or will data refreshes or reports with relative-period requirements skew the results? This matters most for when the target execution occurs and for the output comparison; more on that in a future blog.
- Do you need to test with different security roles?
- Do you have the parameter values for reports with required prompts?
In the end a good test plan will define what needs to be tested and how to test those reports.
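To make the decisions above concrete, each test case can be recorded with the scenario it covers, the prompt values it needs, the security role it runs under, and the database it runs against. The field names below are my own illustration of such a record, not a Cognos structure.

```python
# Illustrative sketch: one way to record test cases from the plan.
# Field names are hypothetical, not part of any Cognos API.
from dataclasses import dataclass, field

@dataclass
class ReportTestCase:
    report_path: str                  # location of the report in the content store
    scenario: str                     # which use case this test case covers
    prompt_values: dict = field(default_factory=dict)  # values for required prompts
    security_role: str = "default"    # role to execute the report under
    data_source: str = "production"   # production vs. staging/test database

plan = [
    ReportTestCase("/Public/Sales/Q1 Revenue", "year-to-date",
                   prompt_values={"Year": "2023"}, security_role="SalesViewer"),
    ReportTestCase("/Public/Sales/Q1 Revenue", "prior-year",
                   prompt_values={"Year": "2022"}),
]
```

Note that one report can appear several times in the plan, once per scenario, which is exactly why counting reports alone understates the testing effort.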
Testing at the source is the beginning. Here, the source is defined as the current environment. After you have defined which reports need to be tested and what scenarios to test, you establish a baseline. The baseline includes all the test cases needed to satisfy the end-user test requirements, along with generated output for each of those test cases in the source environment. That output is used during the validation phase: the validation phase runs the same test cases in the target environment and then compares the two outputs. Here, the target is defined as the new environment/version that the upgrade is moving to. More on that in a future blog. The goal of the baseline is to make sure the reports work, for each of the test cases defined, before migration occurs.
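The eventual baseline-versus-target comparison can be as simple as checksumming the saved outputs from each environment. The sketch below assumes each test case's output was saved as a CSV file with the same name in a baseline directory and a target directory; the layout is my assumption, not a prescribed convention.

```python
# Illustrative sketch: compare saved baseline (source) outputs against target
# outputs by checksum. Directory layout and CSV format are assumptions.
import hashlib
from pathlib import Path

def checksum(path):
    """SHA-256 digest of a report output file."""
    return hashlib.sha256(Path(path).read_bytes()).hexdigest()

def compare_outputs(baseline_dir, target_dir):
    """Return baseline output files whose target counterpart is missing or differs."""
    mismatches = []
    for base_file in Path(baseline_dir).glob("*.csv"):
        target_file = Path(target_dir) / base_file.name
        if not target_file.exists() or checksum(base_file) != checksum(target_file):
            mismatches.append(base_file.name)
    return sorted(mismatches)
```

An exact-match comparison like this only works if the data is frozen, which is why the frozen-data question in the test plan matters: a report with relative-period logic will mismatch on timing alone.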
Manual or Automated Testing
How do you execute all those test cases? Depending on how many reports and test cases your plan includes, it might be possible to complete the testing with a manual approach. Scalability and reuse for regression testing are obvious advantages of automated testing tools. Regression testing is particularly valuable given the frequency of Cognos Analytics releases. The resources required to complete the testing also need to be taken into account. If a manual approach is used, how long will it take to execute the test plans? In contrast, if you use an automated tool, what is the resource load on the environment being tested, so that the production environment is not impacted? That concern goes away if you planned on using a sandbox environment and a staging/test database. Additionally, most automated test tools will not exercise JavaScript, as they only validate the data or the rendered output.
When looking at the timeline for testing both the source and the target, you need to consider the impact of change requests. In most environments, even with a solid change management process, it might be challenging to enforce a code freeze for the duration. Tracking changes so that retesting can occur is either a manual process, or some testing tools can track stale reports for you. It can be frustrating to test a source report that has changed but not been migrated to the target environment, or to compare the output of a report that has since changed.
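Tracking staleness can come down to comparing each report's last-modified timestamp against the moment the baseline was captured. The helper below is an illustration of that idea under my own assumed inventory format, not a feature of any particular tool.

```python
# Illustrative sketch: flag reports modified after the baseline was captured,
# so they can be queued for re-baselining. Inventory format is an assumption.
from datetime import datetime

def stale_reports(inventory, baseline_captured_at):
    """inventory: list of (path, last_modified datetime) pairs."""
    return [path for path, modified in inventory
            if modified > baseline_captured_at]

baseline_time = datetime(2023, 6, 1)
inventory = [
    ("/Public/Sales/Q1 Revenue", datetime(2023, 5, 20)),
    ("/Public/Sales/Pipeline", datetime(2023, 6, 15)),
]
print(stale_reports(inventory, baseline_time))
```

Any report this flags needs its baseline output regenerated before the target comparison, or the mismatch will reflect the change rather than the upgrade.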
Knowing what and how to test before migration is important. Fix, delete, or archive reports in the source environment before migration, then generate the output of those test cases for use in the validation phase.
More on what to expect during a migration in a future blog, stay tuned.