TRIRIGA

  • 1.  How much testing before rollout?

    Posted Tue May 08, 2018 01:09 PM

    I’m curious if anyone knows how much testing is done before fix packs and the rollout of new modules. As an end-user, it can feel like we are continually stumping the experts!

     

    This question was posted anonymously on behalf of a community member who is unable to disclose their company name and sector. I will email the member directly to notify them of responses.



  • 2.  RE: How much testing before rollout?

    Posted Tue May 08, 2018 05:35 PM

    My experience (over the course of four decades) is that testing varies considerably in both quality and quantity across organizations.  Increasingly, I see a lot of testing effort, as well as large test teams, accomplishing only modest results.  The separation of the ownership of the testing from the responsibility for the code has not always been beneficial to organizations.  One of the goals of DevOps has been to break this dysfunctional cycle. 

    For some context, I used to run a software development company.  We developed code at much faster rates (400-500 lines per day) than is currently typical (we were agile before it was cool) and, as a company, we had a near-zero defect rate for programming/logic errors.  My expectation was that a master programmer should produce almost no coding errors, and that most of our defects would be due to inaccuracies in, or misunderstandings of, the requirement and design documents.  These, in fact, were our results.  The quality was due to very high levels of testing performed by the developers themselves (unit testing) within a culture of high expectations.  

    I think that testing results primarily reflect organizational culture and expectations.  For your enjoyment, I have attached a link to an image that I think you'll either laugh or cry over.  

    The World's Most Interesting Man. 

     

    Regards,

    Glen Brumbaugh



  • 3.  RE: How much testing before rollout?

    Posted Wed May 09, 2018 08:58 AM

    This should be based on the percentage of requirements satisfied by the test case suite, as developed and verified with the customer during a problem tracking and review process prior to each delivery. For life-critical software, 99.9% should be the goal. 
     http://sceweb.sce.uhcl.edu/helm/ROLE-Tester/  This link is courtesy of the IBM SEED Program!  
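    As a rough illustration of the requirements-coverage idea above, here is a minimal sketch of computing the percentage of requirements satisfied by a test case suite. The requirement IDs, test case names, and pass/fail results are hypothetical placeholders, not from any real delivery.

```python
# Sketch: what percentage of requirements does the test suite satisfy?
# All requirement IDs and test cases below are hypothetical examples.

requirements = {"REQ-001", "REQ-002", "REQ-003", "REQ-004"}

# Each test case lists the requirement(s) it verifies, plus its last result.
test_cases = [
    {"id": "TC-01", "covers": {"REQ-001"}, "passed": True},
    {"id": "TC-02", "covers": {"REQ-002", "REQ-003"}, "passed": True},
    {"id": "TC-03", "covers": {"REQ-004"}, "passed": False},
]

# A requirement counts as satisfied only if a passing test case covers it.
satisfied = set()
for tc in test_cases:
    if tc["passed"]:
        satisfied |= tc["covers"]

coverage = 100.0 * len(satisfied) / len(requirements)
print(f"Requirements satisfied: {coverage:.1f}%")  # 3 of 4 -> 75.0%
```

    Tracking this number before each delivery makes the 99.9% goal for life-critical software something you can actually measure rather than assert.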

     

     Software Engineering for Educational Development (SEED).  I was one of the first SEED members.

    http://www.embeddedstar.com/press/content/2002/8/embedded4864.html

       



  • 4.  RE: How much testing before rollout?

    Posted Wed May 09, 2018 10:20 AM

    Hi Kristen and your colleague,

    Here is our experience for IBM Maximo testing with our clients. 

    Where we start: we have found it really useful to have a living document, written in business language, that describes the core business-critical functionality - both base functionality and configuration (and definitely any customization!).  Set this up in Excel by functional area, with a lead user + Maximo analyst who own it.  This is the starting point for defining your test cases and focusing your testing (and a reference for training materials).  
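    The "living document" above is just a spreadsheet, but its structure can be sketched in code to show the shape: functional areas, each owned by a lead user plus a Maximo analyst, with the test cases derived from them. All names and areas below are hypothetical placeholders, not client data.

```python
# Sketch of the "living document": business-critical functionality listed
# by functional area, each owned by a lead user plus a Maximo analyst.
# All names and areas below are hypothetical placeholders.
from dataclasses import dataclass, field

@dataclass
class FunctionalArea:
    name: str
    lead_user: str   # business owner of the area
    analyst: str     # Maximo analyst co-owner
    test_cases: list = field(default_factory=list)  # derived test cases

areas = [
    FunctionalArea("Work Orders", "A. Lead", "B. Analyst",
                   ["Create WO", "Approve WO", "Close WO"]),
    FunctionalArea("Purchasing", "C. Lead", "D. Analyst",
                   ["Raise PR", "Convert PR to PO"]),
]

# The document doubles as a testing checklist and a training reference.
for area in areas:
    print(f"{area.name} ({area.lead_user}/{area.analyst}): "
          f"{len(area.test_cases)} test cases")
```

    Keeping ownership (lead user + analyst) attached to each area is the design choice that keeps the document "living" rather than going stale after the first upgrade.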

    For fix packs - the Maximo analyst should read the release notes to understand what is changing or being introduced.  We have seen cases with clients where the 'fix' changed functionality that was actually working fine for the company - the 'fix' produced different results.  User testing effort is definitely lower for fix packs; focus on what is changing or new.  

    For upgrades - the Maximo analyst and business lead user should read the upgrade notes.  It has worked really well with our clients to use the upgrade as an opportunity for 'what's new' or refresher training.  For testing, test cases should cover the end-to-end business processes.  Make sure they also cover user permissions, data restrictions, performance, screen UI layouts, and data QA (all of which usually come through 100% clean - but auditors often ask whether these were verified during a system upgrade).  In terms of effort, expect each functional area business lead user (not IT!) to spend 1-2 days on initial testing after the first pass of the upgrade, then another 2-5 days of detailed testing after the second pass to get comfortable with the new version, identify any glitches, and note the changes in UI or functionality that should be included in the refresher/'what's new' training.  

    Hope this helps.  Feel free to post with any specific questions or anything I have missed,

    Richard



  • 5.  RE: How much testing before rollout?

    Posted Sun May 13, 2018 05:24 PM

    Hi Kristen and all,
    I'm not a full-time professional tester, but I have run Maximo test programs in the past.

    Since IBM are moving away from big bang major releases to minor upgrades and fix pack releases, I'd like to add that your testing effort will be affected by the number of customisations added to your Maximo. With fix packs and upgrade packages, some thought should be given to the possibility that the IBM-supplied changes may prompt the removal of one or more of your customisations.

    In the test scenarios, you should obviously focus on the core processes and retest them. How deep you go will depend on the risk of rework from issues users might raise as a result of minimal testing.

    Your users should also seek to create re-usable test scripts.
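    To make the re-usable idea concrete, here is a minimal sketch of a test script that can be re-run unchanged after every fix pack or minor upgrade. The process steps and checks are hypothetical placeholders (real scripts would exercise actual Maximo screens or APIs), so treat this as a shape, not an implementation.

```python
# Sketch of a re-usable test script: define a core process once, re-run it
# after every fix pack or minor upgrade. The steps below are hypothetical
# placeholders, not real Maximo calls.

def run_process(steps):
    """Run each (name, check) step in order; collect (name, ok) results."""
    results = []
    for name, check in steps:
        try:
            ok = bool(check())
        except Exception:
            ok = False  # a crashing step counts as a failure, not an abort
        results.append((name, ok))
    return results

# Reusable definition of one core business process - edit once, reuse forever.
work_order_process = [
    ("create work order", lambda: True),   # placeholder checks
    ("approve work order", lambda: True),
    ("close work order", lambda: True),
]

results = run_process(work_order_process)
assert all(ok for _, ok in results), results
print("core process re-test passed")
```

    The payoff is exactly the point made above: the script is written during the big upgrade project, when you have user time, and then costs almost nothing to re-run for each minor fix pack.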

    Finally, what is your business/IT policy regarding IT testing vs user testing of Maximo fix packs and minor upgrades?

    Do you actually have the organisational bandwidth to obtain the time of key users for UAT? It's usual to campaign for, and succeed in getting, users when an upgrade project is established, but not so for minor upgrades.

    Hope these thoughts are helpful.

    Geoff