“I concentrate on the front-end changes – I don’t deal with the back-end stuff”
There is a misconception that people working solely with the front-end don’t need to worry about how the back-end works.
Maximo is very powerful and it allows developers/administrators to create potentially costly database queries/operations.
In this mini-series I will talk about why SQL tuning is important and what the support team needs to know about optimising SQL.
Here are some examples where purely front-end changes have caused performance problems:
Example – The escalation that ran every minute and caused major performance problems
Escalations are a particular concern to experienced support staff because they are designed to query the database regularly and it is possible to write inefficient SQL.
A single escalation scheduled to run every minute returned billions of rows over a 45-minute period and delayed other users from doing their work. This caused significant problems for the organisation, and reworking the escalation provided immediate benefit.
Had the SQL been analysed before it went live, it would have been possible to identify the impact and change the conditions so that fewer rows were returned.
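One common way a query balloons to billions of rows is a join with a missing or incorrect join condition, which produces a Cartesian product. The sketch below is hypothetical (the table names are invented and this is not Maximo's escalation engine), but it shows how quickly the row count multiplies:

```python
import sqlite3

# Hypothetical sketch: two small tables joined WITHOUT a join condition
# return the Cartesian product, so row counts multiply instead of matching.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE workorder (wonum TEXT);
CREATE TABLE asset (assetnum TEXT);
INSERT INTO workorder VALUES ('WO1'), ('WO2'), ('WO3');
INSERT INTO asset VALUES ('A1'), ('A2'), ('A3');
""")

# Missing join condition: 3 x 3 = 9 rows instead of 3.
rows = conn.execute("SELECT * FROM workorder, asset").fetchall()
print(len(rows))  # 9
```

With a million rows in each table the same mistake would return 10^12 rows, which is why analysing the SQL before go-live matters so much.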
Example – Users searched on attributes that weren’t indexed
Attributes were added to the workorder table and then users were advised to search on the attributes.
The aim was to speed up the users’ processes, but the attributes weren’t indexed, so the searches generated slow full table scans on the workorder table.
When a full table scan occurs, the scanned data fills the database’s cache, and other queries are delayed as their data is moved in and out of the cache.
This can also happen with saved queries and start centre queries.
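The effect of a missing index shows up directly in the database's query plan. Here is a minimal sketch using SQLite (the table and column names are invented for illustration; the same principle applies to the databases Maximo runs on):

```python
import sqlite3

# Sketch: compare the query plan for a search on an unindexed column
# with the plan after an index is added.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE workorder (wonum TEXT, custom_attr TEXT)")

def plan(sql):
    """Return the database's plan description for a statement."""
    return conn.execute("EXPLAIN QUERY PLAN " + sql).fetchone()[3]

query = "SELECT wonum FROM workorder WHERE custom_attr = 'ABC'"
before = plan(query)   # full table scan: every row must be read
conn.execute("CREATE INDEX workorder_custom_attr ON workorder (custom_attr)")
after = plan(query)    # index search: only matching rows are read

print(before)  # e.g. "SCAN workorder"
print(after)   # e.g. "SEARCH workorder USING INDEX workorder_custom_attr ..."
```

Checking the plan of a new search before releasing it to users is a cheap way to catch a full table scan early.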
Example – report limits
Reports that use Maximo’s built-in where clause can have a limit controlling the number of records allowed when the user initiates the report from the list screen. If the limit is too high, the report could use a lot of memory and cause other problems.
It is important to ensure that limits are in place in case a user accidentally tries to run a report against a large number of records e.g. all the workorders in their list.
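The idea behind the limit can be sketched as a simple guard. This is a generic illustration, not Maximo's actual configuration (Maximo sets record limits through report administration), and the threshold value is invented:

```python
# Generic sketch of a report record limit; the threshold is hypothetical.
MAX_REPORT_RECORDS = 200

def can_run_report(selected_record_count):
    """Refuse to launch the report when the selection is too large."""
    return selected_record_count <= MAX_REPORT_RECORDS

print(can_run_report(50))      # a normal selection is allowed
print(can_run_report(100000))  # "all workorders in the list" is refused
```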
Example – storing data into a single field when it should be in separate fields
This example is from the days before I worked on Maximo systems but it is possible that a similar problem could be created in a Maximo system.
An old manual process meant that inventory data was stored in an old system in a single field e.g. “2xHicks 60W light bulbs”. A single field was created in the application that the data was being imported into.
This had several implications:
- When new combinations were created (e.g. 4xHicks 60W light bulb) additional entries needed to be created.
- When users wanted to find out how many Hicks 60W light bulbs were in stock they needed to run inefficient SQL to break up the value and calculate the totals
- As new values were created there was an ever greater risk of typos that would hide values from searches
When designing a change it is essential to understand how the data is going to be retrieved and used.
In this case it would have been better to break up the value into two fields:
- Quantity – a number e.g. 4
- Item name – text e.g. Hicks 60W light bulb
It would have taken a bit more effort for the data loader to split the data, but the short-term cost would have been small in comparison to the long-term cost of bringing in consultants to deal with new values.
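The split the data loader needed is straightforward. Here is a sketch assuming the "NxName" format from the example above (the function name is my own):

```python
import re

# Sketch of what the data loader could have done: split the legacy
# combined value into a numeric quantity and an item name.
def split_inventory_value(value):
    match = re.match(r"(\d+)x(.+)", value)
    if not match:
        raise ValueError(f"unexpected format: {value!r}")
    return int(match.group(1)), match.group(2).strip()

qty, item = split_inventory_value("2xHicks 60W light bulbs")
print(qty, item)  # 2 Hicks 60W light bulbs
```

Once quantity and item name are separate fields, stock totals become a simple indexed query instead of string surgery.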
Advice
Ensure that the design/release process considers:
- How frequently the new/changed functionality will be used
- Will the data need to be used as a condition in a search either by users or automated processes e.g. reports/joins in SQL queries?
- What database resources will be used when the query is executed?
- How to thoroughly test the solution and check the logs for evidence of slow SQL/big result sets
- What the DBAs can do to monitor the database for any signs of problems e.g. slow SQL statements
#AssetandFacilitiesManagement #Maximo