IBM webMethods Hybrid Integration


 Is audit logging resource intensive, and how do I poll the audit DB?

John Hawkins IBM Champion posted Wed August 27, 2025 10:43 AM

Hi Folks,

I need to track parts of a document, e.g. JUST the orderId and customerId. Audit logging looks like the right way to go. However, how resource intensive is it for a busy service? Is there a better/less intensive way I could track documents without using it?

Part two of this question is around retrieving the events from the audit DB. I can see the events in the DB but it looks like the data is a blob. How can I retrieve that data? I'm currently subscribing to events using the events:addSubscriber service - but I'd rather poll the DB if I could. Has anyone ever read the audit DB directly?

Many thanks for your help!

John Carter

Are you saving the pipeline or using logged fields? If you save the pipeline then it will be stored as a blob. This is both costly and of no use to you, as you cannot use it for tracking.
If you use logged fields then it will simply save a key-value pair for each attribute you checked. The cost is minimal and you can use this table to find the transactions that you want.

Also, don't forget the service setContextID, which allows you to set an explicit business identifier for your transaction and all subsequent steps.
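
In case a concrete example helps: below is a minimal sketch of setting that business identifier from a Java service. It assumes the built-in service John refers to is pub.flow:setCustomContextID with a single string input named "id" - check the built-in services reference for your Integration Server version before relying on that.

    import com.wm.data.*;
    import com.wm.app.b2b.server.Service;
    import com.wm.app.b2b.server.ServiceException;

    public final class ContextTagger {

        // Java service body: tag the current transaction with the orderId from the pipeline
        public static final void tagTransaction(IData pipeline) throws ServiceException {
            IDataCursor pc = pipeline.getCursor();
            String orderId = IDataUtil.getString(pc, "orderId");
            pc.destroy();

            // Build the input document; "id" is assumed to be the context-ID service's input name
            IData input = IDataFactory.create();
            IDataCursor ic = input.getCursor();
            IDataUtil.put(ic, "id", orderId);
            ic.destroy();

            try {
                // Attach the identifier to this invocation and all subsequent child invocations
                Service.doInvoke("pub.flow", "setCustomContextID", input);
            } catch (Exception e) {
                throw new ServiceException(e);
            }
        }
    }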

Rupinder Singh

You have kind of answered your own questions. Audit DB isn't ideally suited for a production setup where you want to track data from a high volume call. It has never performed really well for the reasons you mentioned. There are a couple of alternatives:

1. Write your own service to write to a store of your choice. You can call the service asynchronously or, if you need throttling, you can publish a doc and then call the service from the trigger (there is a rough sketch of this after the list below).

2. Modern systems do this using OpenTelemetry traces and metrics. So if you have an APM tool, then you can go that route. I have done that at some customers using our OpenTelemetry package called Otelscope. But that is only relevant if you have an APM tool or are planning to use one.
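
To make option 1 concrete, here is a rough sketch of a custom tracking writer that stores only the identifiers you care about. The TRACKING_EVENTS table, its columns and the use of DriverManager are assumptions for illustration only; inside Integration Server you would more likely reuse a JDBC adapter connection or a pooled DataSource, and invoke this from the trigger service.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;

    // Minimal custom tracking store: one narrow row per transaction instead of a pipeline blob.
    public class TrackingWriter {

        private static final String INSERT_SQL =
            "INSERT INTO TRACKING_EVENTS (ORDER_ID, CUSTOMER_ID, LOGGED_AT) "
          + "VALUES (?, ?, CURRENT_TIMESTAMP)";

        public static void track(String jdbcUrl, String user, String password,
                                 String orderId, String customerId) throws Exception {
            try (Connection con = DriverManager.getConnection(jdbcUrl, user, password);
                 PreparedStatement ps = con.prepareStatement(INSERT_SQL)) {
                ps.setString(1, orderId);
                ps.setString(2, customerId);
                ps.executeUpdate();
            }
        }
    }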

 

John Carter

@Rupinder Singh is of course correct. If you simply need to be able to filter or perform some lightweight reporting, my solution will work. For large volumes or more complex tracking it is better to use the proper tooling, but of course that is a major investment and maybe beyond the scope of your project.

John Hawkins IBM Champion

Thanks guys - perfect answers :-) I don't need the pipeline so it sounds like auditing should be fine - thanks for your help!

John Hawkins IBM Champion

Hi again - spoke too soon :-( I've stopped saving the pipeline, which was where I was getting my audit data from. Which table are the logged fields stored in?

Thanks again!

Dave Laycock

You can see the logged field values in the WMSERVICECUSTOMFLDS table.

To get the service context, you can join that table to WMSERVICE using the CONTEXTID columns in each table. Note that WMSERVICE gets two rows per service invocation. You need to filter the STATUS column to narrow it down to just the service start or end rows.
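
For anyone who wants to poll the audit DB directly rather than subscribe to events, here is a rough sketch of that join from a JDBC client. The key/value column names (STRINGKEY/STRINGVALUE) and the idea that STATUS is a numeric code are assumptions on my part - verify them against the ISCoreAudit schema for your Integration Server version.

    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import java.sql.SQLException;

    // Reads logged-field values plus the owning service invocation from the audit DB.
    public class AuditFieldReader {

        private static final String QUERY =
            "SELECT s.CONTEXTID, f.STRINGKEY, f.STRINGVALUE "
          + "FROM WMSERVICECUSTOMFLDS f "
          + "JOIN WMSERVICE s ON s.CONTEXTID = f.CONTEXTID "
          // WMSERVICE has two rows per invocation; filter STATUS to keep only one of them
          + "WHERE s.STATUS = ? AND f.STRINGKEY = ?";

        public static void printValues(Connection con, int status, String fieldName) throws SQLException {
            try (PreparedStatement ps = con.prepareStatement(QUERY)) {
                ps.setInt(1, status);
                ps.setString(2, fieldName);
                try (ResultSet rs = ps.executeQuery()) {
                    while (rs.next()) {
                        System.out.printf("%s %s=%s%n",
                            rs.getString("CONTEXTID"), rs.getString("STRINGKEY"), rs.getString("STRINGVALUE"));
                    }
                }
            }
        }
    }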

John Hawkins IBM Champion

Sorry about this - I'm just rounding out on this...

Is the package name in the DB anywhere? I've looked everywhere but can't see it. I get the impression that wM gets it from the runtime context rather than the audit logs?

Many thanks!

john.