IBM QRadar


Tracking Offense Metrics in QRadar

By Cristian Ruvalcaba posted Wed February 08, 2023 01:33 PM

  

Hello Community!

As far back as I can remember, QRadar has had a rudimentary but fairly solid 'case management' capability that allowed for tracking a few things:

  • Unique ticket number per case: Offense Number
  • Case ownership: Assigning Offenses
  • Activity tracking: Notes
  • Compiled set of relevant data: Events and Flows, Categories, and more - all available in a summary page
  • Case workflow: Offense Summary for investigation


In fact, the resource is strong enough that when I was using QRadar as part of my daily investigative efforts, I would open tickets in other tools only when action was required by other teams who had no access to QRadar... all while I kept my notes in QRadar's 'Notes' section.

And still, there was one thing missing: decent metrics capabilities around all of this. That has been the case since the late aughts, and as technologies have evolved and tools like SOAR have been introduced, some of which provide these metrics natively, QRadar has continued to chug along without native metrics capabilities. QRadar's approach to offense naming and name chaining is likely just one of the reasons why this functionality would be difficult to implement.

I was recently asked about the ability to search for offenses and filter them down based on the Offense Description. This is important, and it can be complex: the API available through Pulse for widget visualization does not make even this task all that simple.
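
For reference, a filtered offense search can also be approximated directly against the REST API's /api/siem/offenses endpoint. The sketch below is a rough, unsupported example; the console address, the token, and the exact filter operators (ILIKE in particular) are assumptions to verify against the API documentation for your QRadar version.

# Hypothetical example: list open offenses whose description contains 'Firewall'
# <console-ip> and <api-token> are placeholders.
curl -s -k -G "https://<console-ip>/api/siem/offenses" \
  -H "SEC: <api-token>" \
  -H "Accept: application/json" \
  --data-urlencode 'filter=status = "OPEN" and description ILIKE "%Firewall%"' \
  --data-urlencode 'fields=id,description,status,assigned_to'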

Once a use-case finds its way onto my radar, I sometimes figure out one piece of the puzzle and then can't stop thinking about the final image... I think to myself about the possible level of effort it would take to execute, and... given my motivation in the moment, I might just start hacking away to make something work.

I broke this effort down into a few different stages:

  1. Find where the relevant data lives and find a way to get it to the event processing within QRadar
  2. Create a custom DSM to extract the necessary information from the payloads I created
  3. Create the log source that uses the DSM
  4. Create visualizations

While I defined these stages, there are still some sub-tasks within each of them... which is where the real magic happens.

As a quick note, and I will reiterate this at the end of this blog post: the material in this post, the approach to what I am doing with this, all of this material... IS NOT SUPPORTED. This is an effort that is NOT supported by IBM Support and there may not be updates to any of the below moving forward... BUT it is possible to take this material and adjust it to meet your needs.

As another quick note, and I will reiterate this at the end of this blog post:
this approach only works on 'on-prem' implementations of QRadar SIEM, which includes virtual instances both locally, and on IaaS providers. This will NOT work for QRoC.



Stage 1: Finding where the data lives

I've been lucky enough to spend years working with QRadar and troubleshooting hundreds of situations... some of that has included understanding the back-end directory structure as well as the relevant data stores. With that, I spent some more time digging through to find the exact data elements I would use for some high-level metrics, and once I found them... it was about building the requisite query to pull the data in the way I needed it presented.
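
For anyone retracing these steps, the console's embedded PostgreSQL instance can be explored directly to see which columns are available before writing any query. This is just a read-only look around and, like everything else here, not something IBM Support endorses; it assumes the same qradar database user that the script below uses.

# List the columns of the two tables used later in this post
psql -U qradar -c "\d offense"
psql -U qradar -c "\d offense_properties"

# Peek at a few rows to confirm the data looks usable
psql -A -U qradar -c "select id, event_count, flow_count, naming_contributions from offense order by id desc limit 5"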

Once I figured that out... it was a question of scripting it in order to get the data into the event processing pipeline. Luckily, I've been building custom scripts to generate data and send it to the QRadar event pipeline for a while, so it was just a matter of making it work for this specific example... and while most of my data-generating scripts are hundreds of lines long, with if/then and elif/then conditions all over the place... this one was much simpler.

Below, you can find the latest version of the script I put together for querying for the appropriate information and sending to the pipeline for processing:

#!/bin/bash
#Version: 1.1
#Owner: Cristian Ruvalcaba
#Log Model: Custom Log Source - Offenses

#This line below sets up the variables used in the script: buffer file, and EPS at which to send logs into the event pipeline.
tmpFile="/store/scripts/offense_buffer" # Ensure a buffer file exists
EPS=100 # Set EPS limit for log source


## This line below queries the DB and adds it all to a temporary file.
psql -A -U qradar -c "select now(),
'data_source=offense_detail.saluca.net',
concat('id=',a.id),
concat('start=',to_timestamp(a.start_time/1000)) as startTime,
concat('offenseCreate=',to_timestamp(a.first_persisted_time/1000)) as offense_created,
concat('lastevent=',to_timestamp(a.last_persisted_time/1000)) as LatestEvent,
concat('source=',a.source),
concat('eventcount=',a.event_count),
concat('flowcount=',a.flow_count),
concat('offense_name=',a.naming_contributions) as rules,
concat('categories=',a.category_naming_contributions),
concat('assignee=',b.username),
concat('assigntime=',to_timestamp(b.last_persisted_time/1000)) as AssignTime,
concat('v=',b.version) as Version,
concat('timetoassign=',concat(floor((b.last_persisted_time - a.first_persisted_time)/1000/86400),'d',floor(mod((b.last_persisted_time - a.first_persisted_time)/1000,86400)/3600),':',mod(((b.last_persisted_time - a.first_persisted_time)/60000),60),':',MOD((b.last_persisted_time - a.first_persisted_time)/1000,60))) as timetoassign,
concat('secondstoassign=',(b.last_persisted_time - a.first_persisted_time)/1000) as secondsToAssign,
concat('timetoclose=',concat(floor((b.closed_date - a.first_persisted_time/1000)/86400),'d',floor(mod((b.closed_date - (a.first_persisted_time/1000)),86400)/3600),':',mod(((b.closed_date - (a.first_persisted_time)/1000))/60,60),':',MOD((b.closed_date - (a.first_persisted_time)/1000),60))) as timetoclose,
concat('secondstoclose=',(b.closed_date - (a.first_persisted_time)/1000)) as secondsToclose
from offense a join offense_properties b on (a.id = b.id) order by a.id desc" > $tmpFile

/opt/qradar/bin/logrun.pl -t -f $tmpFile $EPS # Send events to SIEM locally using TCP to prevent truncation

# Note: It is possible to generate the file and create a 'log file' model log source for this in lieu of sending via logrun. To do that, simply remove or comment out the preceding line, then follow the log file log source creation process.

The script queries the psql database across two tables to get the relevant data points needed for this effort. The query feeds a buffer file, which is then read by the logrun.pl script, and each line of that buffer file is sent to the event processing pipeline.

I then tested the first version of the script, and it got me what I needed... but I figured I could do more. I adjusted the query, adjusted the script, and kept at it until I got to the script above. Once that was done... it was a matter of running it in cron. My current script is a catch-all for offenses, but it is possible to adjust any of this to fit custom needs, including limiting the results to recently created offenses, recently assigned offenses, etc. (a sketch of one such filter is included at the end of this stage). Back to the script: I dropped it in my desired directory and created the appropriate cron entry:

*/15 * * * * /store/scripts/offense_logs.sh

Phew! It took a few hours to consider, design, write, and refine, but once it was done, I had offense details in the event processing pipeline. Note that my implementation is set to run every 15 minutes, and this also ties into the queries I define later in the Pulse dashboard.
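
As one example of that kind of tailoring, a WHERE clause can be slotted into the query between the JOIN and the ORDER BY to only pull offenses whose records were updated recently. This is just a hedged sketch: the 900-second window below is an illustration chosen to line up with the 15-minute cron interval, and last_persisted_time is stored in milliseconds, hence the multiplication.

from offense a join offense_properties b on (a.id = b.id)
where a.last_persisted_time > (extract(epoch from now()) - 900) * 1000
order by a.id desc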

Stage 2: Creating a custom DSM

There are dozens of blog entries relating to DSMs, dozens of videos online that discuss DSMs and customization, and a ton of material about how to build it all out... I've also built a few, so the primary challenge with this effort wasn't so much the parser-writing process as making sure everything was parsed for later use with AQL. So, what fields were necessary? They are all visible in the script above, but they are also listed below along with the parsing used (a quick way to sanity-check these patterns follows the list):

  • offense_id
    • Offense number
    • Parsing: id=(\d+)
    • Capture group: 1
  • offense_assign
    • Offense assigned timestamp
    • Parsing: assigntime=(\d{4}\-\d{2}\-\d{2}\s+\d+\:\d+\:\d+)
    • Capture group: 1
  • offense_assignee
    • Assigned analyst
    • Parsing: assignee=(\w+)
    • Capture group: 1
  • offense_categories
    • Categories of events contributing to the offense
    • Parsing: categories=(.*?)\|assignee
    • Capture group: 1
  • offense_create
    • Create timestamp for offense
    • Parsing: offenseCreate=(\d{4}\-\d{2}\-\d{2}\s+\d+\:\d+\:\d+)
    • Capture group: 1
  • offense_description
    • Naming contributions to the offense
    • Parsing: offense_name=(.*?)\|categories
    • Capture group: 1
  • offense_eventcount
    • Total events related to the offense
    • Parsing: eventcount=(\d+)
    • Capture group: 1
  • offense_flowcount
    • Total flows related to the offense
    • Parsing: flowcount=(\d+)
    • Capture group: 1
  • offense_lastevent
    • Most recent flow or event timestamp for offense
    • Parsing: lastevent=(\d{4}\-\d{2}\-\d{2}\s+\d+\:\d+\:\d+)
    • Capture group: 1
  • offense_seconds_to_assign
    • Total seconds between offense creation and latest assigning, or re-assigning, of offense
    • Parsing: secondstoassign=(\d+)
    • Capture group: 1
  • offense_seconds_to_close
    • Total seconds between offense creation and closing of offense
    • Parsing: secondstoclose=(\d+)
    • Capture group: 1
  • offense_time_to_assign
    • Structured human readable time to assign for offense
    • Parsing: timetoassign=(\d+d\d+\:\d+\:\d+)
    • Capture group: 1
  • offense_time_to_close
    • Structured human readable time to close for offense
    • Parsing: timetoclose=(\d+d\d+\:\d+\:\d+)
    • Capture group: 1
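
Before wiring these patterns into the DSM Editor, it can help to sanity-check them from the command line. The payload below is a hand-built, hypothetical line shaped like the pipe-delimited output of the psql query from Stage 1 (psql's -A mode separates columns with '|'); the grep -P calls mirror a few of the patterns above, rewritten with \K so grep prints only what the capture group would hold.

# Hypothetical sample line resembling one row of the buffer file
payload='2023-02-08 10:15:00-08|data_source=offense_detail.saluca.net|id=1042|start=2023-02-08 09:12:44-08|offenseCreate=2023-02-08 09:12:45-08|lastevent=2023-02-08 10:02:10-08|source=10.1.2.3|eventcount=184|flowcount=12|offense_name=Excessive Firewall Denies|categories=Firewall Deny|assignee=cruvalcaba|assigntime=2023-02-08 09:40:02-08|v=3|timetoassign=0d0:27:17|secondstoassign=1637|timetoclose=|secondstoclose='

# Test a few of the capture patterns (GNU grep with PCRE)
echo "$payload" | grep -oP 'id=\K\d+'                           # offense_id
echo "$payload" | grep -oP 'assignee=\K\w+'                     # offense_assignee
echo "$payload" | grep -oP 'offense_name=\K.*?(?=\|categories)' # offense_description
echo "$payload" | grep -oP 'secondstoassign=\K\d+'              # offense_seconds_to_assign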


That was it! Well, kind of... The steps for a custom DSM include not only the parsing, but also event mapping, and in this particular case, creating the custom log source type along the way. It is possible to map events to out-of-the-box QIDs or to create a custom QID. I decided on a custom QID. As this was a catch-all type of script and not nuanced enough to keep track of all changes, I simply created a single entry.
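
For completeness, the custom QID can be created from the console command line as well as through the UI. Below is a hedged sketch using the qidmap_cli.sh utility; the name, description, severity, and low-level category ID are example placeholders to adapt, and the flags are worth confirming against the QRadar documentation for your version.

# Create a single custom QID entry for the offense-detail events (values are examples)
/opt/qradar/bin/qidmap_cli.sh -c \
  --qname "Offense Detail Record" \
  --qdescription "Offense metrics record generated by the offense_logs.sh script" \
  --severity 1 \
  --lowlevelcategoryid <low-level-category-id>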

From there, it was a matter of creating the event mapping to the QID.


Stage 3: Creating the log source

I haven't figured everything out about the DSM editor... in particular, I have yet to fully understand the auto-discovery functionality... so instead, I manually created the log source and ensured a few items were configured:



Note that the screenshots above show a few items to heed:

  1. Extension is used
  2. Events are not coalesced
  3. Log source identifier is 127.0.0.1


Once the log source was created, I was able to deploy changes and, ultimately, to search for and view the incoming events. From there, I could pop open one of the events tied to the log source and manually map its payload to the QID I had created. Once that was working, the fields were easily discernible and the log source was parsing as necessary:

The payload is ready to view below.


The extracted fields are showcased below.




From there, it becomes about what matters... using the information.

Stage 4: Creating the visualizations

This is where the fun begins... or, well, continues. The goal is to define which visualizations make sense for metrics. Given the fields we are collecting per offense, we are able to track the current assignee, open or closed status, the description, and when the offense was created, assigned, and closed. All of this led to defining a few specific visualizations that can help in tracking performance (a sample AQL sketch for one of these follows the list below).

  • Average time to assign an offense
  • Average time to close an offense
  • Open offense assignment count per analyst
  • Most recent offenses and their assigned analysts
  • Top offense descriptions
  • Total open unassigned offenses
  • Total open assigned offenses
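
As an example of the shape these widget queries take, here is a sketch for the assignment count per analyst. It leans on the custom properties defined in Stage 2 and the 15-minute feed interval from Stage 1; whether it reflects only open offenses depends on how the feed is filtered, so treat it as a starting point rather than a finished widget.

select
  offense_assignee as 'Analyst',
  UNIQUECOUNT(offense_id) as 'Assigned Offenses'
from events
where logsourcename(logsourceid) ilike '%offense detail%'
  and offense_assignee is not null
group by offense_assignee
last 15 minutes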

While the list above is a starting point, drill-down functionality within Pulse also allows additional dashboards to be leveraged. I did NOT build the following out in Pulse; it was primarily about jotting down further ideas:

  • Analyst overview where all the visualizations relate to the analyst in question
    • Average time to close offenses
    • Average time to assign to this analyst
    • Top offense descriptions being investigated by the analyst
    • Widgets using other SIEM events, for example audit events
  • Offense description dashboard
    • Total offenses with that description
    • Average event count per offense
    • Average flow count per offense
    • Assigned open offense count per analyst for this description
    • Max event count for a related offense
    • Min event count for a related offense
    • Max flow count for a related offense
    • Min flow count for a related offense
    • Most recent offenses with this description
  • Offense overview
    • Offense ID - as a parameter
    • Time to assign
    • Time to close
    • Assigned analyst
    • Total events
    • Total flows
    • More details from the offenses querying using the ID as the parameter
      • Offense source
      • Offense source type


It took some attempts to figure out how to take the information and present it the right way in a dashboard... One of the AQL queries I put together is listed below; this is the one for the average time to assign. I'll admit, not only did I need to shut down all distractions like the music or video I would normally have going in the background, but it still took me over an hour to figure this one query out:

select

SUBSTRING(STR(AVG(offense_seconds_to_assign)/86400), 0, STRPOS(STR(AVG(offense_seconds_to_assign)/86400), '.')) as 'Days',
SUBSTRING(STR((AVG(offense_seconds_to_assign)/86400 - double(SUBSTRING(STR(AVG(offense_seconds_to_assign)/86400), 0, STRPOS(STR(AVG(offense_seconds_to_assign)/86400), '.'))))*86400/3600), 0, STRPOS(STR((AVG(offense_seconds_to_assign)/86400 - double(SUBSTRING(STR(AVG(offense_seconds_to_assign)/86400), 0, STRPOS(STR(AVG(offense_seconds_to_assign)/86400),'.'))))*86400/3600), '.')) as 'Hours' ,
substring(
(double((AVG(offense_seconds_to_assign)/86400 - double(SUBSTRING(STR(AVG(offense_seconds_to_assign)/86400), 0, STRPOS(STR(AVG(offense_seconds_to_assign)/86400), '.'))))*86400/3600) - double(SUBSTRING(STR((AVG(offense_seconds_to_assign)/86400 - double(SUBSTRING(STR(AVG(offense_seconds_to_assign)/86400), 0, STRPOS(STR(AVG(offense_seconds_to_assign)/86400), '.'))))*86400/3600), 0, STRPOS(STR((AVG(offense_seconds_to_assign)/86400 - double(SUBSTRING(STR(AVG(offense_seconds_to_assign)/86400), 0, STRPOS(STR(AVG(offense_seconds_to_assign)/86400),'.'))))*86400/3600), '.'))))*60, 0, 2) as 'Minutes',
substring((((double((AVG(offense_seconds_to_assign)/86400 - double(SUBSTRING(STR(AVG(offense_seconds_to_assign)/86400), 0, STRPOS(STR(AVG(offense_seconds_to_assign)/86400), '.'))))*86400/3600) - double(SUBSTRING(STR((AVG(offense_seconds_to_assign)/86400 - double(SUBSTRING(STR(AVG(offense_seconds_to_assign)/86400), 0, STRPOS(STR(AVG(offense_seconds_to_assign)/86400), '.'))))*86400/3600), 0, STRPOS(STR((AVG(offense_seconds_to_assign)/86400 - double(SUBSTRING(STR(AVG(offense_seconds_to_assign)/86400), 0, STRPOS(STR(AVG(offense_seconds_to_assign)/86400),'.'))))*86400/3600), '.'))))*60)-double(substring(
(double((AVG(offense_seconds_to_assign)/86400 - double(SUBSTRING(STR(AVG(offense_seconds_to_assign)/86400), 0, STRPOS(STR(AVG(offense_seconds_to_assign)/86400), '.'))))*86400/3600) - double(SUBSTRING(STR((AVG(offense_seconds_to_assign)/86400 - double(SUBSTRING(STR(AVG(offense_seconds_to_assign)/86400), 0, STRPOS(STR(AVG(offense_seconds_to_assign)/86400), '.'))))*86400/3600), 0, STRPOS(STR((AVG(offense_seconds_to_assign)/86400 - double(SUBSTRING(STR(AVG(offense_seconds_to_assign)/86400), 0, STRPOS(STR(AVG(offense_seconds_to_assign)/86400),'.'))))*86400/3600), '.'))))*60, 0, 2)))*60,0,2) as 'Seconds',

concat(
SUBSTRING(STR(AVG(offense_seconds_to_assign)/86400), 0, STRPOS(STR(AVG(offense_seconds_to_assign)/86400), '.')), ' Days ',
SUBSTRING(STR((AVG(offense_seconds_to_assign)/86400 - double(SUBSTRING(STR(AVG(offense_seconds_to_assign)/86400), 0, STRPOS(STR(AVG(offense_seconds_to_assign)/86400), '.'))))*86400/3600), 0, STRPOS(STR((AVG(offense_seconds_to_assign)/86400 - double(SUBSTRING(STR(AVG(offense_seconds_to_assign)/86400), 0, STRPOS(STR(AVG(offense_seconds_to_assign)/86400),'.'))))*86400/3600), '.')), ' Hours ',
substring(
(double((AVG(offense_seconds_to_assign)/86400 - double(SUBSTRING(STR(AVG(offense_seconds_to_assign)/86400), 0, STRPOS(STR(AVG(offense_seconds_to_assign)/86400), '.'))))*86400/3600) - double(SUBSTRING(STR((AVG(offense_seconds_to_assign)/86400 - double(SUBSTRING(STR(AVG(offense_seconds_to_assign)/86400), 0, STRPOS(STR(AVG(offense_seconds_to_assign)/86400), '.'))))*86400/3600), 0, STRPOS(STR((AVG(offense_seconds_to_assign)/86400 - double(SUBSTRING(STR(AVG(offense_seconds_to_assign)/86400), 0, STRPOS(STR(AVG(offense_seconds_to_assign)/86400),'.'))))*86400/3600), '.'))))*60, 0, 2), ' Minutes ',
substring((((double((AVG(offense_seconds_to_assign)/86400 - double(SUBSTRING(STR(AVG(offense_seconds_to_assign)/86400), 0, STRPOS(STR(AVG(offense_seconds_to_assign)/86400), '.'))))*86400/3600) - double(SUBSTRING(STR((AVG(offense_seconds_to_assign)/86400 - double(SUBSTRING(STR(AVG(offense_seconds_to_assign)/86400), 0, STRPOS(STR(AVG(offense_seconds_to_assign)/86400), '.'))))*86400/3600), 0, STRPOS(STR((AVG(offense_seconds_to_assign)/86400 - double(SUBSTRING(STR(AVG(offense_seconds_to_assign)/86400), 0, STRPOS(STR(AVG(offense_seconds_to_assign)/86400),'.'))))*86400/3600), '.'))))*60)-double(substring(
(double((AVG(offense_seconds_to_assign)/86400 - double(SUBSTRING(STR(AVG(offense_seconds_to_assign)/86400), 0, STRPOS(STR(AVG(offense_seconds_to_assign)/86400), '.'))))*86400/3600) - double(SUBSTRING(STR((AVG(offense_seconds_to_assign)/86400 - double(SUBSTRING(STR(AVG(offense_seconds_to_assign)/86400), 0, STRPOS(STR(AVG(offense_seconds_to_assign)/86400), '.'))))*86400/3600), 0, STRPOS(STR((AVG(offense_seconds_to_assign)/86400 - double(SUBSTRING(STR(AVG(offense_seconds_to_assign)/86400), 0, STRPOS(STR(AVG(offense_seconds_to_assign)/86400),'.'))))*86400/3600), '.'))))*60, 0, 2)))*60,0,2), ' Seconds '
) as "Time To Assign"

from events where logsourcename(logsourceid) ilike '%offense detail%' and offense_seconds_to_assign > '0' and offense_assignee is not null last 15 minutes
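
For comparison, if a plain decimal number is acceptable instead of the days/hours/minutes/seconds breakdown above, a much shorter query gets the same underlying average. This is a hedged sketch using the same custom properties and log source filter:

select
  AVG(offense_seconds_to_assign) as 'Avg Seconds To Assign',
  AVG(offense_seconds_to_assign)/3600 as 'Avg Hours To Assign'
from events
where logsourcename(logsourceid) ilike '%offense detail%'
  and offense_seconds_to_assign > 0
  and offense_assignee is not null
last 15 minutes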


Once that query was out of the way, I was able to start building more of the visualizations, ultimately landing at the dashboard shown below:


Now that this is out of the way, the next steps are those additional dashboards. Again, I haven't built them out as of publishing this post.

Using the dashboard:

The first thing a shift lead might look at would be the total unassigned offenses, in order to start assigning those... and instead of having to look for the little humanoid figure on the offense list, it's a matter of just glancing at the dashboard; if there are no unassigned offenses, there's no need to jump in and assign anything first thing, for example.

All in all, this is a useful approach for anyone who is limited in the tools available to them. If it's only QRadar SIEM, and there's a willingness to explore just how flexible the platform is, even if it's not an officially supported approach, then it's possible to tackle incident response metrics in this way.

As a quick note, and to reiterate what I said at the start of this blog post: the material in this post, the approach to what I am doing with this, all of this material... IS NOT SUPPORTED. This is an effort that is NOT supported by IBM Support and there may not be updates to any of the above moving forward... BUT it is possible to take this material and adjust it to meet your needs.

As another quick note, and also to reiterate: this approach only works on 'on-prem' implementations of QRadar SIEM, which includes virtual instances both locally and on IaaS providers. This will NOT work for QRoC.


Downloads: DSM and Dashboard.

#Highlights-home
#Highlights

9 comments


Comments

Hi Cristian. How can I add more columns, like domain_id, in psql?

Tue April 02, 2024 03:43 PM

I already created the Log Source, but no events are coming in with this Log Source.

Wed March 27, 2024 12:40 PM

Alright everyone- here's the download. Sorry about the confusion! The link in the blog is taken care of as well.

Mon March 25, 2024 08:22 PM

Hello all! My apologies for the link failing... This was brought to my attention over the weekend; well, I finally caught a message about it over the weekend. I will see if I can coordinate with Jose so he can add these files to my directory under his file structure. I will update here when I get that figured out, and I'll also update the link in the article.

Mon March 25, 2024 02:27 PM

@Cristian Ruvalcaba Can you please update the download links so these files are accessible again?

Wed March 20, 2024 05:47 AM

Hi @Cristian Ruvalcaba

Very interesting, could you please provide a valid download link?

Thank you in advance

Wed April 05, 2023 12:19 PM

Sorry about the issues with the download link... I've struggled to find a method to make it more permanent, but I believe I just figured that out: https://ibm.box.com/v/qradarmetrics

You should be able to find the materials there.

Fri March 24, 2023 12:20 PM

Hi @Cristian Ruvalcaba 

Can you please provide us with the links?

Tue March 14, 2023 01:40 AM

Hey mate,

Why have the downloads been removed? Can you please provide the links?