Persistent Offense Storage in IBM QRadar SIEM
IBM QRadar SIEM processes events by collecting log data from various sources, normalizing it, and correlating it using built-in rules. When suspicious or abnormal activity is detected, IBM QRadar generates offenses, which are alerts that require investigation. These offenses are stored in the IBM QRadar PostgreSQL database, can be viewed in the "Offenses" tab of the user interface, and can only be fetched through the user interface or the REST API. The retention period for offenses varies based on system configuration, but typically offenses are retained until they are manually closed or archived, according to the configured offense retention policies. An increased number of retained offenses degrades IBM QRadar performance, and the maximum retention period available for storing offenses is 2 years.
Problem Statement:
The current maximum offense retention period in IBM QRadar is set to 2 years. However, customers, particularly those in the banking domain, often request to retain offense data for longer durations due to compliance and business needs.
Storing offenses for extended periods within the PostgreSQL database poses a significant challenge. As offense data grows, it consumes more database space, leading to degraded query performance and overall system slowness. Although offenses can be protected and retained for longer durations, this approach still impacts the active PostgreSQL database, which is critical to IBM QRadar's performance and functionality.
This creates a conflict between the need for long-term data retention and maintaining optimal system performance.
Solution:
To address the challenge of long-term offense data retention without impacting the performance of the PostgreSQL database, offenses can be ingested into the Ariel database. This approach allows historical access to offense data while leveraging Ariel's capability to support longer retention periods. Additionally, this ensures the PostgreSQL database remains optimized for active offense storing and processing.
The solution can be implemented through the following steps:
- Ingest Offenses using the Universal REST API Protocol: Use the Universal REST API workflow and workflow parameters to extract offenses from the IBM QRadar system for processing and storage in the Ariel database.
- Create a Log Source using the Universal REST API Protocol: Configure a log source specifically designed to handle offense ingestion into Ariel.
- Use the storeOffenseInAriel DSM or Develop a Custom DSM: Leverage the provided DSM (storeOffenseInAriel) or create a custom DSM tailored to map and store offenses in Ariel effectively.
- Utilize Saved Searches to Retrieve Offenses and Their Status: Design saved searches in IBM QRadar to access and query historical offense data stored in the Ariel database, ensuring easy retrieval and analysis.
- Implement Dashboards and Visualizations: Create intuitive dashboards and visualizations to monitor and analyze offense trends and historical data.
- Consider Further Enhancements: Enhance the solution by automating offense ingestion, refining search capabilities, and integrating with other analytics tools for advanced insights.
This approach ensures offense data retention aligns with customer requirements while maintaining the efficiency and performance of the IBM QRadar platform.
1. Ingestion of Offenses using Universal REST API Protocol:
We will utilize the Universal REST API Workflow and Workflow Parameters to collect offense data directly from the IBM QRadar SIEM. For more details, you can explore the Universal REST API protocol.
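At its core, the workflow pages through the /api/siem/offenses endpoint and filters on last_persisted_time, so each run only collects offenses created or updated since the previous run. A single paged request issued by the workflow looks roughly like this (placeholder values shown):
GET https://<console_ip>/api/siem/offenses?filter=last_persisted_time%3E<last_run_epoch_ms>
SEC: <auth_token>
Accept: application/json
Version: 22.0
Range: items=0-99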
Below is the developed workflow designed to retrieve offense details from IBM QRadar and send them back to IBM QRadar in JSON format.
Workflow:
<?xml version="1.0" encoding="UTF-8" ?>
<Workflow name="Ariel" version="1.0" xmlns="http://qradar.ibm.com/UniversalCloudRESTAPI/Workflow/V1">
<Parameters>
<Parameter name="host" label="Host" required="true" />
<Parameter name="auth_token" label="Auth_token" required="true" />
<!--<Parameter name="username" label="Username" required="true" />-->
<!--<Parameter name="password" label="Password" required="true" />-->
</Parameters>
<Actions>
<!--
/////////////////////
// Search for Offenses //
/////////////////////
-->
<!-- Initialise lastUpdatedDate to 0 on the first run; on later runs the bookmarked value from state.json is kept -->
<Initialize path="/lastUpdatedDate" value="0" />
<FormatDate pattern="yyyy-MM-dd'T'HH:mm:ss.SSS'Z'" timeZone="GMT" time="${/lastUpdatedDate}" savePath="/lastUpdatedDateFormated" />
<!-- Initialise firstRunDate to the current time on the first run; on later runs the value from state.json is kept -->
<Initialize path="/firstRunDate" value="${time()}" />
<FormatDate pattern="yyyy-MM-dd'T'HH:mm:ss.SSS'Z'" timeZone="GMT" time="${/firstRunDate}" savePath="/firstRunDateFormated" />
<!-- /totalCount stores the overall number of events pulled since discovery -->
<Initialize path="/totalCount" value="0" />
<Initialize path="/totalAttempt" value="0" />
<!-- Set currentTime to the current time so that we read all offenses up to now -->
<Set path="/currentTime" value="${time()}" />
<FormatDate pattern="yyyy-MM-dd'T'HH:mm:ss.SSS'Z'" timeZone="GMT" time="${/currentTime}" savePath="/enddateFormated" />
<!-- /totalCountInThisAttempt stores the number of events pulled in this run -->
<Set path="/totalCountInThisAttempt" value="0" />
<!-- initialise the paging variables used in a single run -->
<Set path="/pageOffset" value="0" />
<Set path="/pageSize" value="100" />
<Set path="/count" value="0" />
<Set path="/getOffenses/body" value="0" />
<Log type="INFO" message="Checking events from lastUpdatedDate=${/lastUpdatedDate} and currentTime=${/currentTime}" />
<Log type="INFO" message="Checking events from lastUpdatedDate=${/lastUpdatedDateFormated} and currentTime=${/enddateFormated}" />
<Log type="INFO" message="Events collected before this run is ${/totalCount}" />
<Log type="INFO" message="Total ${/totalAttempt} Attempts has been done so far" />
<DoWhile condition="count(/getOffenses/body) > 0">
<Log type="INFO" message="The value was pageOffset=${/pageOffset} and pageSize=${/pageSize}" />
<CallEndpoint url="https://${/host}/api/siem/offenses?filter=last_persisted_time%3E${/lastUpdatedDate}" method="GET" savePath="/getOffenses">
<SSLConfiguration allowUntrustedServerCertificate="true" />
<BearerAuthentication token="${/auth_token}"/>
<RequestHeader name="SEC" value="${/auth_token}" />
<RequestHeader name="Accept" value="application/json" />
<RequestHeader name="Content-Type" value="text/xml" />
<RequestHeader name="Version" value="22.0" />
<RequestHeader name="Range" value="items=${/pageOffset}-${/pageOffset + /pageSize - 1}" />
</CallEndpoint>
<!-- If the API call fails, log the error and abort -->
<If condition="/getOffenses/status_code != 200">
<Log type="INFO" message="${/getOffenses/status_code}: ${/getOffenses/status_message}" />
<Abort reason="${/getOffenses/status_code}: ${/getOffenses/status_message}" />
</If>
<!-- If the API call is successful, do further processing -->
<ElseIf condition="/getOffenses/status_code = 200">
<If condition="count(/getOffenses/body) > 0">
<!--POST EVENTS-->
<Set path="/count" value="0" />
<Log type="INFO" message="Total Events Collected are ${count(/getOffenses/body)}" />
<ForEach item="/singleOffense" items="/getOffenses/body">
<PostEvent path="/singleOffense" source="${/host}" />
<Set path="/count" value="${/count + 1}" />
</ForEach>
<Log type="INFO" message="Total Events Posted are ${/count}" />
<Set path="/totalCount" value="${/totalCount + /count}" />
<Set path="/totalCountInThisAttempt" value="${/totalCountInThisAttempt + /count}" />
<Set path="/pageOffset" value="${/pageOffset + /pageSize}" />
</If>
<Set path="/totalAttempt" value="${/totalAttempt + 1}" />
</ElseIf>
</DoWhile>
<Log type="INFO" message="Total Events Posted in this run are ${/totalCountInThisAttempt}" />
<Set path="/lastUpdatedDate" value="${/currentTime}" />
</Actions>
<Tests>
<DNSResolutionTest host="${/host}" />
<TCPConnectionTest host="${/host}" />
<HTTPConnectionThroughProxyTest url="https://${/host}" />
</Tests>
</Workflow>
Workflow Parameters:
Here are the workflow parameters required when creating the log source. You need to provide the CONSOLE_IP of IBM QRadar and the AUTH_TOKEN. The AUTH_TOKEN should be an authorized security token with the Admin Security Profile and Admin User Role.
You can learn more in Creating an authorized service token for IBM QRadar Operations.
<?xml version="1.0" encoding="UTF-8" ?>
<WorkflowParameterValues xmlns="http://qradar.ibm.com/UniversalCloudRESTAPI/WorkflowParameterValues/V1">
<Value name="host" value="CONSOLE_IP" />
<Value name="auth_token" value="AUTH_TOKEN" />
</WorkflowParameterValues>
2. Create Log Source using Universal REST API Protocol:
Follow the steps below to configure the log source:
1. Navigate to Admin Tab → Log Source Management → New Log Source.
- Select Log Source Type: Universal DSM.
- Select Log Source Protocol: Universal Cloud REST API.
2. Configure the settings:
- Add an Identifier.
- Add the Workflow.
- Provide Workflow Parameters.
- Set Recurrence to 10 minutes (or adjust this as per your requirements).
- Set EPS to 5000.
3. Test the protocol parameters to ensure proper configuration.
4. Click Finish and perform Deploy Changes.
5. Wait for the next scheduled run, then verify that events are visible in the Log Activity tab from the log source StoreOffenseInAriel.
3. Use the storeOffenseInAriel DSM or Write Your Own DSM:
- Download and Import the DSM
Or
- Create Your Own DSM
  - Open events in the DSM Editor.
  - Create a new Log Source Type and name it something like storeOffenseInAriel.
  - Navigate to the Configuration option.
  - Enable Property Autodetection and set the Property Detection Format to JSON.
  - Save the configuration.
- Update Log Source Type
  - In the Log Source Management tab, change the Log Source Type to storeOffenseInAriel and save the log source.
- Trigger Event Processing
  - Wait for the next scheduled REST API call, or restart the ecs-ec-ingress service to run the process immediately.
- Verify Event Matching and Properties
  - Confirm that new events are matching the log source and that new JSON properties are being auto-generated.
  - Allow it to run for an hour, then disable Property Autodetection.
- Map Properties as Needed
  - Perform any additional mapping of properties as per your requirements.
Sample payload:
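For reference, a single offense record returned by /api/siem/offenses (and posted to Ariel as one event) looks roughly like the following. The values are illustrative, and the record is trimmed to the fields used in the saved search below:
{
  "id": 1234,
  "description": "Multiple Login Failures for the Same User",
  "status": "OPEN",
  "offense_type": 3,
  "start_time": 1734000000000,
  "first_persisted_time": 1734000000000,
  "last_persisted_time": 1734000600000,
  "close_time": null,
  "closing_user": null,
  "closing_reason_id": null,
  "assigned_to": "admin",
  "follow_up": false,
  "inactive": false,
  "protected": false,
  "event_count": 42,
  "flow_count": 0,
  "source_count": 1,
  "username_count": 1,
  "domain_id": 0,
  "rules": [{ "id": 100051, "type": "CRE_RULE" }],
  "log_sources": [{ "id": 65, "name": "Custom Rule Engine", "type_id": 18, "type_name": "EventCRE" }]
}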
Normalized information: once property autodetection has run, these JSON fields appear as auto-generated event properties that can be queried with AQL, as in the saved search below.
4. Utilize a Saved Search to Retrieve Offenses and Their Statuses:
Use the saved search below to query offense data in Ariel for the past 2 hours. This will provide details of all the properties being extracted.
select
DATEFORMAT(LONG("first_persisted_time"),'yyyy-MM-dd h:m:s a') as 'Trigger Time' ,
DATEFORMAT(LONG("start_time"),'yyyy-MM-dd h:m:s a') as 'Start Time' ,
DATEFORMAT(LONG("last_persisted_time"),'yyyy-MM-dd h:m:s a') as 'Last Updated Time',
DATEFORMAT(LONG("close_time"),'yyyy-MM-dd h:m:s a') as 'Close Time',
"Offense_Id" as 'Offense Id' ,
"description" as 'Offense Description',
"Status","assigned_to" as 'Offense Owner',
"offense_type","type","type_id","type_name",
"closing_reason_id" as 'Close Reason',
"closing_user" as 'Closed by',
"domain_id" as 'Domain',
"event_count" as 'Events Attached',
"flow_count" as 'Flows Attached',
"follow_up",
"inactive" as 'Is Active',
"protected" as 'Is Protected',
"source_count","source_network","username_count","rules","log_sources"
from events where "Offense_Id" IS NOT NULL order by "Offense_Id" DESC last 2 hours;
5. Index Management:
Navigate to Index Management and enable indexing for Offense_Id and any other properties you plan to use in future searches.
6. Dashboard and Visualization:
You can now create various types of searches to display the data on the Pulse dashboard.
Here is the dashboard I have created and used:
- Offense Table
- Open Offenses
- Closed Offenses
- Owner vs Open Offenses
- Owner vs Closed Offenses
You can create multiple such dashboards using the Pulse application; a sample widget query is sketched below.
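For example, an "Owner vs Open Offenses" widget can be driven by a simple AQL query. This is a minimal sketch: it assumes the status and owner fields were mapped as the "Status" and "assigned_to" properties shown in the saved search above, and that "Status" carries the API's OPEN/CLOSED values.
select
"assigned_to" as 'Owner',
count(*) as 'Open Offenses'
from events
where "Status" = 'OPEN' and "Offense_Id" IS NOT NULL
group by "assigned_to"
last 24 hours;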
7. Impact on IBM QRadar:
- The number of API calls to IBM QRadar will increase, in line with the configured recurrence interval and page size.
- This increase will have a minimal impact on your EPS, which will be directly proportional to the number of active offenses within your environment.
- Disk space will be consumed to store payloads and records, leading to a slight increase in overall storage requirements.
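To gauge the additional event volume in your own environment, you can count the records ingested by the new log source. This is a minimal sketch and assumes the log source is named StoreOffenseInAriel, as configured earlier:
select count(*) as 'Offense Records Ingested (last 24h)'
from events
where LOGSOURCENAME(logsourceid) = 'StoreOffenseInAriel'
last 24 hours;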
8. Further Enhancements:
- Standalone Application Development: Create a standalone application to track individual offenses and monitor changes in real time. This application should support additional REST API integrations to facilitate data enrichment, retrieve notes, execute Ariel queries for identifying events linked to specific offenses, and perform comprehensive data enhancements.
- Multi-Tenant Environment Support: Implement support for multi-tenant and multi-domain environments to ensure that analysts can access and search offenses only within their assigned domains in the Ariel database. This approach enforces domain-specific data segregation and enhances security; a domain-scoped search is sketched after this list.
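As a starting point for the multi-tenant case, the offense's domain can be used to restrict a search per tenant. This is a minimal sketch: it assumes the offense's domain_id field was mapped as the "domain_id" property shown in the saved search above, and '2' is a hypothetical domain id.
select
"Offense_Id" as 'Offense Id',
"description" as 'Offense Description',
"Status",
"domain_id" as 'Domain'
from events
where "Offense_Id" IS NOT NULL and "domain_id" = '2'
order by "Offense_Id" DESC
last 24 hours;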
This solution is custom-developed and is not officially supported by IBM.
If at any point you have questions, comments, or ideas for further enhancements, or would like to discuss this further, feel free to get in touch with me.
Vishal Tangadkar – vishal.tangadkar1@ibm.com