
Introduction
Db2’s audit facility is a cornerstone for compliance and security monitoring. It captures critical events such as authentication attempts, privilege changes, and data access, enabling organizations to meet regulatory requirements and detect anomalies. However, the raw audit logs generated by Db2 are often in binary or DEL (comma-separated) format, making them difficult to interpret without additional processing.
This guide provides a step-by-step walkthrough for extracting and converting Db2 audit data into a human-readable format, ready for analysis in Excel or BI tools. Whether you’re dealing with binary logs or DEL files, this practical approach will help you streamline compliance reporting and security reviews.
1. Understanding Db2 Audit Records
Db2 audit logs contain detailed records of database activities, organized into categories such as:
- AUDIT – audit settings modified or the audit log accessed.
- EXECUTE – logs the execution of SQL statements.
- CHECKING – authorization checking of attempts to access or manipulate Db2 database objects or functions.
- OBJMAINT – creating or dropping data objects, and altering certain objects.
- CONTEXT – shows the operation context in which a database operation is performed.
- SECMAINT – security-related events such as granting or revoking privileges and altering authorizations.
- SYSADMIN – operations that require SYSADM, SYSMAINT, or SYSCTRL authority.
- VALIDATE – authenticating users or retrieving system security information.
Audit records may be stored in:
- Binary format – compact and efficient, but not human-readable.
- DEL format – text-based and delimited (by default with commas), easier to process.
For detailed column layouts per category, refer to IBM’s documentation: Audit Facility Record Layouts.
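When labeling extracted records in reports, a small lookup for these categories saves repeated trips to the documentation. A minimal sketch (the descriptions paraphrase the list above; the helper name is illustrative, not part of any Db2 API):

```python
# Db2 audit categories mapped to short descriptions (paraphrased from the list above).
AUDIT_CATEGORIES = {
    "AUDIT": "audit settings modified or accessed",
    "EXECUTE": "SQL statement execution",
    "CHECKING": "authorization checking on object access",
    "OBJMAINT": "creating, dropping, or altering data objects",
    "CONTEXT": "operation context of a database operation",
    "SECMAINT": "security events such as privilege changes",
    "SYSADMIN": "operations requiring SYSADM, SYSMAINT, or SYSCTRL",
    "VALIDATE": "user authentication and security lookups",
}

def describe_category(name: str) -> str:
    """Return a human-readable description for an audit category name."""
    return AUDIT_CATEGORIES.get(name.upper(), "unknown category")
```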
2. Extracting Binary Audit Files
Binary audit files must be converted before analysis. Here’s how:
Step 1: Download and Transfer
Download the binary audit file from the source system and copy it to a Db2 server where db2audit can run.
⚠️ Important: The Db2 server is not the same as Db2 Warehouse as a Service. Ensure you transfer the file to the correct server before proceeding.
Step 2: Run Extraction Command
Use one of the following commands:
Option 1: Stored Procedure
CALL SYSPROC.AUDIT_DELIM_EXTRACT(',', '/user/temp/db2/audit/preview-logs', '/user/temp/db2/audit/preview-logs/db2audit.db.BLUDB.log.0.20260216011848344893', NULL);
Option 2: Command Line
db2audit extract delasc delimiter , to <workingdir> from files <fullpathactivelog>
Step 3: Result
You’ll get a human-readable DEL file containing audit records.
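If you extract logs often, the command-line option can be wrapped in a small script. A sketch that only builds the command string (the paths are placeholders, and db2audit must be on the PATH of the Db2 server before you uncomment the actual run):

```python
import shlex
import subprocess

def build_extract_cmd(working_dir: str, log_file: str, delimiter: str = ",") -> list:
    """Build the db2audit extract command for a single binary audit log."""
    return [
        "db2audit", "extract", "delasc",
        "delimiter", delimiter,
        "to", working_dir,
        "from", "files", log_file,
    ]

cmd = build_extract_cmd("/tmp/audit", "/tmp/audit/db2audit.db.BLUDB.log.0")
print(shlex.join(cmd))
# On the Db2 server itself, you could then run:
# subprocess.run(cmd, check=True)
```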
3. Preparing DEL Files for Analysis
DEL files often lack headers, making them hard to interpret. The column definitions for each audit category are stored in ~/sqllib/misc/db2audit.ddl.
Before importing into Excel or BI tools:
- Inject headers from the DDL file.
- Convert DEL files into CSV format.
If your Db2 audit logs are stored in IBM Cloud Object Storage, you can automate downloads for specific time windows instead of pulling everything manually.
Here’s the core logic:
Db2AuditS3Downloader Class
This script connects to an IBM Cloud Object Storage (COS) bucket and downloads audit logs within a given time range.
import os
import re
from datetime import datetime

import ibm_boto3


class Db2AuditS3Downloader:
    """Minimal COS downloader for Db2 .DEL files with time-based filtering."""

    def __init__(self, bucket, prefix="", local_dir="del_files", **creds):
        self.bucket = ibm_boto3.resource(
            "s3",
            endpoint_url=creds["endpoint"],
            aws_access_key_id=creds["key"],
            aws_secret_access_key=creds["secret"],
            region_name=creds.get("region"),
        ).Bucket(bucket)
        self.prefix, self.local_dir = prefix.rstrip("/"), local_dir
        os.makedirs(local_dir, exist_ok=True)

    def _parse_ts(self, fname):
        """Extract the Db2 timestamp (e.g. 0.20251112193803924223) from a file name."""
        m = re.search(r"\.0\.(\d{20})", fname)
        return datetime.strptime(m.group(1), "%Y%m%d%H%M%S%f") if m else None

    def _download(self, key):
        path = os.path.join(self.local_dir, os.path.basename(key))
        self.bucket.download_file(key, path)
        print(f"✅ {path}")

    def download_range(self, start=None, end=None):
        """Download all .DEL files within the given datetime range."""
        if isinstance(start, str):
            start = datetime.fromisoformat(start)
        if isinstance(end, str):
            end = datetime.fromisoformat(end)
        for obj in self.bucket.objects.filter(Prefix=self.prefix):
            if not obj.key.lower().endswith(".del"):
                continue
            ts = self._parse_ts(obj.key)
            if not ts:
                continue
            if (not start or ts >= start) and (not end or ts <= end):
                self._download(obj.key)
Usage
if __name__ == "__main__":
    creds = {
        "endpoint": "https://s3.us.cloud-object-storage.appdomain.cloud",
        "key": "YOUR_ACCESS_KEY",
        "secret": "YOUR_SECRET_KEY",
        "region": "us-standard",
    }
    Db2AuditS3Downloader("my-bucket", prefix="db2audit", **creds).download_range(
        "2025-11-12T00:00:00", "2025-11-12T23:59:59"
    )
4. Automating Header Injection and CSV Conversion
When dealing with Db2 .DEL audit exports, manually attaching headers and renaming files is tedious. The following script automates the conversion by dynamically parsing table definitions from a DDL file and mapping them to the .DEL logs.
Db2AuditConverter Class
This script:
- Parses the DDL file to extract column names.
- Converts .DEL files into .csv with headers.
- Logs progress and errors.
import csv
import os
import re


class Db2AuditConverter:
    """Automates conversion of Db2 .DEL audit files to CSV with headers from the DDL."""

    def __init__(self, ddl_file, del_dir=".", out_dir="csv_output"):
        self.ddl_file, self.del_dir, self.out_dir = ddl_file, del_dir, out_dir
        os.makedirs(out_dir, exist_ok=True)

    def parse_ddl(self):
        """Return {TABLE: [COLUMNS]} parsed from the DDL file."""
        with open(self.ddl_file, encoding="utf-8") as f:
            ddl = f.read()
        tables = re.findall(
            r"CREATE TABLE\s+(\w+)\s*\((.*?)\)\s*ORGANIZE BY ROW",
            ddl, flags=re.S | re.I,
        )
        return {t.upper(): re.findall(r"(\w+)\s+\w+", body) for t, body in tables}

    def convert_table(self, table, cols):
        """Find matching .DEL files and convert them to CSV with headers."""
        pattern = re.compile(rf"db2audit\.db\.BLUDB\.log\..*\.{table}\.del$", re.I)
        for fname in os.listdir(self.del_dir):
            if not pattern.match(fname):
                continue
            src = os.path.join(self.del_dir, fname)
            dst = os.path.join(self.out_dir, fname.replace(".del", ".csv"))
            with open(src, encoding="utf-8", errors="ignore") as f:
                lines = [l.strip() for l in f if l.strip()]
            with open(dst, "w", newline="", encoding="utf-8") as out:
                w = csv.writer(out)
                # Add headers only if the first data line does not already contain them.
                if lines and not any(c in lines[0] for c in cols[:3]):
                    w.writerow(cols)
                for line in lines:
                    w.writerow(line.split(","))
            print(f"✅ {dst}")

    def process_all(self):
        for table, cols in self.parse_ddl().items():
            self.convert_table(table, cols)
Usage
if __name__ == "__main__":
    Db2AuditConverter("db2audit.ddl.txt", del_dir="del_files").process_all()
5. Post-Processing and Analysis
Once converted:
- Import the CSV files into Excel, Power BI, or pandas for analysis.
- Filter by category (e.g., SECMAINT for security changes).
- Build dashboards for compliance reporting.
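The filtering step needs nothing beyond the standard library. A minimal sketch, assuming a converted CSV whose first column holds the audit category (adjust the column index to your actual layout; the sample file below is synthetic):

```python
import csv
from pathlib import Path

def filter_by_category(csv_path: str, category: str, col: int = 0) -> list:
    """Return data rows whose given column matches an audit category (header skipped)."""
    with open(csv_path, newline="", encoding="utf-8") as f:
        rows = list(csv.reader(f))
    return [r for r in rows[1:] if len(r) > col and r[col].upper() == category.upper()]

# Tiny demonstration with a synthetic file:
sample = Path("sample_audit.csv")
sample.write_text("CATEGORY,EVENT\nSECMAINT,GRANT\nEXECUTE,SELECT\nSECMAINT,REVOKE\n")
print(filter_by_category("sample_audit.csv", "secmaint"))
# → [['SECMAINT', 'GRANT'], ['SECMAINT', 'REVOKE']]
```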
Summary
Modernizing how organizations handle database audit data can significantly streamline compliance and enhance security visibility. By extracting binary audit logs using db2audit, administrators can convert raw log data into structured, analyzable formats.
The next step, injecting column headers from the db2audit.ddl definition, adds critical context to the extracted data, ensuring accuracy and consistency. Converting DEL files to CSV with a lightweight Python script then enables seamless integration with common analysis platforms like Excel or business intelligence tools. This workflow not only simplifies compliance audits but also lets teams proactively monitor security events while minimizing manual errors in data preparation, transforming a traditionally tedious process into a scalable, automated, audit-ready pipeline.
About the Author
Brigitte DeLoren is a software engineer with 13 years of experience who thrives at the intersection of cloud technology, code craftsmanship, and team leadership. Known for elevating engineering quality and demystifying complex systems, she has built expertise in Db2 auditing while shaping teams through clear technical direction and documentation. This marks her debut in technical blogging, where she brings her hands-on experience to a broader audience.