Hi John,
There are two options to import logs from Azure (LogDNA) into QROC.
Option 1) At present, LogDNA is not on the list of supported devices for QRadar. Since it is part of the Microsoft Azure product family, we can use Azure Event Hubs, which QRadar does support. You may check with Azure support whether there is a way to send the required logs to an Azure Event Hub, from which QRadar can then pull them.
Option 2) If you already have a log management system such as Splunk or Elastic in place, we can first pull the logs into that system via its APIs and then forward them to QRadar over the syslog protocol.
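For option 2, the forwarding step is plain syslog. As a rough illustration only (the hostname, port, and message content below are placeholders, not anything from your environment), a forwarder would emit RFC 3164-style lines like this:

```python
import socket
from datetime import datetime

def format_syslog(message, hostname="logdna-forwarder", facility=1, severity=6):
    """Build an RFC 3164-style syslog line: <PRI>TIMESTAMP HOSTNAME MESSAGE."""
    pri = facility * 8 + severity  # PRI = facility * 8 + severity
    timestamp = datetime.now().strftime("%b %d %H:%M:%S")
    return f"<{pri}>{timestamp} {hostname} {message}".encode()

def send_syslog(payload, collector_host="127.0.0.1", collector_port=514):
    """Send one syslog datagram over UDP to the QRadar event collector."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(payload, (collector_host, collector_port))

# Example: forward a single log line pulled from the upstream system.
send_syslog(format_syslog("cloudant: example audit event"))
```

In a real deployment the collector host would be your QRadar event collector, and most log managers (Splunk, Logstash) have built-in syslog output that replaces hand-rolled code like this.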
In my view, option 1 is more feasible: we are currently running QROC (QRadar on Cloud) version 7.3.2, and I can see "Microsoft Azure Event Hubs" under Protocol Configuration when I select "Microsoft Azure Platform". So manually installing RPMs is not required, because we receive automatic DSM updates from QRadar.
What is an Azure Event Hub: I would like to describe it briefly. It is essentially an ingestion endpoint that buffers your events. Once your logs are configured to land in an Event Hub, we can configure a QRadar log source with that Event Hub and the appropriate credentials. QRadar will then pull the logs from the Event Hub and present them to you as events.
We need to be aware of any firewall changes required to open communication between QRadar and the Event Hub (the Event Hub and the storage account need different TCP ports to be open).
Also, because QRadar is pulling the logs in this case, the Target Event Collector option in the log source configuration lists the event collector that will actually do the polling.
Below are the three parameters we require from the Azure side; you will find them in the Azure portal.
1) Event Hub connection string: this includes a) the Event Hub namespace name, b) the Event Hub name, c) the SAS key name, and d) the SAS key.
2) Consumer group name: when an Azure Event Hub is created, a $Default consumer group is created automatically. This allows an application using that consumer group to start reading events immediately.
3) Storage account connection string: this includes a) the storage account name and b) the storage account key.
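It helps to know the shape of these two strings before pasting them into the log source. Here is a small sketch that splits the standard `key=value;` format into its parts; the namespace, hub, account, and key values are made-up placeholders, not real credentials:

```python
# Made-up placeholder values -- the real strings come from the Azure portal.
EVENT_HUB_CONN = (
    "Endpoint=sb://myhubns.servicebus.windows.net/;"
    "SharedAccessKeyName=qradar-listen;"
    "SharedAccessKey=REDACTEDKEY=;"
    "EntityPath=myhub"
)
STORAGE_CONN = (
    "DefaultEndpointsProtocol=https;"
    "AccountName=mystorageacct;"
    "AccountKey=REDACTEDKEY==;"
    "EndpointSuffix=core.windows.net"
)

def parse_conn_string(conn):
    """Split an Azure connection string into a dict of its key=value parts.

    partition() splits on the first '=' only, so base64 keys that end in
    '=' padding are preserved intact.
    """
    parts = {}
    for item in conn.split(";"):
        if item:
            key, _, value = item.partition("=")
            parts[key] = value
    return parts

hub = parse_conn_string(EVENT_HUB_CONN)
print(hub["EntityPath"])           # the Event Hub name
print(hub["SharedAccessKeyName"])  # the SAS key name
```

The Event Hub connection string carries items a) through d) of parameter 1, and the storage account string carries both items of parameter 3, so a quick parse like this confirms nothing was truncated when copying from the portal.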
Where to find these parameters in the Azure portal (as discussed, the Event Hub is already created in the current Azure environment):
A) Where to find the Event Hub connection string
1) Log in to the Microsoft Azure portal.
2) Go to "All resources" and click the respective Event Hub under Entities (the one you want to use in this integration with QRadar).
3) In the left pane, click "Shared access policies" under Settings. (The Event Hub must have at least one shared access signature created with the Listen policy.)
4) Click the respective policy; in the right pane you will see "Connection string primary key".
5) Copy it to the clipboard.
B) Where to find the consumer group
1) Log in to the Microsoft Azure portal.
2) Go to "All resources" and click the respective Event Hub (the one you want to use in this integration with QRadar).
3) In the left pane, click "Consumer groups" under Entities.
4) Click the respective consumer group in the list of consumer groups.
5) Copy the name of that consumer group.
C) Where to find the storage account connection string
1) Log in to the Microsoft Azure portal.
2) Click "Storage accounts" in the left pane and select the respective storage account.
3) Click "Access keys" under Settings and copy the "Connection string" to the clipboard.
Note: for every namespace, ports 5671 and 5672 must be open, and for every storage account, port 443 must be open. Once we have these parameters in place, we can start the integration.
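The port requirements above can be verified from the collector side before configuring the log source. A minimal sketch, where the namespace and storage account hostnames shown in the comments are placeholders for your own:

```python
import socket

def check_port(host, port, timeout=5):
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:  # refused, timed out, or DNS failure
        return False

# Placeholder hostnames -- substitute your own namespace and storage account:
# for host, port in [("myhubns.servicebus.windows.net", 5671),
#                    ("myhubns.servicebus.windows.net", 5672),
#                    ("mystorageacct.blob.core.windows.net", 443)]:
#     print(host, port, "open" if check_port(host, port) else "blocked")
```

A "blocked" result for 5671/5672 or 443 points at the firewall changes mentioned earlier rather than at the QRadar configuration.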
I am sharing a video for your reference:
https://www.youtube.com/watch?v=SylTklpn2ko
Let me know if additional information is needed.
Regards
Asif Siddiqui
Original Message:
Sent: Mon December 30, 2019 06:41 PM
From: John O'Neill
Subject: QRadar and IBM LogDna
Hi all
I'm working with QRadar at the moment (QROC) and would like to start importing some Cloudant logs that we have aggregated using IBM's LogDNA functionality.
I know there is currently a way to import events from LogDNA on Azure, but does anyone know how I might go about importing the LogDNA feed into an instance of QROC? There does not seem to be a DSM for it, which seems a little odd seeing as they are both on the IBM platform.
Many thanks!
John
------------------------------
John O'Neill
------------------------------