
Creating new index fields from non-JSON logs for improved search efficiency and alert processing

By Marisa Lopez de Silanes posted Tue March 18, 2025 09:53 AM

  

Coauthor: Vanadis Crawford

In IBM Cloud Logs, you can configure parsing rules to control the structure of your data before the data is indexed and used for monitoring and analysis. For example, you can extract selected information, structure unstructured logs, or discard unnecessary parts of the logs. You can mask fields for compliance reasons and fix incorrectly formatted logs. You can also block log data from being ingested based on log content and much more.

Data sources can be configured to send logs to IBM Cloud Logs. Data sent in JSON format is easy to read with its human-readable text and can also be easily searched as a collection of key-value pairs. When IBM Cloud Logs ingests JSON formatted data, IBM Cloud Logs recognizes the JSON format and automatically identifies indexed fields for each JSON key in the log line. However, what can you do when the data is not JSON formatted and is unstructured?
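For example, a JSON-formatted log line like the following hypothetical entry is recognized on ingestion, and each key (severity, app, flow, errorCode, message) becomes an indexed field that you can search and filter on:

{"severity":"error","app":"App1","flow":"Consumer.flow","errorCode":"ABC12345","message":"Sample message sent successfully"}

The sample log line in the next section is different: only part of it is a JSON object, and the rest is free text.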

In this blog you will learn how to parse a JSON object that is included in a sample log line as well as other data that is part of an unstructured log line. You will learn how to create new indexed fields from non-JSON logs that can improve search efficiency and help with alert processing.

Before you start

Sample log line

Let's look at a sample log line:

2025-03-17 10:12:18.549006: BIP1234E: Java node error: [com.ibm.abc.logging.WriteSysLogMessages:ERROR_ESQL] App=App1, Flow=Consumer.flow, Error: {\"Environment\":\"PROD\",\"Payload\":{\"CustomLogMessage\":\"Sample message sent successfully\",\"ErrorCode\":\"ABC12345\",\"SystemStatusDetails\":{\"ABC\":{\"Payload\":{\"TraceID\":\"7234566-2a3a-3456-9999-3999as432\",\"ErrorCode\":\"AAA9999\"}}}}}

When you send it to IBM Cloud Logs, you get a line like the following:

[Screenshot: the sample log line as it appears in the IBM Cloud Logs UI]

Sending a sample log line

To send a sample log line, complete the following steps from a command line:

  1. Log in to IBM Cloud. Run: ibmcloud login --sso

  2. Get a bearer token. Run: export IAM_TOKEN=`ibmcloud iam oauth-tokens --output json | jq -r '.iam_token'`

  3. Run the following cURL command. Make sure to:

    • Change the date to today's date in the sample log line
    • Add the ingress endpoint of your instance. Tip: You can get the ingress endpoint from the Observability UI.

curl -v --location "https://INGRESS_ENDPOINT_OF_YOUR_INSTANCE/logs/v1/singles" --header "Content-Type: application/json" --header "Authorization: $IAM_TOKEN" --data '[{"applicationName":"myapp","subsystemName":"subsystem1","severity":3,"text":"2025-03-17 10:12:18.549006: BIP1234E: Java node error: [com.ibm.abc.logging.WriteSysLogMessages:ERROR_ESQL] App=App1, Flow=Consumer.flow, Error: {\"Environment\":\"PROD\",\"Payload\":{\"CustomLogMessage\":\"Sample message sent successfully\",\"ErrorCode\":\"ABC12345\",\"SystemStatusDetails\":{\"ABC\":{\"Payload\":{\"TraceID\":\"7234566-2a3a-9999-9999-3999as432\",\"ErrorCode\":\"AAA9999\"}}}}}"}]'

You get a 200 HTTP return code if the command completes correctly.

 

If you get a return code of 403, you need to get a new bearer token. Run the command export IAM_TOKEN=`ibmcloud iam oauth-tokens --output json | jq -r '.iam_token'` and try the cURL command again.
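If you plan to send several sample lines while testing, a small helper function like the following can save retyping. This is a minimal sketch, not part of the product: it assumes the ibmcloud CLI and jq are installed and that you are already logged in, and INGRESS_ENDPOINT_OF_YOUR_INSTANCE is a placeholder for your instance's ingress endpoint. It refreshes the bearer token before each send, which also avoids the 403 case described above:

# Hypothetical helper: refresh the bearer token, then send the payload passed as the first argument.
send_sample_log() {
  export IAM_TOKEN=$(ibmcloud iam oauth-tokens --output json | jq -r '.iam_token')
  curl --location "https://INGRESS_ENDPOINT_OF_YOUR_INSTANCE/logs/v1/singles" \
    --header "Content-Type: application/json" \
    --header "Authorization: $IAM_TOKEN" \
    --data "$1"
}

For example, call send_sample_log with the same JSON array payload that is used in the cURL command above.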

Launching the IBM Cloud Logs Observability UI

To create a parsing rule, go to the UI of the Cloud Logs instance where the data is sent.

  1.  Log in to your IBM Cloud account. After you log in, the IBM Cloud UI opens.
  2. Click the Menu icon > Observability to access the Observability dashboard.
  3. Click Logging > Instances. Click the Cloud Logs tab to see your IBM Cloud Logs instances.
  4. Select the instance where you are sending your logs.

Creating a parsing rule

Accessing the IBM Cloud Logs Parsing rules page

To create a parsing rule, launch the Parsing rules page.

  1. Click the IBM Cloud Logs hamburger icon.

  2. Click Data pipeline > Parsing rules.

The Parsing rules UI opens:

[Screenshot: the Parsing rules page]
Create a parsing rule

  1. Select Create rule group.

  2. Complete the Details section.  Make sure to:

    • Enter a name. For example: Creating new index fields from non-JSON logs

    • Enter a description. For example:  Parse a JSON object that is included in a sample log line as well as other data that is part of an unstructured log line

 

  3. Complete the Rule Matcher section.

    • Select the applications, subsystems, and severities that match the filtering criteria for logs that include JSON objects as part of the log record. You can select any combination of applications, subsystems, and severities to define which log records this rule applies to.
  4. Complete the Rules section.

    • First, select the Extract rule to create a new field that includes the JSON object only.

    • Enter a Name. For example, Extract JSON object

    • Select a Source field. In this sample, we use Text. However, if you need to extract the JSON object from a different field in your log record, such as the log field in logs sent by a Kubernetes cluster, append that field name to Text (for example, Text.log) to use it as the source.

    • Enter a Regular expression. Enter \s*(?P<msg_json>{.*}) (a quick way to check locally what this expression captures is sketched after the notes that follow these steps).

    • Click Create the rule.

     

    Before you complete the next steps, do the following so that the field msg_json that contains the JSON object is detected by the service and you can use it to build the second rule:

        1. Save the first rule.
        2. Send a sample log line.
        3. Verify that the new field msg_json is created.
        4. Edit the rule to add the second part.

      • Select AND. Both rules must be applied to the log line for the new fields to be created.

      • Select the Parse JSON Field rule to create indexed fields for the key-value pairs in the JSON object.
        • Enter a Name. For example, Create fields

        • Select the source field Text.msg_json and choose Merged into to keep a copy of all the fields.

        • As the Destination field, choose Text. The new fields will be created at the root level of the structure (an illustration of the resulting fields follows these steps). Select Keep to also keep the source field.

      • Make sure the rule is set to ACTIVE.

      • Save the parsing rule. You should see something like the following:

      [Screenshot: the completed rule group in the Parsing rules UI]
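As an illustration of what the two rules produce for the sample log line (an approximation based on the rule configuration above; check the exact names in the UI of your instance), the Extract rule adds a msg_json field holding the JSON object, and the Parse JSON Field rule merges its keys into the root of the text structure, so you get indexed fields such as:

text.msg_json: the extracted JSON object (kept because Keep was selected)
text.Environment: PROD
text.Payload.CustomLogMessage: Sample message sent successfully
text.Payload.ErrorCode: ABC12345
text.Payload.SystemStatusDetails.ABC.Payload.TraceID: 7234566-...
text.Payload.SystemStatusDetails.ABC.Payload.ErrorCode: AAA9999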

     

Notice the following about parsing rules:

    • Parsing rules are applied top down, in the order in which they are defined in the Parsing rules UI. You can drag and drop rules to different positions if you need to apply a rule before others.
    • The rules within a rule group are also applied top down.
    • You can combine rules within a rule group by selecting AND or OR conditions. When you select AND, all rules whose regular expression matches the log line are applied. When you select OR, as soon as one rule matches the log line, the remaining rules are not applied.
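If you want a quick local check of what the Extract rule's regular expression captures before you save it, you can approximate it with grep. This is a rough sketch only: it assumes GNU grep with PCRE support (-P), uses a shortened version of the sample log line, and only mimics what the msg_json group would capture; Cloud Logs applies the expression on the server side.

echo '2025-03-17 10:12:18.549006: BIP1234E: Java node error: App=App1, Flow=Consumer.flow, Error: {"Environment":"PROD","Payload":{"ErrorCode":"ABC12345"}}' | grep -oP '\{.*\}'

The command prints only the JSON object, which is the part that the msg_json named group captures and that the Parse JSON Field rule then expands into indexed fields.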


Verifying the parsing rule

Complete the following steps:

  1. Send one or more sample log lines. A small sketch for sending several lines at once follows these steps.

  2. In the IBM Cloud Logs UI, check the ingested log lines and look for the new fields from the JSON object, as in the following sample screenshot.

[Screenshot: a parsed log line showing the new indexed fields]
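To send several sample lines in one go, you can reuse the send_sample_log helper sketched earlier (a hypothetical convenience function, not part of the product). The loop below sends a shortened variant of the sample log line three times; remember to update the date as described in the sending steps, or reuse the full payload from the cURL command above:

for i in 1 2 3; do
  send_sample_log '[{"applicationName":"myapp","subsystemName":"subsystem1","severity":3,"text":"2025-03-17 10:12:18.549006: BIP1234E: Java node error: App=App1, Flow=Consumer.flow, Error: {\"Environment\":\"PROD\",\"Payload\":{\"ErrorCode\":\"ABC12345\"}}"}]'
done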

     

In summary

In IBM Cloud Logs, you can use parsing rules to manage unstructured data so that you can efficiently query it for troubleshooting and analysis, and configure effective alerts.
