Thursday, September 25, 2025

How to Use Painless Scripts for Conditional Logic in Ingest Pipelines

In the ELK stack, an ingest pipeline processes and transforms documents before they are indexed into Elasticsearch. A pipeline is composed of processors, each of which applies one transformation to the document. Often, a transformation should happen only when certain conditions are met. This post explains how to add such conditions to ingest pipelines.

Prerequisites:

  • A running Elastic Stack deployment
  • An ingest pipeline configured in Elasticsearch

When transforming documents, it is often necessary to execute certain processors only when specific conditions are met. In an ingest pipeline, every processor accepts an optional if parameter: a condition written in Painless, Elasticsearch’s scripting language. The processor executes only when the condition evaluates to true. The sample ingest pipeline below demonstrates how to use an if condition with processors.
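As a quick illustration, here is a minimal sketch of a pipeline in which a processor runs conditionally. The pipeline name and field names are illustrative assumptions, not part of the pipeline discussed in this post:

PUT _ingest/pipeline/conditional-demo
{
  "processors": [
    {
      "set": {
        "field": "event.kind",
        "value": "alert",
        "if": "ctx.log?.level == 'error'"
      }
    }
  ]
}

The set processor adds event.kind only to documents whose log.level is error; all other documents pass through unchanged.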

  • The sample ingest pipeline consists of five processors, as explained below.



1. Rename the field message to event.original.
2. Rename event.original to remove.original.
3. Apply a grok pattern to extract a few new fields.
4. Apply a grok pattern to extract the http.response.content_type field.
5. Rename event.original to remove.original.

  • To demonstrate how processor conditions work, the same Rename processor is deliberately used twice, as the second and fifth steps. A sketch of the full pipeline definition follows.
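Based on the steps above, the full pipeline definition could look roughly like the sketch below. The pipeline name is assumed, and the grok patterns are illustrative placeholders rather than the exact ones from the original pipeline; in particular, the sample log carries no real content type, so the fourth processor simply captures the quoted referrer slot of the log line into http.response.content_type so that the field gets created:

PUT _ingest/pipeline/nginx-conditional-demo
{
  "processors": [
    {
      "rename": {
        "field": "message",
        "target_field": "event.original"
      }
    },
    {
      "rename": {
        "field": "event.original",
        "target_field": "remove.original",
        "if": "ctx.http?.response?.content_type != null"
      }
    },
    {
      "grok": {
        "field": "event.original",
        "patterns": [
          "%{IPORHOST:destination.address}:%{POSINT:destination.port} %{IPORHOST:source.address}"
        ]
      }
    },
    {
      "grok": {
        "field": "event.original",
        "patterns": [
          "%{NUMBER:http.response.status_code} %{NUMBER:http.response.body.bytes} \"%{DATA:http.response.content_type}\""
        ]
      }
    },
    {
      "rename": {
        "field": "event.original",
        "target_field": "remove.original",
        "if": "ctx.http?.response?.content_type != null"
      }
    }
  ]
}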

  • The key part of the Rename processor used at steps 2 and 5 is the if condition below, which controls whether the processor renames event.original to remove.original.


    ctx.http?.response?.content_type != null



  • The condition checks whether http.response.content_type is not null. The rename step is applied only if this condition is met.
  • The condition fails at step 2 because, at that point, the field http.response.content_type does not yet exist. As a result, the second processor is skipped without execution.
  • By the fifth step, the field http.response.content_type has been created by the fourth processor (Grok). Therefore, the condition is met, and the rename step executes successfully.
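
A note on the syntax: the ?. null-safe operator in Painless short-circuits to null when a parent object is missing, so the condition returns false instead of throwing a NullPointerException on documents that have no http object. Without it, the same check would have to be spelled out explicitly:

    ctx.http != null && ctx.http.response != null && ctx.http.response.content_type != null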


  • After configuring the pipeline processors as explained above, the pipeline can be tested by running a document through it with the simulate API. Below is a sample Nginx log used to test the pipeline, followed by the corresponding simulate request.

[
  {
    "_source": {
      "message": "10.70.90.122:443 11.90.1.132 - - [26/May/2024:13:08:37 +0000] \"GET https://myapp.abc.com/student/registry/login HTTP/1.1\" 200 38 \"-\" \"Elastic-Heartbeat/8.15.2 (linux; amd64; 202341567932255345; 2024-01-19 09:21:13 +0000 UTC)\" \"-\" \"11.190.4.1:443\" \"0.039\""
    }
  }
]
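
Assuming the pipeline was saved under the name used in the sketch above (nginx-conditional-demo, an assumed name), the sample document can be run through it with the simulate API:

POST _ingest/pipeline/nginx-conditional-demo/_simulate
{
  "docs": [
    {
      "_source": {
        "message": "10.70.90.122:443 11.90.1.132 - - [26/May/2024:13:08:37 +0000] \"GET https://myapp.abc.com/student/registry/login HTTP/1.1\" 200 38 \"-\" \"Elastic-Heartbeat/8.15.2 (linux; amd64; 202341567932255345; 2024-01-19 09:21:13 +0000 UTC)\" \"-\" \"11.190.4.1:443\" \"0.039\""
      }
    }
  ]
}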



  • After executing the pipeline, we can see that the http.response.content_type field is created by the fourth (Grok) processor, and that the last step of the pipeline, which renames event.original to remove.original, also executes successfully.
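
With the sketch pipeline above, the abridged simulate response would look roughly as follows. Note that http.response.content_type holds "-" here only because the placeholder grok pattern captures the quoted referrer slot of the log line:

{
  "docs": [
    {
      "doc": {
        "_source": {
          "destination": { "address": "10.70.90.122", "port": "443" },
          "source": { "address": "11.90.1.132" },
          "http": {
            "response": {
              "status_code": "200",
              "body": { "bytes": "38" },
              "content_type": "-"
            }
          },
          "remove": {
            "original": "10.70.90.122:443 11.90.1.132 - - [26/May/2024:13:08:37 +0000] ..."
          }
        }
      }
    }
  ]
}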





