Filtering and Enhancing the Exported Data

Your use case might require only a subset of the data exported by Filebeat, or you might need to enhance the exported data (for example, by adding metadata). Filebeat provides a couple of options for filtering and enhancing exported data. You can:

  • specify filtering options at the prospector level, so that lines or files are included or excluded before events are created
  • define processors that act on events before they are sent to the output

Filtering at the Prospector Level

You can specify filtering options at the prospector level to configure which lines or files are included or excluded in the output. This allows you to specify different filtering criteria for each prospector.

You configure prospector-level filtering in the filebeat.prospectors section of the config file by specifying regular expressions that match the lines you want to include or exclude from the output, or the files you want to exclude entirely. The supported options are include_lines, exclude_lines, and exclude_files.

For example, you can use the include_lines option to export any lines that start with "ERR" or "WARN":

filebeat.prospectors:
- input_type: log
  paths:
    - /var/log/myapp/*.log
  include_lines: ["^ERR", "^WARN"]
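
Conversely, you can use exclude_lines to drop matching lines and exclude_files to skip matching files entirely. A minimal sketch (the path and patterns here are illustrative):

filebeat.prospectors:
- input_type: log
  paths:
    - /var/log/myapp/*.log
  exclude_lines: ["^DBG"]
  exclude_files: [".gz$"]

Note that exclude_files matches against the full file path, while exclude_lines matches against the content of each line.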

The disadvantage of this approach is that you need a separate configuration option for each filtering criterion that you want to apply.

See Filebeat configuration options for more information about each option.

Defining Processors

You can define processors in your configuration to process events before they are sent to the configured output. The libbeat library provides processors for:

  • reducing the number of exported fields
  • enhancing events with additional metadata
  • performing additional processing and decoding

Each processor receives an event, applies a defined action to the event, and returns the event. If you define a list of processors, they are executed in the order they are defined in the Filebeat configuration file.

event -> processor 1 -> event1 -> processor 2 -> event2 ...
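
As a sketch of such a chain (the field names here are illustrative), the following configuration first enriches each event with cloud provider metadata via add_cloud_metadata, and then removes the offset field with drop_fields:

processors:
 - add_cloud_metadata:
 - drop_fields:
     fields: ["offset"]

Because processors run in the order they are defined, fields added by an earlier processor can be inspected, or removed, by a later one.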

Drop Event Example

The following configuration drops all the DEBUG messages, that is, any event whose message field begins with "DBG:":

processors:
 - drop_event:
     when:
        regexp:
           message: "^DBG:"

To drop all the log messages coming from a certain log file (here, any file whose path contains "test"):

processors:
 - drop_event:
     when:
        contains:
           source: "test"
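
The when clause is not limited to regexp and contains; other conditions, such as equals, are also supported. A hypothetical example that drops events whose http.code field equals 200:

processors:
 - drop_event:
     when:
        equals:
           http.code: 200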

Decode JSON Example

In the following example, the fields exported by Filebeat include a field, inner, whose value is a JSON object encoded as a string:

{ "outer": "value", "inner": "{\"data\": \"value\"}" }

The following configuration decodes the inner JSON object:

filebeat.prospectors:
- paths:
    - input.json
  json.keys_under_root: true

processors:
  - decode_json_fields:
      fields: ["inner"]

output.console.pretty: true

The resulting output looks something like this:

{
  "@timestamp": "2016-12-06T17:38:11.541Z",
  "beat": {
    "hostname": "host.example.com",
    "name": "host.example.com",
    "version": "5.5.2"
  },
  "inner": {
    "data": "value"
  },
  "input_type": "log",
  "offset": 55,
  "outer": "value",
  "source": "input.json",
  "type": "log"
}

See Processors for more information.