Logstash Lines: Offline plugin management, multiple pipelines.

Welcome back to The Logstash Lines and a happy new year to our readers! In these weekly posts, we'll share the latest happenings in the world of Logstash and its ecosystem.

Fixes in 5.1.2 branch

The ability to remove the tags field from the output was broken after 5.0.x. This was a regression introduced by an earlier bug fix that shipped in 5.0.1 and 5.1.0.
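For context, the affected pattern is a mutate filter that strips the tags field before events reach an output — a minimal sketch (the surrounding config is illustrative):

```
filter {
  mutate {
    # Remove the tags field so it does not appear in events sent to outputs
    remove_field => ["tags"]
  }
}
```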

Fixes in 5.2.0 branch

  • Improved the accuracy of the Persistent Queue feature's documentation.
  • Fixed the Logstash slowlog field took_in_millis, which reported microseconds instead of milliseconds.
  • Added execution time stats for each output in the pipeline. We now record the wall clock time spent in each output, exposed in the duration_in_millis field of the node stats API at http://localhost:9600/_node/stats
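For example, the new per-output timing can be read from the node stats API of a running Logstash instance — a quick sketch (assumes the API is listening on its default port, 9600):

```shell
# Fetch node stats and look for the new per-output wall clock timings
curl -s 'http://localhost:9600/_node/stats?pretty' | grep duration_in_millis
```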

Improvements to offline plugin management (5.2.0 branch)

Offline plugin management has been a pain point for users ever since the feature was introduced. In the old workflow, the complete set of plugin dependencies wasn't downloaded when the offline pack was created on the staging machine. We've now simplified the workflow and fixed those bugs. On the staging machine, we traverse the entire dependency graph to download all dependencies and package the plugins into a single archive, which can be installed directly on a Logstash deployment with no internet access. The original bin/logstash-plugin pack/unpack commands have been deprecated in favor of bin/logstash-plugin prepare-offline-pack. Once such a pack is created, there is no need for a special unpack command. Users can simply install plugins using

bin/logstash-plugin install file:///<path_to_offline_pack.zip>
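Putting the two steps together, the staging-to-airgapped workflow looks roughly like this (the plugin names and the pack filename below are illustrative; these commands require a Logstash install):

```shell
# On the staging machine (with internet access): resolve the full
# dependency graph and bundle the listed plugins into one archive
bin/logstash-plugin prepare-offline-pack logstash-input-kafka logstash-filter-mutate

# Copy the resulting zip to the air-gapped deployment, then install it there:
bin/logstash-plugin install file:///path/to/logstash-offline-plugins.zip
```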

Multiple Pipelines (In progress, 5.3.0 branch)

Most production Logstash configs can be complex, with multiple data processing flows built using conditional logic. One experimental feature we're targeting for 5.3.0 is running multiple, separate pipelines in the same Logstash instance (same JVM). This could simplify Logstash configs and also bring it closer to how an ingest node runs multiple ingest pipelines. Stay tuned.
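As a rough illustration of the idea — the configuration format below is hypothetical, since the feature is still in progress and the final syntax may differ — each pipeline would get its own id and its own config file:

```
# Hypothetical sketch: two independent pipelines in one Logstash instance
- pipeline.id: apache-logs
  path.config: "/etc/logstash/conf.d/apache.conf"
- pipeline.id: metrics
  path.config: "/etc/logstash/conf.d/metrics.conf"
```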
