Logstash Integration Plugin

  • Plugin version: v1.0.2
  • Released on: 2024-01-23
  • Changelog

For other versions, see the Versioned plugin docs.

Getting Help

For questions about the plugin, open a topic in the Discuss forums. For bugs or feature requests, open an issue in GitHub. For the list of Elastic supported plugins, please consult the Elastic Support Matrix.


The Logstash Integration Plugin provides integrated input and output plugins for sending events from one Logstash instance to one or more other instances.

High-level concepts

You can configure a logstash output to send events to one or more logstash inputs, each of which runs in a pipeline in a different process or on a different host.

To do so, you should first configure the downstream pipeline with a logstash input plugin, bound to an available port so that it can listen for inbound connections. Security is enabled by default, so you will need to either provide identity material or disable SSL.

You will need a TCP route from the upstream pipeline to the interface that the downstream pipeline is bound to.

input {
  logstash {
    port => 9800

    ssl_keystore_path     => "/path/to/identity.p12"
    ssl_keystore_password => "${SSL_IDENTITY_PASSWORD}"
  }
}

Identity material should typically include identity claims about the hostnames and IP addresses that will be used by upstream output plugins.
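For example, such identity material might be generated with OpenSSL, as sketched below. The hostnames, IP address, file paths, and password are placeholder values, not values the plugin requires, and `-addext` assumes OpenSSL 1.1.1 or later:

```shell
# Generate a self-signed certificate whose subject alternative names (SANs)
# match the hostname and IP address that upstream output plugins will use.
# All names and the password here are placeholders.
openssl req -x509 -newkey rsa:2048 -nodes \
  -keyout downstream-key.pem -out downstream-cert.pem -days 365 \
  -subj "/CN=downstream.example.com" \
  -addext "subjectAltName=DNS:downstream.example.com,IP:10.0.0.123"

# Bundle the key and certificate into a PKCS#12 keystore suitable for
# the logstash input's ssl_keystore_path setting:
openssl pkcs12 -export -inkey downstream-key.pem -in downstream-cert.pem \
  -out identity.p12 -passout pass:changeme
```

In production you would typically use certificates issued by your organization's CA rather than self-signed ones, and you would supply the keystore password via an environment variable or the Logstash keystore rather than on the command line.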

Once the downstream pipeline is configured and running, you may send events from any number of upstream pipelines by adding a logstash output plugin that points to the downstream input. You may need to configure SSL to trust the certificates presented by the downstream input plugin.

output {
  logstash {
    hosts => ["downstream-host-1:9800", "downstream-host-2:9800"]

    # SSL trust
    ssl_truststore_path     => "/path/to/truststore.p12"
    ssl_truststore_password => "${SSL_TRUST_PASSWORD}"
  }
}

Unless SSL is disabled or the downstream input is expected to present certificates signed by globally-trusted authorities, you will likely need to provide a source-of-trust.
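One way to provide that source of trust is a PKCS#12 truststore containing the downstream's certificate. A hedged sketch with OpenSSL follows; the file names and password are placeholders, and the first command generates a stand-in certificate only so the example is self-contained — in practice you would use the certificate actually presented by the downstream input:

```shell
# Stand-in only: generate a self-signed certificate so this example runs
# on its own. In practice, obtain the downstream pipeline's certificate.
openssl req -x509 -newkey rsa:2048 -nodes \
  -keyout scratch-key.pem -out downstream-cert.pem -days 365 \
  -subj "/CN=downstream.example.com"

# Pack the certificate (no private keys) into a PKCS#12 truststore for
# the upstream output's ssl_truststore_path setting:
openssl pkcs12 -export -nokeys -in downstream-cert.pem \
  -out truststore.p12 -passout pass:changeme
```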

Load Balancing

When a logstash output is configured to send to multiple hosts, it distributes events in batches to all of those downstream hosts fairly, favoring those without recent errors. This increases the likelihood of each batch being routed to a downstream that is up and has capacity to receive events.