Product release

What's new in Logstash 7.10.0

We are happy to announce the general availability of Logstash 7.10.0. This latest stable release is now available for download! Please refer to the release notes for the complete list of features and bug fixes.

Logstash artifacts now ship with bundled JDK

For many years, Logstash users have had to install a Java Development Kit (JDK) before they could install and run Logstash. This was an extra step to get started, and we wanted to make the experience easier, especially for new users. With the Logstash 7.10.0 release, we now offer architecture-specific download and installation options that bundle a JDK directly. The bundled JDK is AdoptOpenJDK 11, the latest long-term support (LTS) release.

For new users, installing a JDK is no longer required to run Logstash; the bundled JDK is used by default. For existing users, if you have JAVA_HOME set to a custom JDK, Logstash will continue to use the JDK version you have specified. More information about getting started can be found in the documentation.
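As a quick illustration of that behavior (the JDK path below is a hypothetical example), existing users can keep a custom JDK by pointing JAVA_HOME at it before starting Logstash:

```shell
# Keep using a custom JDK (example path is hypothetical)
export JAVA_HOME=/usr/lib/jvm/adoptopenjdk-11
bin/logstash -f pipeline.conf

# Unset JAVA_HOME to fall back to the JDK bundled with Logstash 7.10.0+
unset JAVA_HOME
bin/logstash -f pipeline.conf
```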

Decode Avro messages from Kafka with the Schema Registry

With Apache Kafka® continuing to serve as an enterprise data hub in many organizations, we’ve kept our Logstash Kafka integrations in lockstep over time. We want to ensure your data is readily accessible with the Elastic Stack, whether it lives in Kafka or anywhere else. The Logstash 7.10.0 release continues down this path, introducing the ability to easily consume and decode Avro messages from Kafka with the new Confluent Schema Registry integration. Because users may centrally manage Avro schemas with the Schema Registry, the Logstash Kafka input plugin now provides several new options for fetching the schemas required to decode and process Avro messages from Kafka topics. It also offers a proxy option in case a proxy sits between Logstash and the Schema Registry. Check out the plugin documentation for additional details about this feature.

input {
  kafka {
    # Fetch Avro schemas for decoding from the Confluent Schema Registry
    schema_registry_url => "http://myschema.foo.com"
    # Credentials and proxy settings are resolved from the Logstash
    # keystore or from environment variables
    schema_registry_api_key => "${SCHEMA_REGISTRY_API_KEY}"
    schema_registry_api_secret => "${SCHEMA_REGISTRY_API_SECRET}"
    schema_registry_proxy => "${SCHEMA_REGISTRY_PROXY}"
  }
}
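The `${...}` placeholders in the configuration above are resolved from the Logstash keystore (or from environment variables). As a sketch, the credentials could be stored securely with the keystore tool before starting the pipeline:

```shell
# Create a keystore (once), then add the Schema Registry secrets;
# each `add` prompts for the value interactively
bin/logstash-keystore create
bin/logstash-keystore add SCHEMA_REGISTRY_API_KEY
bin/logstash-keystore add SCHEMA_REGISTRY_API_SECRET
bin/logstash-keystore add SCHEMA_REGISTRY_PROXY
```

Using the keystore keeps secrets out of the pipeline configuration file itself; plain environment variables of the same names would also work.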

To gain access to this new functionality, please update your Kafka integration plugin to the latest 10.6.0 version with the command below. This plugin version will be bundled and shipped by default with the next Logstash minor release. We encourage you to give it a spin and let us know if you have any feedback!

bin/logstash-plugin update logstash-integration-kafka

Try it yourself!

Please download Logstash 7.10.0, try it out, and let us know what you think on Twitter (@elastic) or in our forums and community Slack. You can report any bugs or feature requests on the Logstash GitHub issues page or within the respective plugin repositories.