Welcome back to The Logstash Lines! In these weekly posts, we'll share the latest happenings in the world of Logstash and its ecosystem.
Ever wanted to use Logstash Forwarder with Docker? Here is a two-part article series explaining how to do just that:
- Part 1: Creating the Docker image: http://boynux.com/logstash-forwarder-docker/
- Part 2: Using the Docker image: http://boynux.com/logstash-forwader-docker-part-2/
And from our very own Tal Levy, here is part one of a three-part blog series on Kafka and Logstash integration: https://www.elastic.co/blog/logstash-kafka-intro
Logstash Core and Plugins
- We created a new http input plugin that allows external data sources to push data to Logstash over HTTP. For example, a continuous integration tool such as Jenkins could push build failure notifications over HTTP to Logstash via this plugin. Logstash could not only parse and index these notifications into Elasticsearch, but also, using conditional logic, escalate critical build failures to developers through the PagerDuty output plugin. This new plugin makes integrating Logstash with Watcher's webhook action a breeze!
- We continue to iterate on supporting offline plugin installation. Look for this feature to be included in Logstash version 1.6.
- Recently we started work on the http_poller input plugin that polls a specified HTTP web service and injects its response into the Logstash pipeline. This plugin could be used, for instance, to index data from services that expose metrics or metadata about an underlying system. We evaluated community contributions for this plugin and are currently in the process of adding security features, tests, and documentation.
- We decided to rewrite the http output plugin with Manticore, a JRuby-only HTTP client library. We already use Manticore in the Elasticsearch output plugin, so we have decided to standardize on it.
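To make the http input scenario above concrete, here is a minimal pipeline sketch. The port, field name (`build_status`), and PagerDuty service key are hypothetical placeholders, not values from the plugins' documentation:

```conf
input {
  http {
    # Listen for POSTed events, e.g. Jenkins build notifications
    # (port 8080 is an arbitrary choice for this sketch)
    port => 8080
  }
}

output {
  # Index every notification into Elasticsearch
  elasticsearch {
    hosts => ["localhost:9200"]
  }
  # Escalate only critical failures; [build_status] is a
  # hypothetical field assumed to be present in the POSTed payload
  if [build_status] == "FAILURE" {
    pagerduty {
      service_key => "YOUR_PAGERDUTY_SERVICE_KEY"
      description => "Jenkins build failed"
    }
  }
}
```

The conditional wraps the pagerduty block so that routine build events are indexed quietly, while only failures page a developer.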
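And here is a sketch of what polling a metrics endpoint with the in-progress http_poller input might look like. The URL and the exact option names are assumptions for illustration; the plugin's final settings may differ since it is still being finished:

```conf
input {
  http_poller {
    # Hypothetical JSON endpoint exposing metrics about a system
    urls => {
      cluster_health => "http://localhost:9200/_cluster/health"
    }
    # Poll once a minute and decode the JSON response
    interval => 60
    codec => "json"
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
  }
}
```

Each poll's response is injected into the pipeline as an event, so the usual filters and outputs apply to it like any other data source.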
That's this week in The Logstash Lines. Come back next week for more Logstash news!