Logging

The Elastic Stack (also known as the ELK Stack) is the most popular open source platform for log management. Here's why.

Experience effortless log management with the Elastic Stack. Try it now

Easy to get started

The Elastic Stack comes with out-of-the-box dashboards for common data sources, turning the usual setup grind on its head. Ship logs with Filebeat and Winlogbeat, index them into Elasticsearch, and visualize it all in Kibana within minutes. Skip ahead and install now. (If the module you need doesn't exist yet, you can build your own or turn to the community. That's the beauty of open source.)

With Elasticsearch at the core of the Elastic Stack, you can expect fast responses even in large environments. Questions get answered quickly. No spinning hourglass, no sitting around waiting for a dashboard to render.

[Illustration: side-by-side search comparison — a generic tool still shows "0 matched | 0 scanned" and 0 unique requests while Elastic already returns hits and unique requests.]

A single laptop gives you the same experience as hundreds of nodes holding petabytes of data. The re-architecting headaches are gone.

And don't worry about parking valuable data somewhere out of sight. Ingest and index everything that matters and everything you need to make decisions.

Try it out

Get the latest version of the Elastic Stack. Shipping data and visualizing your logs takes less time than reheating your lunch.
In Elasticsearch install directory:
Once Elasticsearch starts, in Elasticsearch install directory (separate window):

Note the password for elastic user as <es_pw>

Note the password for kibana user as <kibana_pw>

In Kibana install directory:

Modify config/kibana.yml to set credentials for Elasticsearch

elasticsearch.username: "kibana"
elasticsearch.password: "<kibana_pw>"
            
In Filebeat install directory:

Modify filebeat.yml to set credentials for Elasticsearch output

output.elasticsearch:
  username: "elastic"
  password: "<es_pw>"
            
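The copy-ready commands from the original page are not included in the steps above. As a rough sketch only — assuming a 6.x-era self-managed stack with security enabled and the Filebeat system module, which matches the audit and syslog content described below — the sequence might look like this:

# Start Elasticsearch (from the Elasticsearch install directory)
bin/elasticsearch

# In a separate window: generate passwords for the built-in users
# (the exact script name varies by version)
bin/elasticsearch-setup-passwords auto

# Start Kibana (from the Kibana install directory, after editing config/kibana.yml)
bin/kibana

# From the Filebeat install directory: enable the system module,
# load the index template and dashboards, then start shipping
./filebeat modules enable system
./filebeat setup
./filebeat -e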
What just happened?

Filebeat created an index pattern in Kibana with defined fields, searches, visualizations, and dashboards. In a matter of minutes you can start viewing audit event types, accounts, and commands.

Didn't work for you?

Filebeat module assumes default log locations, unmodified file formats, and supported versions of the products generating the logs. See the documentation for more details.

In Elasticsearch install directory:
Once Elasticsearch starts, in Elasticsearch install directory (separate window):

Note the password for elastic user as <es_pw>

Note the password for kibana user as <kibana_pw>

In Kibana install directory:

Modify config/kibana.yml to set credentials for Elasticsearch

elasticsearch.username: "kibana"
elasticsearch.password: "<kibana_pw>"
            
In Filebeat install directory:

Modify filebeat.yml to set credentials for Elasticsearch output

output.elasticsearch:
  username: "elastic"
  password: "<es_pw>"
            
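Again only as a hedged sketch (the page's copyable commands are omitted above): with Elasticsearch and Kibana already running as in the previous flow, the remaining steps presumably enable the Apache module in Filebeat:

# From the Filebeat install directory: enable the Apache module
# (named apache2 in older Filebeat releases), load its dashboards, and run
./filebeat modules enable apache2
./filebeat setup
./filebeat -e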
What just happened?

Filebeat created an index pattern in Kibana with defined fields, searches, visualizations, and dashboards. In a matter of minutes you can start exploring common URLs, response codes, and user agent stats.

Didn't work for you?

Filebeat module assumes default log locations, unmodified file formats, and supported versions of the products generating the logs. See the documentation for more details.

In Elasticsearch install directory:
Once Elasticsearch starts, in Elasticsearch install directory (separate window):

Note the password for elastic user as <es_pw>

Note the password for kibana user as <kibana_pw>

In Kibana install directory:

Modify config/kibana.yml to set credentials for Elasticsearch

elasticsearch.username: "kibana"
elasticsearch.password: "<kibana_pw>"
            
In Filebeat install directory:

Modify filebeat.yml to set credentials for Elasticsearch output

output.elasticsearch:
  username: "elastic"
  password: "<es_pw>"
            
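As before, a hedged guess at the commands this flow copies — the MySQL module, judging by the dashboard referenced below:

# From the Filebeat install directory: enable the MySQL module,
# load its dashboards, and start Filebeat
./filebeat modules enable mysql
./filebeat setup
./filebeat -e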
Open browser @
http://localhost:5601 (login: elastic/<es_pw>)
Open dashboard:
"[Filebeat MySQL] Overview"
What just happened?

Filebeat created an index pattern in Kibana with defined fields, searches, visualizations, and dashboards. In a matter of minutes you can start exploring database queries, error messages, and events over time.

Didn't work for you?

Filebeat module assumes default log locations, unmodified file formats, and supported versions of the products generating the logs. See the documentation for supported versions and configuration options.

In Elasticsearch install directory:
Once Elasticsearch starts, in Elasticsearch install directory (separate window):

Note the password for elastic user as <es_pw>

Note the password for kibana user as <kibana_pw>

In Kibana install directory:

Modify config/kibana.yml to set credentials for Elasticsearch

elasticsearch.username: "kibana"
elasticsearch.password: "<kibana_pw>"
            
In Filebeat install directory inside a Docker container:

Modify filebeat.yml to send logs enhanced with Docker metadata to Elastic

filebeat.prospectors:
- type: log
  paths:
    - '/var/lib/docker/containers/*/*.log'
  json.message_key: log
  json.keys_under_root: true
  processors:
    - add_docker_metadata: ~
output.elasticsearch:
  hosts: ["<elasticsearch_url>:9200"]
  username: "elastic"
  password: "<es_pw>"
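The run commands are not shown on this page; assuming a standard Filebeat binary, something like the following would load the Kibana assets and start shipping the Docker JSON logs configured above:

# From the Filebeat install directory
./filebeat setup   # load the index template and Kibana index pattern
./filebeat -e      # run in the foreground, logging to stderr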
Open browser @
http://localhost:5601 (login: elastic/<es_pw>)
Go to Discover to search logs for your application or service running in Docker
What just happened?

Filebeat created an index pattern in Kibana with defined fields for logs residing in the default directory where Docker puts logs from your applications (/var/lib/docker/containers/*/*.log), and enhanced them with Docker container metadata. You can now look at logs from Docker in one central place in Kibana.

Didn't work for you?

Filebeat Docker metadata processor can be tuned further for your use case. See the documentation for more information.

In Elasticsearch install directory:
Once Elasticsearch starts, in Elasticsearch install directory (separate window):

Note the password for elastic user as <es_pw>

Note the password for kibana user as <kibana_pw>

In Kibana install directory:

Modify config/kibana.yml to set credentials for Elasticsearch

elasticsearch.username: "kibana"
elasticsearch.password: "<kibana_pw>"
            
In Filebeat install directory:

Modify filebeat.yml to set credentials for Elasticsearch output

output.elasticsearch:
  username: "elastic"
  password: "<es_pw>"
            
From your machine or wherever you run kubectl:
  • Download filebeat-kubernetes.yml
  • Edit filebeat-kubernetes.yml to point to your Elasticsearch instance with credentials
env:
  - name: ELASTICSEARCH_USERNAME
    value: elastic
  - name: ELASTICSEARCH_PASSWORD
    value: "<es_pw>"
            
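The kubectl commands themselves are not reproduced here. A typical deployment of the edited manifest — assuming the standard filebeat-kubernetes.yml DaemonSet, which usually lands in the kube-system namespace — might be:

# Deploy Filebeat as a DaemonSet on the cluster
kubectl create -f filebeat-kubernetes.yml

# Check that the Filebeat pods are running (namespace and labels depend on the manifest)
kubectl get pods -n kube-system -l k8s-app=filebeat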
Open browser @
http://localhost:5601 (login: elastic/<es_pw>)
Go to Discover to search your logs
What just happened?

Filebeat created an index pattern in Kibana with defined fields, searches, visualizations, and dashboards. In a matter of minutes you can start exploring your logs from your app and services running in Kubernetes.

Didn't work for you?

Filebeat module assumes default log locations, unmodified file formats, and supported versions of the products generating the logs. See the documentation for supported versions and configuration options.

Download the Elasticsearch MSI installer, the Kibana .zip file, and the Winlogbeat .zip file.

Run through the Elasticsearch MSI installer (leave X-Pack checked)

In Elasticsearch install directory:

Note the password for elastic user as <es_pw>

Note the password for kibana user as <kibana_pw>

Extract the contents of Kibana zip file, and in that directory:

Modify kibana.yml to set credentials for Elasticsearch

elasticsearch.username: "kibana"
elasticsearch.password: "<kibana_pw>"
            
Extract the contents of Winlogbeat zip file, and in that directory:

Modify winlogbeat.yml to set credentials for Elasticsearch output

output.elasticsearch:
  username: "elastic"
  password: "<es_pw>"
            
Open browser @
http://localhost:5601 (login: elastic/<es_pw>)
Open dashboard:
"Winlogbeat Dashboard"
What just happened?

Winlogbeat created an index pattern in Kibana with defined fields, searches, visualizations, and dashboards. In a matter of minutes you can start exploring Windows event log information.

Didn't work for you?

Winlogbeat module assumes default settings for Windows event logging. See the documentation for supported versions and configuration options.

In Elasticsearch install directory:
Once Elasticsearch starts, in Elasticsearch install directory (separate window):

Note the password for elastic user as <es_pw>

Note the password for kibana user as <kibana_pw>

In Kibana install directory:

Modify config/kibana.yml to set credentials for Elasticsearch

elasticsearch.username: "kibana"
elasticsearch.password: "<kibana_pw>"
            
In Filebeat install directory:

Modify filebeat.yml to set credentials for Elasticsearch output

output.elasticsearch:
  username: "elastic"
  password: "<es_pw>"
            
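The copyable commands are again omitted above; assuming the stock filebeat.yml with its default /var/log paths (or your own application paths added under the log input), the final steps are likely just:

# From the Filebeat install directory: load the Kibana assets, then tail the
# paths configured in filebeat.yml
./filebeat setup
./filebeat -e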
Open browser @
http://<kibana_url>:5601 (login: elastic/<es_pw>)
Go to Discover to search your app logs
What just happened?

Filebeat created an index pattern in Kibana with defined fields for logs residing in the default path directory (/var/log). You can change the path in the filebeat.yml config file. You can now look at logs in one central place in Kibana.

Didn’t work for you?

See documentation for how to configure Filebeat to look at other files and directories.

  • Register, if you do not already have an account
  • Log into the Elastic Cloud console
To create a cluster, in Elastic Cloud console:
  • Select Create Cluster, leave size slider at 4 GB RAM, and click Create
  • Note the Cloud ID as <cloud.id>
  • Note the cluster Password as <password>
  • In Overview >> Endpoints section note Kibana URL as <kibana_url>
  • Wait until cluster plan completes

Download and unpack Filebeat

In Filebeat install directory:

Modify filebeat.yml to set credentials for Elasticsearch output

output.elasticsearch:
  username: "elastic"
  password: "<password>"
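For Elastic Cloud, Filebeat can also be pointed at the cluster from the command line using the Cloud ID noted earlier. A hedged sketch only — the same pattern applies to the other Elastic Cloud flows below, swapping the module name:

# From the Filebeat install directory: enable the system module, then load
# dashboards and run, passing the Elastic Cloud settings as overrides
./filebeat modules enable system
./filebeat setup -E cloud.id="<cloud.id>" -E cloud.auth="elastic:<password>"
./filebeat -e -E cloud.id="<cloud.id>" -E cloud.auth="elastic:<password>"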
				
Open browser @
http://<kibana_url>:5601 (login: elastic/<password>)
Open dashboard:
"[Filebeat System] Syslog dashboard"
What just happened?

Filebeat created an index pattern in Kibana with defined fields, searches, visualizations, and dashboards. In a matter of minutes you can start viewing audit event types, accounts, and commands.

Didn't work for you?

Filebeat module assumes default log locations, unmodified file formats, and supported versions of the products generating the logs. See the documentation for more details.

  • Register, if you do not already have an account
  • Log into the Elastic Cloud console
To create a cluster, in Elastic Cloud console:
  • Select Create Cluster, leave size slider at 4 GB RAM, and click Create
  • Note the Cloud ID as <cloud.id>
  • Note the cluster Password as <password>
  • In Overview >> Endpoints section note Kibana URL as <kibana_url>
  • Wait until cluster plan completes

Download and unpack Filebeat

In Filebeat install directory:

Modify filebeat.yml to set credentials for Elasticsearch output

output.elasticsearch:
  username: "elastic"
  password: "<password>"
            
Open browser @
http://<kibana_url>:5601 (login: elastic/<password>)
Open dashboard:
"[Filebeat Apache2] Access and error logs"
What just happened?

Filebeat created an index pattern in Kibana with defined fields, searches, visualizations, and dashboards. In a matter of minutes you can start exploring common URLs, response codes, and user agent stats.

Didn't work for you?

Filebeat module assumes default log locations, unmodified file formats, and supported versions of the products generating the logs. See the documentation for more details.

  • Register, if you do not already have an account
  • Log into the Elastic Cloud console
To create a cluster, in Elastic Cloud console:
  • Select Create Cluster, leave size slider at 4 GB RAM, and click Create
  • Note the Cloud ID as <cloud.id>
  • Note the cluster Password as <password>
  • In Overview >> Endpoints section note Kibana URL as <kibana_url>
  • Wait until cluster plan completes

Download and unpack Filebeat

In Filebeat install directory:

Modify filebeat.yml to set credentials for Elasticsearch output

output.elasticsearch:
  username: "elastic"
  password: "<password>"
                
Open browser @
http://<kibana_url>:5601 (login: elastic/<password>)
Open dashboard:
"[Filebeat MySQL] Overview"
What just happened?

Filebeat created an index pattern in Kibana with defined fields, searches, visualizations, and dashboards. In a matter of minutes you can start exploring database queries, error messages, and events over time.

Didn't work for you?

Filebeat module assumes default log locations, unmodified file formats, and supported versions of the products generating the logs. See the documentation for supported versions and configuration options.

  • Register, if you do not already have an account
  • Log into the Elastic Cloud console
To create a cluster, in Elastic Cloud console:
  • Select Create Cluster, leave size slider at 4 GB RAM, and click Create
  • Note the Cloud ID as <cloud.id>
  • Note the cluster Password as <password>
  • In Overview >> Endpoints section note Kibana URL as <kibana_url>
  • Wait until cluster plan completes

Download and unpack Filebeat

In Filebeat install directory:

Modify filebeat.yml to send logs enhanced with Docker metadata to Elastic

filebeat.prospectors:
- type: log
  paths:
    - '/var/lib/docker/containers/*/*.log'
  json.message_key: log
  json.keys_under_root: true
  processors:
    - add_docker_metadata: ~
output.elasticsearch:
  hosts: ["<elasticsearch_url>:9200"]
  username: "elastic"
  password: "<password>"
Open browser @
http://<kibana_url>:5601 (login: elastic/<password>)
Go to Discover to search logs for your application or service running in Docker
What just happened?

Filebeat created an index pattern in Kibana with defined fields for logs residing in the default directory where Docker puts logs from your applications (/var/lib/docker/containers/*/*.log), and enhanced them with Docker container metadata. You can now look at logs from Docker in one central place in Kibana.

Didn't work for you?

Filebeat Docker metadata processor can be tuned further for your use case. See the documentation for more information.

  • Register, if you do not already have an account
  • Log into the Elastic Cloud console
To create a cluster, in Elastic Cloud console:
  • Select Create Cluster, leave size slider at 4 GB RAM, and click Create
  • Note the Cloud ID as <cloud.id>
  • Note the cluster Password as <password>
  • In Overview >> Endpoints section note Kibana URL as <kibana_url>
  • Wait until cluster plan completes

Download and unpack Filebeat

In Filebeat install directory:

Modify filebeat.yml to set credentials for Elasticsearch output

output.elasticsearch:
  username: "elastic"
  password: "<password>"
                

From your machine or wherever you run kubectl:
  • Download filebeat-kubernetes.yml
  • Edit filebeat-kubernetes.yml to point to your Elasticsearch instance with credentials

env:
  - name: ELASTICSEARCH_USERNAME
    value: elastic
  - name: ELASTICSEARCH_PASSWORD
    value: "<password>"
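As in the self-managed Kubernetes flow above, deploying the edited manifest is likely a single kubectl command:

# Deploy Filebeat as a DaemonSet on the cluster
kubectl create -f filebeat-kubernetes.yml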
                
Open browser @
http://<kibana_url>:5601 (login: elastic/<password>)
Go to Discover to search your logs
What just happened?

Filebeat created an index pattern in Kibana with defined fields, searches, visualizations, and dashboards. In a matter of minutes you can start exploring your logs from your app and services running in Kubernetes.

Didn't work for you?

Filebeat module assumes default log locations, unmodified file formats, and supported versions of the products generating the logs. See the documentation for more details.

  • Register, if you do not already have an account
  • Log into the Elastic Cloud console
To create a cluster, in Elastic Cloud console:
  • Select Create Cluster, leave size slider at 4 GB RAM, and click Create
  • Note the Cloud ID as <cloud.id>
  • Note the cluster Password as <password>
  • In Overview >> Endpoints section note Kibana URL as <kibana_url>
  • Wait until cluster plan completes

Download Winlogbeat .zip file.

Extract the contents of Winlogbeat zip file, and in that directory:

Modify winlogbeat.yml to set credentials for Elasticsearch output

output.elasticsearch:
  username: "elastic"
  password: "<password>"
                
Open browser @
http://<kibana_url>:5601 (login: elastic/<password>)
Open dashboard:
"Winlogbeat Dashboard"
What just happened?

Winlogbeat created an index pattern in Kibana with defined fields, searches, visualizations, and dashboards. In a matter of minutes you can start exploring Windows event log information.

Didn't work for you?

Winlogbeat module assumes default settings for Windows event logging. See the documentation for supported versions and configuration options.

  • Register, if you do not already have an account
  • Log into the Elastic Cloud console
To create a cluster, in Elastic Cloud console:
  • Select Create Cluster, leave size slider at 4 GB RAM, and click Create
  • Note the Cloud ID as <cloud.id>
  • Note the cluster Password as <password>
  • In Overview >> Endpoints section note Kibana URL as <kibana_url>
  • Wait until cluster plan completes

Download and unpack Filebeat

In Filebeat install directory:

Modify filebeat.yml to set credentials for Elasticsearch output

output.elasticsearch:
  username: "elastic"
  password: "<password>"
                
Open browser @
http://<kibana_url>:5601 (login: elastic/<password>)
Go to Discover to search your app logs
What just happened?

Filebeat created an index pattern in Kibana with defined fields for logs residing in the default path directory (/var/log). You can change the path in the filebeat.yml config file. You can now look at logs in one central place in Kibana.

Didn't work for you?

See documentation for how to configure Filebeat to look at other files and directories.

You don't have to read every log message and transaction. Focus on the ones that are important or deserve attention.

That's where machine learning comes in: it automatically models the behavior of the data stored in Elasticsearch and reports problems in real time.

At Sprint, a major telecommunications company, system administrators used to comb through logs by hand, running shell scripts and grep. Today they use Elastic to troubleshoot performance issues faster, improve customer satisfaction, simplify business-to-business relationships, and streamline their retail systems.

Sprint isn't the only one using Elastic for log management. Check out our other user stories as well.

Beyond log management

Combine metrics, infrastructure logs, and mountains of documents in the Elastic Stack to broaden your analysis, lower operating costs, and simplify your architecture.

Metrics

Collect and analyze a wide range of data, from CPU to memory.

Learn more

Swiftype Site Search

Dramatically improve search on your website.

Learn more

Security analytics

Real-time anomaly detection, fast and at scale.

Learn more

APM

Collect and analyze performance metrics.

Learn more

App Search

Search documents, geo data, and more.

Learn more