Logging

Open source log management

The Elastic Stack (formerly known as the ELK Stack) is the most popular open source logging platform. Here's a look at why.

Experience Elastic's log analytics capabilities for yourself. Try it out

The Azure monitoring module gives you an even more complete logging solution. Learn more

Easy to get started

The Elastic Stack comes with out-of-the-box support for a variety of common data sources and default dashboards you can simply turn on. Ship logs with Filebeat and Winlogbeat, index them into Elasticsearch, and visualize it all in Kibana within minutes. Try it for yourself. (Don't see the module you need? Build your own, or lean on the community. Open source for the win.)

See everything in real time

With Elasticsearch at its core, the Elastic Stack responds fast, even at scale. Ask a question and get an answer quickly. Ask, answer, repeat. No more waiting... for dashboards... to slowly load...


It scales with your needs. Add a few documents, or a few billion.

Whether you have one machine or hundreds of machines holding petabytes of data, the experience is the same. And you can skip the hassle of re-architecting.

Don't worry about missing valuable data. Ingest and index the data that matters to you and the decisions you need to make, with confidence.

Try it for yourself

Go from a fresh install to shipping and visualizing logs in a matter of minutes.
  • Register if you do not already have an account. A free 14-day trial is available.
  • Log into the Elastic Cloud console
To create a cluster, in the Elastic Cloud console:
  • Select Create Deployment, and specify the Deployment Name
  • Modify the other deployment options as needed (or not; the defaults are great to get started)
  • Click Create Deployment
  • Save the Cloud ID and the cluster password for your records; we will refer to these as <cloud.id> and <password> below
  • Wait until deployment creation completes

Download and unpack Filebeat

Open a terminal (varies depending on your client OS) and, in the Filebeat install directory, type:

Paste in the <password> for the elastic user when prompted

Paste in the <cloud.id> for the cluster when prompted
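A minimal sketch of this step, assuming the Cloud ID and password are stored in the Filebeat keystore (the key names CLOUD_ID and CLOUD_AUTH are illustrative and are referenced from filebeat.yml):

./filebeat keystore create
./filebeat keystore add CLOUD_ID     # paste <cloud.id> when prompted
./filebeat keystore add CLOUD_AUTH   # paste elastic:<password> when prompted

# In filebeat.yml, reference the keystore entries:
#   cloud.id: "${CLOUD_ID}"
#   cloud.auth: "${CLOUD_AUTH}"

# Enable the system module, load the bundled dashboards, and start shipping.
./filebeat modules enable system
./filebeat setup
./filebeat -e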

Open Kibana from the Kibana section of the Elastic Cloud console (login: elastic/<password>)
Open dashboard:
"[Filebeat System] Syslog dashboard"
What just happened?
Filebeat created an index pattern in Kibana with defined fields, searches, visualizations, and dashboards. In a matter of minutes you can start viewing audit event types, accounts, and commands.
Didn't work for you?

The Filebeat module assumes default log locations, unmodified file formats, and supported versions of the products generating the logs. See the documentation for more details.

  • Register if you do not already have an account. A free 14-day trial is available.
  • Log into the Elastic Cloud console
To create a cluster, in the Elastic Cloud console:
  • Select Create Deployment, and specify the Deployment Name
  • Modify the other deployment options as needed (or not; the defaults are great to get started)
  • Click Create Deployment
  • Save the Cloud ID and the cluster password for your records; we will refer to these as <cloud.id> and <password> below
  • Wait until deployment creation completes

Download and unpack Filebeat

Open a terminal (varies depending on your client OS) and, in the Filebeat install directory, type:

Paste in the <password> for the elastic user when prompted

Paste in the <cloud.id> for the cluster when prompted
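A minimal sketch of this step, assuming the Cloud ID and password go into the Filebeat keystore under illustrative key names:

# filebeat.yml references the keystore entries as
#   cloud.id: "${CLOUD_ID}"  and  cloud.auth: "${CLOUD_AUTH}"
./filebeat keystore create
./filebeat keystore add CLOUD_ID     # paste <cloud.id> when prompted
./filebeat keystore add CLOUD_AUTH   # paste elastic:<password> when prompted
./filebeat modules enable apache2    # Apache access and error logs
./filebeat setup                     # load the index template and dashboards
./filebeat -e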

Open Kibana from the Kibana section of the Elastic Cloud console (login: elastic/<password>)
Open dashboard:
"[Filebeat Apache2] Access and error logs"
What just happened?

Filebeat created an index pattern in Kibana with defined fields, searches, visualizations, and dashboards. In a matter of minutes you can start exploring common URLs, response codes, and user agent stats.

Didn't work for you?

The Filebeat module assumes default log locations, unmodified file formats, and supported versions of the products generating the logs. See the documentation for more details.

  • Register if you do not already have an account. A free 14-day trial is available.
  • Log into the Elastic Cloud console
To create a cluster, in the Elastic Cloud console:
  • Select Create Deployment, and specify the Deployment Name
  • Modify the other deployment options as needed (or not; the defaults are great to get started)
  • Click Create Deployment
  • Save the Cloud ID and the cluster password for your records; we will refer to these as <cloud.id> and <password> below
  • Wait until deployment creation completes

Download and unpack Filebeat

Open a terminal (varies depending on your client OS) and, in the Filebeat install directory, type:

Paste in the <password> for the elastic user when prompted

Paste in the <cloud.id> for the cluster when prompted
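A minimal sketch of this step, assuming the Cloud ID and password go into the Filebeat keystore under illustrative key names:

# filebeat.yml references the keystore entries as
#   cloud.id: "${CLOUD_ID}"  and  cloud.auth: "${CLOUD_AUTH}"
./filebeat keystore create
./filebeat keystore add CLOUD_ID     # paste <cloud.id> when prompted
./filebeat keystore add CLOUD_AUTH   # paste elastic:<password> when prompted
./filebeat modules enable mysql      # MySQL error and slow logs
./filebeat setup                     # load the index template and dashboards
./filebeat -e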

Open Kibana from the Kibana section of the Elastic Cloud console (login: elastic/<password>)
Open dashboard:
"[Filebeat MySQL] Overview"
What just happened?
Filebeat created an index pattern in Kibana with defined fields, searches, visualizations, and dashboards. In a matter of minutes you can start exploring database queries, error messages, and events over time.
Didn't work for you?

The Filebeat module assumes default log locations, unmodified file formats, and supported versions of the products generating the logs. See the documentation for supported versions and configuration options.

  • Register if you do not already have an account. A free 14-day trial is available.
  • Log into the Elastic Cloud console
To create a cluster, in the Elastic Cloud console:
  • Select Create Deployment, and specify the Deployment Name
  • Modify the other deployment options as needed (or not; the defaults are great to get started)
  • Click Create Deployment
  • Save the Cloud ID and the cluster password for your records; we will refer to these as <cloud.id> and <password> below
  • Wait until deployment creation completes

Download and unpack Filebeat

Open a terminal (varies depending on your client OS) and, in the Filebeat install directory on your Docker host, type:

As a user that has read access to /var/lib/docker/containers (usually root), modify filebeat.yml to send logs enhanced with Docker metadata to Elastic:

filebeat.inputs:
# Read container logs written by the Docker json-file logging driver.
- type: docker
  containers.ids:
    - '*'            # all containers
  processors:
    # Enrich each event with container metadata (name, image, labels).
    - add_docker_metadata: ~
				

As a user that has read access to /var/lib/docker/containers (usually root), run:

Paste in the <password> for the elastic user when prompted

Paste in the <cloud.id> for the cluster when prompted
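A minimal sketch of this step, assuming the docker input above is already in filebeat.yml and the Cloud ID and password go into the Filebeat keystore under illustrative key names:

# Run these as a user that can read /var/lib/docker/containers (usually root).
# filebeat.yml references the keystore entries as
#   cloud.id: "${CLOUD_ID}"  and  cloud.auth: "${CLOUD_AUTH}"
./filebeat keystore create
./filebeat keystore add CLOUD_ID     # paste <cloud.id> when prompted
./filebeat keystore add CLOUD_AUTH   # paste elastic:<password> when prompted
./filebeat setup                     # load the index template and dashboards
./filebeat -e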

Open Kibana from the Kibana section of the Elastic Cloud console (login: elastic/<password>)
Go to Discover to search logs for your application or service running in Docker
What just happened?

Filebeat created an index pattern in Kibana with defined fields for logs residing in the default directory where Docker puts logs from your applications (/var/lib/docker/containers/*/*.log), and enhanced them with Docker container metadata. You can now look at logs from Docker in one central place in Kibana.

Didn't work for you?

The Filebeat Docker metadata processor can be tuned further for your use case. See the documentation for more information.

  • Register if you do not already have an account. A free 14-day trial is available.
  • Log into the Elastic Cloud console
To create a cluster, in the Elastic Cloud console:
  • Select Create Deployment, and specify the Deployment Name
  • Modify the other deployment options as needed (or not; the defaults are great to get started)
  • Click Create Deployment
  • Save the Cloud ID and the cluster password for your records; we will refer to these as <cloud.id> and <password> below
  • Wait until deployment creation completes

Download and unpack Filebeat

Open a terminal (varies depending on your client OS) and, in the Filebeat install directory, type:

Paste in the <password> for the elastic user when prompted

Paste in the <cloud.id> for the cluster when prompted
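A minimal sketch of this step, assuming the local Filebeat install is used only to load the index template and dashboards into the deployment (keystore key names are illustrative and are referenced from filebeat.yml):

./filebeat keystore create
./filebeat keystore add CLOUD_ID     # paste <cloud.id> when prompted
./filebeat keystore add CLOUD_AUTH   # paste elastic:<password> when prompted
# filebeat.yml:  cloud.id: "${CLOUD_ID}"   cloud.auth: "${CLOUD_AUTH}"
./filebeat setup                     # load the index template and dashboards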

From your machine or wherever you run kubectl:

env:
  - name: ELASTIC_CLOUD_ID
    value: <cloud.id>
  - name: ELASTIC_CLOUD_AUTH
    value: <cloud.auth>   # elastic:<password>
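To deploy Filebeat to the cluster itself, a rough sketch (the manifest URL, version, namespace, and labels are assumptions based on the stock manifest; see the Filebeat documentation for the current filebeat-kubernetes.yml):

curl -L -O https://raw.githubusercontent.com/elastic/beats/<version>/deploy/kubernetes/filebeat-kubernetes.yml
# Set ELASTIC_CLOUD_ID and ELASTIC_CLOUD_AUTH in the DaemonSet env as shown above, then:
kubectl apply -f filebeat-kubernetes.yml
kubectl --namespace=kube-system get pods -l k8s-app=filebeat   # confirm the pods are running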
				
Open Kibana from the Kibana section of the Elastic Cloud console (login: elastic/<password>)
Go to Discover to search your logs
What just happened?
Filebeat created an index pattern in Kibana with defined fields, searches, visualizations, and dashboards. In a matter of minutes you can start exploring logs from your apps and services running in Kubernetes.
Didn't work for you?

The Filebeat module assumes default log locations, unmodified file formats, and supported versions of the products generating the logs. See the documentation for more details.

  • Register if you do not already have an account. A free 14-day trial is available.
  • Log into the Elastic Cloud console
To create a cluster, in the Elastic Cloud console:
  • Select Create Deployment, and specify the Deployment Name
  • Modify the other deployment options as needed (or not; the defaults are great to get started)
  • Click Create Deployment
  • Save the Cloud ID and the cluster password for your records; we will refer to these as <cloud.id> and <password> below
  • Wait until deployment creation completes

Download the Winlogbeat .zip file.

Extract the contents of the Winlogbeat zip file and, in that directory:

Paste in the <password> for the elastic user when prompted

Paste in the <cloud.id> for the cluster when prompted
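A minimal sketch of this step in PowerShell (run as Administrator), assuming the Cloud ID and password go into the Winlogbeat keystore under illustrative key names referenced from winlogbeat.yml:

# winlogbeat.yml references the keystore entries as
#   cloud.id: "${CLOUD_ID}"  and  cloud.auth: "${CLOUD_AUTH}"
.\winlogbeat.exe keystore create
.\winlogbeat.exe keystore add CLOUD_ID     # paste <cloud.id> when prompted
.\winlogbeat.exe keystore add CLOUD_AUTH   # paste elastic:<password> when prompted
.\install-service-winlogbeat.ps1           # install Winlogbeat as a Windows service
.\winlogbeat.exe setup                     # load the index template and dashboards
Start-Service winlogbeat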

Open Kibana from the Kibana section of the Elastic Cloud console (login: elastic/<password>)
Open dashboard:
"Winlogbeat Dashboard"
What just happened?

Winlogbeat created an index pattern in Kibana with defined fields, searches, visualizations, and dashboards. In a matter of minutes you can start exploring Windows event log information.

Didn't work for you?

The Winlogbeat module assumes default settings for Windows event logging. See the documentation for supported versions and configuration options.

  • Register if you do not already have an account. A free 14-day trial is available.
  • Log into the Elastic Cloud console
To create a cluster, in the Elastic Cloud console:
  • Select Create Deployment, and specify the Deployment Name
  • Modify the other deployment options as needed (or not; the defaults are great to get started)
  • Click Create Deployment
  • Save the Cloud ID and the cluster password for your records; we will refer to these as <cloud.id> and <password> below
  • Wait until deployment creation completes

Download and unpack Filebeat

Open a terminal (varies depending on your client OS) and, in the Filebeat install directory, type:

Edit your filebeat.yml to point to the logs you wish to collect.

Paste in the <password> for the elastic user when prompted

Paste in the <cloud.id> for the cluster when prompted
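A minimal sketch of this step, assuming a plain log input in filebeat.yml and illustrative keystore key names:

# filebeat.yml, for example:
#   filebeat.inputs:
#   - type: log
#     paths:
#       - /var/log/*.log
#   cloud.id: "${CLOUD_ID}"
#   cloud.auth: "${CLOUD_AUTH}"
./filebeat keystore create
./filebeat keystore add CLOUD_ID     # paste <cloud.id> when prompted
./filebeat keystore add CLOUD_AUTH   # paste elastic:<password> when prompted
./filebeat setup
./filebeat -e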

Open Kibana from the Kibana section of the Elastic Cloud console (login: elastic/<password>)
Go to Discover to search your app logs
What just happened?

Filebeat created an index pattern in Kibana with defined fields for logs residing in the default path (/var/log). You can change the path in the filebeat.yml config file. You can now look at logs in one central place in Kibana.

Didn't work for you?

See the documentation for how to configure Filebeat to look at other files and directories.

In the Elasticsearch install directory, start Elasticsearch.
In the Kibana install directory, start Kibana.
In the Filebeat install directory, enable the system module, load the dashboards, and start Filebeat (a sketch of these commands follows below).
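A minimal sketch of these commands, assuming default ports (Elasticsearch on 9200, Kibana on 5601) and each process left running in its own terminal:

# In the Elasticsearch install directory:
./bin/elasticsearch
# In the Kibana install directory:
./bin/kibana
# In the Filebeat install directory:
./filebeat modules enable system
./filebeat setup      # load the index template and dashboards
./filebeat -e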
What just happened?

Filebeat created an index pattern in Kibana with defined fields, searches, visualizations, and dashboards. In a matter of minutes you can start viewing audit event types, accounts, and commands.

Didn't work for you?

The Filebeat module assumes default log locations, unmodified file formats, and supported versions of the products generating the logs. See the documentation for more details.

In the Elasticsearch install directory, start Elasticsearch.
In the Kibana install directory, start Kibana.
In the Filebeat install directory, enable the apache2 module, load the dashboards, and start Filebeat (a sketch of these commands follows below).
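A minimal sketch of these commands, assuming default local ports and each process left running in its own terminal:

# In the Elasticsearch install directory:
./bin/elasticsearch
# In the Kibana install directory:
./bin/kibana
# In the Filebeat install directory:
./filebeat modules enable apache2
./filebeat setup      # load the index template and dashboards
./filebeat -e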
What just happened?

Filebeat created an index pattern in Kibana with defined fields, searches, visualizations, and dashboards. In a matter of minutes you can start exploring common URLs, response codes, and user agent stats.

Didn't work for you?

The Filebeat module assumes default log locations, unmodified file formats, and supported versions of the products generating the logs. See the documentation for more details.

In the Elasticsearch install directory, start Elasticsearch.
In the Kibana install directory, start Kibana.
In the Filebeat install directory, enable the mysql module, load the dashboards, and start Filebeat (a sketch of these commands follows below).
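A minimal sketch of these commands, assuming default local ports and each process left running in its own terminal:

# In the Elasticsearch install directory:
./bin/elasticsearch
# In the Kibana install directory:
./bin/kibana
# In the Filebeat install directory:
./filebeat modules enable mysql
./filebeat setup      # load the index template and dashboards
./filebeat -e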
What just happened?

Filebeat created an index pattern in Kibana with defined fields, searches, visualizations, and dashboards. In a matter of minutes you can start exploring database queries, error messages, and events over time.

Didn't work for you?

The Filebeat module assumes default log locations, unmodified file formats, and supported versions of the products generating the logs. See the documentation for supported versions and configuration options.

In the Elasticsearch install directory, start Elasticsearch; in the Kibana install directory, start Kibana. Then complete the remaining steps in the Filebeat install directory on your Docker host (a sketch of all of these commands follows below).

As a user that has read access to /var/lib/docker/containers (usually root), modify filebeat.yml to send logs enhanced with Docker metadata to Elastic:

filebeat.inputs:
# Read container logs written by the Docker json-file logging driver.
- type: docker
  containers.ids:
    - '*'            # all containers
  processors:
    # Enrich each event with container metadata (name, image, labels).
    - add_docker_metadata: ~
			

As a user that has read access to /var/lib/docker/containers (usually root), run Filebeat as sketched below.
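A rough sketch of all of the commands in this step, assuming default local ports and the docker input above in filebeat.yml:

# In the Elasticsearch install directory:
./bin/elasticsearch
# In the Kibana install directory:
./bin/kibana
# In the Filebeat install directory, as a user that can read
# /var/lib/docker/containers (usually root):
./filebeat setup      # load the index template and dashboards
./filebeat -e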
Open your browser to Kibana at http://localhost:5601 (the default).
Go to Discover to search logs for your application or service running in Docker
What just happened?

Filebeat created an index pattern in Kibana with defined fields for logs residing in the default directory where Docker puts logs from your applications (/var/lib/docker/containers/*/*.log), and enhanced them with Docker container metadata. You can now look at logs from Docker in one central place in Kibana.

Didn't work for you?

The Filebeat Docker metadata processor can be tuned further for your use case. See the documentation for more information.

In the Elasticsearch install directory, start Elasticsearch; in the Kibana install directory, start Kibana; then run Filebeat setup from the Filebeat install directory to load the index template and dashboards (a sketch of all of these commands follows the list below).
From your machine or wherever you run kubectl:
  • Download filebeat-kubernetes.yml
  • Edit filebeat-kubernetes.yml and specify the host for your Elasticsearch server (if you are connecting back to your host from Kubernetes running locally, set ELASTICSEARCH_HOST to host.docker.internal):
  - name: ELASTICSEARCH_HOST
    value: host.docker.internal
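A rough sketch of these steps, assuming default local ports; the manifest URL, version, namespace, and labels are assumptions based on the stock manifest (see the Filebeat documentation for the current filebeat-kubernetes.yml):

# In the Elasticsearch install directory:
./bin/elasticsearch
# In the Kibana install directory:
./bin/kibana
# In the Filebeat install directory (loads the index template and dashboards):
./filebeat setup
# From wherever you run kubectl:
curl -L -O https://raw.githubusercontent.com/elastic/beats/<version>/deploy/kubernetes/filebeat-kubernetes.yml
# Edit the DaemonSet env so ELASTICSEARCH_HOST points at your Elasticsearch
# (host.docker.internal when connecting back to a locally running host), then:
kubectl apply -f filebeat-kubernetes.yml
kubectl --namespace=kube-system get pods -l k8s-app=filebeat   # confirm the pods are running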
Open your browser to Kibana at http://localhost:5601 (the default).
Go to Discover to search your logs
What just happened?

Filebeat created an index pattern in Kibana with defined fields, searches, visualizations, and dashboards. In a matter of minutes you can start exploring logs from your apps and services running in Kubernetes.

Didn't work for you?

The Filebeat module assumes default log locations, unmodified file formats, and supported versions of the products generating the logs. See the documentation for supported versions and configuration options.

Download the Elasticsearch MSI installer, the Kibana .zip file, and the Winlogbeat .zip file.

Run through the Elasticsearch MSI installer

Extract the contents of the Kibana zip file and start Kibana from that directory.
Extract the contents of the Winlogbeat zip file and, in that directory, install and start Winlogbeat (a sketch of these commands follows below).
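A minimal sketch of these steps in PowerShell (run as Administrator), assuming Elasticsearch is already running as a service from the MSI install and the defaults (localhost:9200 and localhost:5601) are in use:

# In the Kibana directory:
.\bin\kibana.bat
# In the Winlogbeat directory:
.\install-service-winlogbeat.ps1           # install Winlogbeat as a Windows service
.\winlogbeat.exe setup                     # load the index template and dashboards
Start-Service winlogbeat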
Open your browser to Kibana at http://localhost:5601 (the default).
Open dashboard:
"Winlogbeat Dashboard"
What just happened?
Winlogbeat created an index pattern in Kibana with defined fields, searches, visualizations, and dashboards. In a matter of minutes you can start exploring Windows event log information.
Didn't work for you?

The Winlogbeat module assumes default settings for Windows event logging. See the documentation for supported versions and configuration options.

In the Elasticsearch install directory, start Elasticsearch; in the Kibana install directory, start Kibana. Then, in the Filebeat install directory:

Edit your filebeat.yml file to point to the logs you wish to collect, then set up and start Filebeat (a sketch of all of these commands follows below).
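A minimal sketch of these commands, assuming default local ports and a plain log input in filebeat.yml:

# In the Elasticsearch install directory:
./bin/elasticsearch
# In the Kibana install directory:
./bin/kibana
# In the Filebeat install directory, with filebeat.yml pointing a log input at
# your logs (for example, paths: ["/var/log/*.log"]):
./filebeat setup
./filebeat -e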
Open your browser to Kibana at http://localhost:5601 (the default).
Go to Discover to search your app logs
What just happened?

Filebeat created an index pattern in Kibana with defined fields for logs residing in the default path (/var/log). You can change the path in the filebeat.yml config file. You can now look at logs in one central place in Kibana.

Didn’t work for you?

See the documentation for how to configure Filebeat to look at other files and directories.

Add machine learning for automated anomaly detection

You shouldn't have to look at every log message or every transaction, only the ones that are important or noteworthy.

Elastic machine learning features extend the Elastic Stack by automatically modeling the behavior of your Elasticsearch data and alerting you to issues in real time.

Don't just take our word for it

System administrators at telecom giant Sprint used to comb through logs, run shell scripts, and hunt for data they already knew about. Now they use Elastic to quickly resolve performance issues, improve customer satisfaction, simplify B2B relationships, and streamline retail systems.

They're just one of the many companies using Elastic for log management. Explore more customer stories

Do more than log management

Have metrics? Proxy or firewall logs? Documents full of text? Centralize them all in the Elastic Stack to enrich your analyses, lower operational costs, and simplify your architecture.

Metrics analytics

Numbers and stats: CPU, memory, and more.

Learn more

Site search

Easily create a great search experience for your site.

Learn more

Security analytics

Interactive investigation, fast and at scale.

Learn more

APM

Get insight into application performance.

Learn more

App search

Search documents, geo data, and more.

Learn more