Logging

Open Source Log Management

The Elastic Stack (better known as the ELK Stack) is the world's most popular open source logging platform. Here's why.


A great place to get started

The Elastic Stack delivers an out-of-the-box experience, with mapping definitions and automatically generated dashboards for common data sources.
With Filebeat and Winlogbeat, it takes only minutes to collect log files, index them into Elasticsearch, and visualize them in Kibana. (If you don't see the module you need, build your own or tap into the power of the community. That's open source for the win.)

Skip ahead and dive right in

Watch it all unfold in real time

Elasticsearch, the heart of the Elastic Stack, delivers fast response times even as you keep scaling. Get quick answers to your queries. Lather, rinse, repeat. Stop spending... so much... time... waiting... for dashboards to render...


Scale as far as you need. Search anywhere from dozens of files to billions.

Get the same experience on hundreds of petabytes that you get on a single laptop. Free yourself from recurring architecture headaches, and don't worry about leaving important data on the floor. Just start collecting and indexing whatever matters most to you.

Get started

Install the latest version and start collecting and visualizing logs faster than you can microwave a steamed bun.
In Elasticsearch install directory:
In Kibana install directory:
In Filebeat install directory:
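The copy-ready commands did not survive this capture. As a rough sketch, the standard quickstart for the system module looks like the following (assuming the tar.gz/zip distributions on Linux or macOS; adjust paths for your platform):

```shell
# In the Elasticsearch install directory: start Elasticsearch
bin/elasticsearch

# In the Kibana install directory: start Kibana
bin/kibana

# In the Filebeat install directory: enable the system module,
# load the index template and Kibana dashboards, then start shipping
./filebeat modules enable system
./filebeat setup
./filebeat -e
```

Run `./filebeat setup` only after Elasticsearch and Kibana are up, since it loads the dashboards into Kibana.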
What just happened?

Filebeat created an index pattern in Kibana with defined fields, searches, visualizations, and dashboards. In a matter of minutes you can start viewing audit event types, accounts, and commands.

Didn't work for you?

The Filebeat module assumes default log locations, unmodified file formats, and supported versions of the products generating the logs. See the documentation for more details.

In Elasticsearch install directory:
In Kibana install directory:
In Filebeat install directory:
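Again, the commands themselves were lost in capture. A sketch based on the standard Apache quickstart follows; the extra Elasticsearch-side steps were most likely the ingest plugin installs that the 6.x Apache dashboards required (GeoIP and user-agent parsing ship bundled from 7.x on), and the module is named apache2 in 6.x and apache from 7.x onward:

```shell
# In the Elasticsearch install directory (plugins needed only on 6.x)
bin/elasticsearch-plugin install ingest-geoip
bin/elasticsearch-plugin install ingest-user-agent
bin/elasticsearch

# In the Kibana install directory
bin/kibana

# In the Filebeat install directory (use "apache" on 7.x and later)
./filebeat modules enable apache2
./filebeat setup
./filebeat -e
```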
What just happened?

Filebeat created an index pattern in Kibana with defined fields, searches, visualizations, and dashboards. In a matter of minutes you can start exploring common URLs, response codes, and user agent stats.

Didn't work for you?

The Filebeat module assumes default log locations, unmodified file formats, and supported versions of the products generating the logs. See the documentation for more details.

In Elasticsearch install directory:
In Kibana install directory:
In Filebeat install directory:
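A sketch of the elided commands, following the same quickstart pattern with the mysql module:

```shell
# Elasticsearch install directory
bin/elasticsearch

# Kibana install directory
bin/kibana

# Filebeat install directory
./filebeat modules enable mysql
./filebeat setup
./filebeat -e
```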
What just happened?

Filebeat created an index pattern in Kibana with defined fields, searches, visualizations, and dashboards. In a matter of minutes you can start exploring database queries, error messages, and events over time.

Didn't work for you?

The Filebeat module assumes default log locations, unmodified file formats, and supported versions of the products generating the logs. See the documentation for supported versions and configuration options.

In Elasticsearch install directory:
In Kibana install directory:
In Filebeat install directory on your Docker Host:

As a user that has read access to /var/lib/docker/containers (usually root) modify filebeat.yml to send logs enhanced with Docker metadata to Elastic

filebeat.inputs:
- type: docker
  containers.ids:
    - '*'
  processors:
  - add_docker_metadata: ~

As a user that has read access to /var/lib/docker/containers (usually root) run:

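The run command itself was elided; it is typically the following (run as root, because Filebeat must read the container log files):

```shell
# Load the dashboards, then run Filebeat in the foreground
sudo ./filebeat setup
sudo ./filebeat -e
```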
Open browser @ http://localhost:5601
Go to Discover to search logs for your application or service running in Docker
What just happened?

Filebeat created an index pattern in Kibana with defined fields for logs residing in the default directory where Docker puts logs from your applications (/var/lib/docker/containers/*/*.log), and enhanced them with Docker container metadata. You can now look at logs from Docker in one central place in Kibana.

Didn't work for you?

The Filebeat Docker metadata processor can be tuned further for your use case. See the documentation for more information.

In Elasticsearch install directory:
In Kibana install directory:
In Filebeat install directory:
From your machine or wherever you run kubectl:
  • Download filebeat-kubernetes.yml
  • Edit filebeat-kubernetes.yml and specify the host for your Elasticsearch server (if you are connecting back to your host from Kubernetes running locally, set ELASTICSEARCH_HOST to host.docker.internal):
  - name: ELASTICSEARCH_HOST
    value: host.docker.internal
            
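The elided kubectl step is, per the standard Filebeat-on-Kubernetes flow, most likely:

```shell
# Deploy the Filebeat DaemonSet into the cluster
kubectl create -f filebeat-kubernetes.yml
```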
Open browser @ http://localhost:5601
Go to Discover to search your logs
What just happened?

Filebeat created an index pattern in Kibana with defined fields, searches, visualizations, and dashboards. In a matter of minutes you can start exploring your logs from your app and services running in Kubernetes.

Didn't work for you?

The Filebeat module assumes default log locations, unmodified file formats, and supported versions of the products generating the logs. See the documentation for supported versions and configuration options.

Download the Elasticsearch MSI installer, the Kibana .zip file, and the Winlogbeat .zip file.

Run through the Elasticsearch MSI installer

Extract the contents of the Kibana zip file, and in that directory:
Extract the contents of the Winlogbeat zip file, and in that directory:
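The elided PowerShell commands presumably resembled the following sketch (running Winlogbeat in the foreground rather than installing it as a service):

```shell
# In the Kibana directory (PowerShell)
.\bin\kibana.bat

# In the Winlogbeat directory: load dashboards, then run
.\winlogbeat.exe setup
.\winlogbeat.exe -e
```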
Open browser @ http://localhost:5601
Open dashboard:
"Winlogbeat Dashboard"
What just happened?
Winlogbeat created an index pattern in Kibana with defined fields, searches, visualizations, and dashboards. In a matter of minutes you can start exploring Windows event log information.
Didn't work for you?

The Winlogbeat module assumes default settings for Windows event logging. See the documentation for supported versions and configuration options.

In Elasticsearch install directory:
In Kibana install directory:
In Filebeat install directory:

Edit your filebeat.yml file to point to the logs you wish to collect.

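As a sketch, a minimal filebeat.yml input plus the elided run step might look like this (the path is illustrative; point it at your own logs):

```shell
# filebeat.yml (excerpt) -- point Filebeat at your log files:
#   filebeat.inputs:
#   - type: log
#     paths:
#       - /var/log/*.log

# Then, in the Filebeat install directory:
./filebeat setup
./filebeat -e
```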
Open browser @ http://localhost:5601
Go to Discover to search your app logs
What just happened?

Filebeat created an index pattern in Kibana with defined fields for logs residing in the default path directory (/var/log). You can change the path in the filebeat.yml config file. You can now look at logs in one central place in Kibana.

Didn’t work for you?

See the documentation for how to configure Filebeat to look at other files and directories.

  • Register, if you do not already have an account
  • Log into the Elastic Cloud console
To create a cluster, in Elastic Cloud console:
  • Select Create Cluster, leave cluster size slider at 4 GB RAM, and click Create
  • Note the Cloud ID as <cloud.id> and the cluster Password as <password>
  • In Overview >> Endpoints section note Kibana URL as <kibana_url>
  • Wait until cluster plan completes

Download and unpack Filebeat

In Filebeat install directory:
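The elided commands most likely mirrored the self-managed steps, pointing Filebeat at your cluster via its Cloud ID; the -E cloud.id / cloud.auth overrides shown here are the documented way to do that (the original flow prompted for the password rather than inlining it):

```shell
# Enable the system module, load dashboards into your cloud cluster,
# then start shipping logs
./filebeat modules enable system
./filebeat setup -E cloud.id="<cloud.id>" -E cloud.auth="elastic:<password>"
./filebeat -e -E cloud.id="<cloud.id>" -E cloud.auth="elastic:<password>"
```

The other Elastic Cloud tabs follow the same pattern with their respective modules (apache2, mysql, and so on).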

Paste in the <password> for the elastic user when prompted

Open browser @
<kibana_url> (login: elastic/<password>)
Open dashboard:
"[Filebeat System] Syslog dashboard"
What just happened?
Filebeat created an index pattern in Kibana with defined fields, searches, visualizations, and dashboards. In a matter of minutes you can start viewing audit event types, accounts, and commands.
Didn't work for you?

The Filebeat module assumes default log locations, unmodified file formats, and supported versions of the products generating the logs. See the documentation for more details.

  • Register, if you do not already have an account
  • Log into the Elastic Cloud console
To create a cluster, in Elastic Cloud console:
  • Select Create Cluster, leave cluster size slider at 4 GB RAM, and click Create
  • Note the Cloud ID as <cloud.id> and the cluster Password as <password>
  • In Overview >> Endpoints section note Kibana URL as <kibana_url>
  • Wait until cluster plan completes

Download and unpack Filebeat

In Filebeat install directory:

Paste in the <password> for the elastic user when prompted

Open browser @
<kibana_url> (login: elastic/<password>)
Open dashboard:
"[Filebeat Apache2] Access and error logs"
What just happened?

Filebeat created an index pattern in Kibana with defined fields, searches, visualizations, and dashboards. In a matter of minutes you can start exploring common URLs, response codes, and user agent stats.

Didn't work for you?

The Filebeat module assumes default log locations, unmodified file formats, and supported versions of the products generating the logs. See the documentation for more details.

  • Register, if you do not already have an account
  • Log into the Elastic Cloud console
To create a cluster, in Elastic Cloud console:
  • Select Create Cluster, leave cluster size slider at 4 GB RAM, and click Create
  • Note the Cloud ID as <cloud.id> and the cluster Password as <password>
  • In Overview >> Endpoints section note Kibana URL as <kibana_url>
  • Wait until cluster plan completes

Download and unpack Filebeat

In Filebeat install directory:

Paste in the <password> for the elastic user when prompted

Open browser @
<kibana_url> (login: elastic/<password>)
Open dashboard:
"[Filebeat MySQL] Overview"
What just happened?
Filebeat created an index pattern in Kibana with defined fields, searches, visualizations, and dashboards. In a matter of minutes you can start exploring database queries, error messages, and events over time.
Didn't work for you?

The Filebeat module assumes default log locations, unmodified file formats, and supported versions of the products generating the logs. See the documentation for supported versions and configuration options.

  • Register, if you do not already have an account
  • Log into the Elastic Cloud console
To create a cluster, in Elastic Cloud console:
  • Select Create Cluster, leave cluster size slider at 4 GB RAM, and click Create
  • Note the Cloud ID as <cloud.id> and the cluster Password as <password>
  • In Overview >> Endpoints section note Kibana URL as <kibana_url>
  • Wait until cluster plan completes

Download and unpack Filebeat

In Filebeat install directory inside a Docker container:

As a user that has read access to /var/lib/docker/containers (usually root) modify filebeat.yml to send logs enhanced with Docker metadata to Elastic

filebeat.inputs:
- type: docker
  containers.ids:
    - '*'
  processors:
  - add_docker_metadata: ~

As a user that has read access to /var/lib/docker/containers (usually root) run:

Paste in the <password> for the elastic user when prompted

Open browser @
<kibana_url> (login: elastic/<password>)
Go to Discover to search logs for your application or service running in Docker
What just happened?

Filebeat created an index pattern in Kibana with defined fields for logs residing in the default directory where Docker puts logs from your applications (/var/lib/docker/containers/*/*.log), and enhanced them with Docker container metadata. You can now look at logs from Docker in one central place in Kibana.

Didn't work for you?

The Filebeat Docker metadata processor can be tuned further for your use case. See the documentation for more information.

  • Register, if you do not already have an account
  • Log into the Elastic Cloud console
To create a cluster, in Elastic Cloud console:
  • Select Create Cluster, leave cluster size slider at 4 GB RAM, and click Create
  • Note the Cloud ID as <cloud.id> and the cluster Password as <password>
  • In Overview >> Endpoints section note Kibana URL as <kibana_url>
  • Wait until cluster plan completes

Download and unpack Filebeat

In Filebeat install directory:

Paste in the <password> for the elastic user when prompted

From your machine or wherever you run kubectl:

env:
  - name: ELASTIC_CLOUD_ID
    value: <cloud.id>
  - name: ELASTIC_CLOUD_AUTH
    value: <cloud.auth>
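With the manifest's env section edited as above, the elided deploy step is most likely:

```shell
# Deploy the Filebeat DaemonSet into the cluster
kubectl create -f filebeat-kubernetes.yml
```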
Open browser @
<kibana_url> (login: elastic/<password>)
Go to Discover to search your logs
What just happened?
Filebeat created an index pattern in Kibana with defined fields, searches, visualizations, and dashboards. In a matter of minutes you can start exploring your logs from your app and services running in Kubernetes.
Didn't work for you?

The Filebeat module assumes default log locations, unmodified file formats, and supported versions of the products generating the logs. See the documentation for more details.

  • Register, if you do not already have an account
  • Log into the Elastic Cloud console
To create a cluster, in Elastic Cloud console:
  • Select Create Cluster, leave cluster size slider at 4 GB RAM, and click Create
  • Note the Cloud ID as <cloud.id> and the cluster Password as <password>
  • In Overview >> Endpoints section note Kibana URL as <kibana_url>
  • Wait until cluster plan completes

Download the Winlogbeat .zip file.

Extract the contents of the Winlogbeat zip file, and in that directory:

Paste in the <password> for the elastic user when prompted
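The elided Winlogbeat commands presumably pointed at the cloud cluster the same way the Filebeat tabs do, via the documented -E cloud.id / cloud.auth overrides (the original flow prompted for the password):

```shell
.\winlogbeat.exe setup -E cloud.id="<cloud.id>" -E cloud.auth="elastic:<password>"
.\winlogbeat.exe -e -E cloud.id="<cloud.id>" -E cloud.auth="elastic:<password>"
```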

Open browser @
<kibana_url> (login: elastic/<password>)
Open dashboard:
"Winlogbeat Dashboard"
What just happened?

Winlogbeat created an index pattern in Kibana with defined fields, searches, visualizations, and dashboards. In a matter of minutes you can start exploring Windows event log information.

Didn't work for you?

The Winlogbeat module assumes default settings for Windows event logging. See the documentation for supported versions and configuration options.

  • Register, if you do not already have an account
  • Log into the Elastic Cloud console
To create a cluster, in Elastic Cloud console:
  • Select Create Cluster, leave cluster size slider at 4 GB RAM, and click Create
  • Note the Cloud ID as <cloud.id> and the cluster Password as <password>
  • In Overview >> Endpoints section note Kibana URL as <kibana_url>
  • Wait until cluster plan completes

Download and unpack Filebeat

In Filebeat install directory:

Edit your filebeat.yml to point to the logs you wish to collect.

Paste in the <password> for the elastic user when prompted

Open browser @
<kibana_url> (login: elastic/<password>)
Go to Discover to search your app logs
What just happened?

Filebeat created an index pattern in Kibana with defined fields for logs residing in the default path directory (/var/log). You can change the path in the filebeat.yml config file. You can now look at logs in one central place in Kibana.

Didn't work for you?

See the documentation for how to configure Filebeat to look at other files and directories.

Put machine learning to work detecting anomalies

You don't need to eyeball every log message and transaction. What you need is the one key message that matters.

Machine learning, an extension of the Elastic Stack, automatically models your Elasticsearch data and alerts you in real time when it spots data that looks like trouble.

Don't just take our word for it

System administrators at telecom heavyweight Sprint used to comb through log data with shell scripts and grep. Now they use Elastic to run their retail systems with agility: resolving performance issues, improving customer satisfaction, and reshaping B2B relationships.

Sprint isn't the only one putting Elastic to work on log data. Check out more user stories.

What lies beyond logs?

Have metrics? Proxy or firewall logs? Mountains of text documents to process? Consolidate them all on the Elastic Stack to broaden your analysis, streamline your workflows, and simplify your architecture.

Metrics

Analyze CPU, memory, and a wide range of other metrics.

Learn more

Site Search

Site search experiences that are easy to create and easy to add.

Learn more

Security Analytics

Fast, scalable, interactive investigation of your security data.

Learn more

APM

Gain insight into application performance.

Learn more

App Search

Search across documents, geodata, and more.

Learn more