Security Analytics


Threats don't follow a formula, and neither should you. The Elastic Stack gives you the tools to keep pace with the attack vectors of today and tomorrow. Here's how.

Experience security analytics with Elastic. Try It Out

New: A detailed look at how X-Pack security features can accelerate your path to GDPR compliance. Read More

First things first: it has to be fast

An attack is a matter of when, not if. Ask yourself how long you can afford to let an adversary dwell inside your systems.

Elastic is engineered for speed, indexing data as it is ingested. That brings time-to-information down to seconds and makes real-time queries and visualizations possible.

Don't drop your data. Bring it all together in one place.

The thing about threat detection is that threats can come from anywhere. That makes a real-time, end-to-end view of what is happening across your systems essential.

Elasticsearch can ingest petabytes of data of any kind, whether it comes from firewalls, proxies, detection systems, or anything else. Don't hold back.


Keep data online longer for your investigations

When did they get in? Where did they move? What did they do while they were inside? What did they compromise?

Looking back at just a week of data may not be enough to answer these questions; a typical threat can dwell for more than 100 days before it is remediated. Elastic doesn't just make searching that much historical data possible, it makes it practical, easy, and fast.

Build something new and grow it into your own SIEM

Start from a blank slate and grow your own security solution, as Slack did, or make the SIEM tooling you already rely on even more powerful, as USAA did. Elastic is flexible enough for either path. And if you don't see what you need, build it yourself or tap into the power of the community. That's open source for the win.

Get started

Start small and go big. It's all up to you. Install the latest release, explore, and see what you discover.
In Elasticsearch install directory:
Once Elasticsearch starts, in Elasticsearch install directory (separate window):

Note the password for elastic user as <es_pw>

Note the password for kibana user as <kibana_pw>
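The shell commands for these two Elasticsearch steps are not reproduced above. Assuming a 6.x archive install with X-Pack security enabled, they would typically be:

bin/elasticsearch                        # start Elasticsearch and leave it running
bin/elasticsearch-setup-passwords auto   # in the second window: prints generated passwords for the built-in users

The second command's output includes the elastic and kibana passwords referred to here as <es_pw> and <kibana_pw>.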

In Kibana install directory:

Modify config/kibana.yml to set credentials for Elasticsearch

elasticsearch.username: "kibana"
elasticsearch.password: "<kibana_pw>"
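The command that then starts Kibana is also missing from this page; from the Kibana install directory it is normally just:

bin/kibana   # serves the Kibana UI on http://localhost:5601 by default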
			
In Filebeat install directory:

Modify filebeat.yml to set credentials for Elasticsearch output

output.elasticsearch:
  username: "elastic"
  password: "<es_pw>"
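The commands that enable the module and start Filebeat are not shown. A plausible sequence for the system module on a 6.x tar.gz/zip install looks like this:

./filebeat modules enable system   # collect syslog and auth logs via the system module
./filebeat setup                   # load the index template, index pattern, and Kibana dashboards
./filebeat -e                      # start shipping; -e logs to stderr instead of a file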
What just happened?

Filebeat created an index pattern in Kibana with defined fields, searches, visualizations, and dashboards. In a matter of minutes you can start viewing the latest system log messages and reporting on SSH login attempts and other authentication events.

Didn't work for you?

The Filebeat module assumes default log locations, unmodified file formats, and supported versions of the products generating the logs. See the documentation for more details.

In Elasticsearch install directory:
Once Elasticsearch starts, in Elasticsearch install directory (separate window):

Note the password for elastic user as <es_pw>

Note the password for kibana user as <kibana_pw>

In Kibana install directory:

Modify config/kibana.yml to set credentials for Elasticsearch

elasticsearch.username: "kibana"
elasticsearch.password: "<kibana_pw>"
			
In Auditbeat install directory:

Modify auditbeat.yml to set credentials for Elasticsearch output

output.elasticsearch:
  username: "elastic"
  password: "<es_pw>"
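As with Filebeat, the setup and start commands are missing here. On a 6.x archive install they would usually be:

./auditbeat setup     # load the index template and Kibana dashboards
sudo ./auditbeat -e   # start collecting; the auditd and file_integrity modules generally need root on Linux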
Open browser @
http://localhost:5601 (login: elastic/<es_pw>)
What just happened?

Auditbeat created an index pattern in Kibana with defined fields, searches, visualizations, and dashboards. In a matter of minutes you can start viewing the latest system audit information.

Didn't work for you?

The Auditbeat module assumes a default operating system configuration. See the documentation for more details.

In Elasticsearch install directory:
Once Elasticsearch starts, in Elasticsearch install directory (separate window):

Note the password for elastic user as <es_pw>

Note the password for kibana user as <kibana_pw>

In Kibana install directory:

Modify config/kibana.yml to set credentials for Elasticsearch

elasticsearch.username: "kibana"
elasticsearch.password: "<kibana_pw>"
			
In Logstash install directory, run:

Modify logstash.yml to set NetFlow module details

modules:
- name: netflow
  var.input.udp.port: <netflow_port>
  var.elasticsearch.username: "elastic"
  var.elasticsearch.password: "<es_pw>"
			

Configure NetFlow to export flow events to Logstash via UDP on default port 2055.
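The Logstash invocation implied by "run:" above is not shown. With the module configured in logstash.yml as above, it would typically be:

bin/logstash --setup   # --setup loads the NetFlow index template and Kibana dashboards on the first run

On later runs, plain bin/logstash is enough.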

Open browser @
http://localhost:5601 (login: elastic/<es_pw>)
Open dashboard:
"Netflow: Overview"
What just happened?

Logstash created an index pattern in Kibana with defined fields, searches, visualizations, and dashboards. In a matter of minutes you can start viewing NetFlow events.

Didn't work for you?

The Logstash module makes a set of assumptions about the default configuration of the NetFlow solution; you can, however, override the defaults. See the documentation for more details.

In Elasticsearch install directory:
Once Elasticsearch starts, in Elasticsearch install directory (separate window):

Note the password for elastic user as <es_pw>

Note the password for kibana user as <kibana_pw>

In Kibana install directory:

Modify config/kibana.yml to set credentials for Elasticsearch

elasticsearch.username: "kibana"
elasticsearch.password: "<kibana_pw>"
            
In Packetbeat install directory:

Modify packetbeat.yml to set credentials for Elasticsearch output

output.elasticsearch:
  username: "elastic"
  password: "<es_pw>"
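The setup and start commands for Packetbeat are likewise omitted. A typical sequence on a 6.x archive install:

./packetbeat setup     # load the index template and Kibana dashboards
sudo ./packetbeat -e   # start sniffing; live packet capture usually requires root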
Open browser @
http://localhost:5601 (login: elastic/<es_pw>)
Open dashboard:
"[Packetbeat] DNS Tunneling"
What just happened?

Packetbeat created an index pattern in Kibana with defined fields, searches, visualizations, and dashboards. In a matter of minutes you can start viewing details of your DNS traffic.

Didn't work for you?

Packetbeat makes a set of assumptions around defaults, such as default network ports. See the documentation for more details on how to further configure your deployment.

In Elasticsearch install directory:
Once Elasticsearch starts, in Elasticsearch install directory (separate window):

Note the password for elastic user as <es_pw>

Note the password for kibana user as <kibana_pw>

In Kibana install directory:

Modify config/kibana.yml to set credentials for Elasticsearch

elasticsearch.username: "kibana"
elasticsearch.password: "<kibana_pw>"
			
In Logstash install directory, run:

Modify logstash.yml to set ArcSight module details

modules:
- name: arcsight
  var.inputs: smartconnector
  var.elasticsearch.username: "elastic"
  var.elasticsearch.password: "<es_pw>"
			

Configure Smart Connectors to send CEF events to Logstash via TCP on default port 5000.
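As with the NetFlow module, the actual Logstash command is missing; it takes the same form:

bin/logstash --setup   # loads the ArcSight index template and dashboards, then keeps listening on TCP 5000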

What just happened?

Logstash created an index pattern in Kibana with defined fields, searches, visualizations, and dashboards. In a matter of minutes you can start viewing ArcSight events.

Didn't work for you?

The Logstash module makes a set of assumptions about the default configuration of the ArcSight solution; you can, however, override the defaults. See the documentation for more details.

  • Register, if you do not already have an account
  • Log into the Elastic Cloud console
To create a cluster, in Elastic Cloud console:
  • Select Create Cluster, leave the size slider at 4 GB RAM, and click Create
  • Note the Cloud ID as <cloud.id>
  • Note the cluster Password as <password>
  • In Overview >> Endpoints section note Kibana URL as <kibana_url>
  • Wait until cluster plan completes

Download and unpack Filebeat

In Filebeat install directory:

Modify filebeat.yml to set credentials for Elasticsearch output

output.elasticsearch:
  username: "elastic"
  password: "<password>"
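Note that the snippet above never uses the <cloud.id> recorded earlier. The usual way to point a Beat at an Elastic Cloud cluster is to let it derive the Elasticsearch and Kibana endpoints from the Cloud ID, for example in filebeat.yml:

cloud.id: "<cloud.id>"              # resolves to the hosted Elasticsearch and Kibana URLs
cloud.auth: "elastic:<password>"    # can stand in for output.elasticsearch.username/password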
				
Open browser @
http://<kibana_url>:5601 (login: elastic/<password>)
Open dashboard:
"[Filebeat System] SSH login attempts" or "[Filebeat System] Sudo commands"
What just happened?

Filebeat created an index pattern in Kibana with defined fields, searches, visualizations, and dashboards. In a matter of minutes you can start viewing the latest system log messages and reporting on SSH login attempts and other authentication events.

Didn't work for you?

The Filebeat module assumes default log locations, unmodified file formats, and supported versions of the products generating the logs. See the documentation for more details.

  • Register, if you do not already have an account
  • Log into the Elastic Cloud console
To create a cluster, in Elastic Cloud console:
  • Select Create Cluster, leave the size slider at 4 GB RAM, and click Create
  • Note the Cloud ID as <cloud.id>
  • Note the cluster Password as <password>
  • In Overview >> Endpoints section note Kibana URL as <kibana_url>
  • Wait until cluster plan completes

Download and unpack Auditbeat

In Auditbeat install directory:

Modify auditbeat.yml to set credentials for Elasticsearch output

output.elasticsearch:
  username: "elastic"
  password: "<password>"
				
Open browser @
http://<kibana_url>:5601 (login: elastic/<password>)
Open dashboard:
"[Auditbeat File] File Integrity"
What just happened?

Auditbeat created an index pattern in Kibana with defined fields, searches, visualizations, and dashboards. In a matter of minutes you can start viewing the latest system audit information.

Didn't work for you?

The Auditbeat module assumes a default operating system configuration. See the documentation for more details.

  • Register, if you do not already have an account
  • Log into the Elastic Cloud console
To create a cluster, in Elastic Cloud console:
  • Select Create Cluster, leave the size slider at 4 GB RAM, and click Create
  • Note the Cloud ID as <cloud.id>
  • Note the cluster Password as <password>
  • In Overview >> Endpoints section note Kibana URL as <kibana_url>
  • Wait until cluster plan completes

Download and unpack Logstash

In Logstash install directory:

Modify logstash.yml to set NetFlow module details

modules:
- name: netflow
  var.input.udp.port: <netflow_port>
  var.elasticsearch.username: "elastic"
  var.elasticsearch.password: "<password>"
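Here, too, the module settings carry credentials but no endpoint for the hosted cluster. One option is to set var.elasticsearch.hosts and var.kibana.host explicitly; another is to add the Cloud ID and credentials to logstash.yml (cloud.id / cloud.auth), along these lines:

cloud.id: "<cloud.id>"              # hosted Elasticsearch and Kibana endpoints for the module
cloud.auth: "elastic:<password>"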
				

Configure NetFlow to export flow events to Logstash via UDP on default port 2055.

Open browser @
http://<kibana_url>:5601 (login: elastic/<password>)
Open dashboard:
"Netflow: Overview"
What just happened?

Logstash created an index pattern in Kibana with defined fields, searches, visualizations, and dashboards. In a matter of minutes you can start viewing NetFlow events.

Didn't work for you?

The Logstash module makes a set of assumptions about the default configuration of the NetFlow solution; you can, however, override the defaults. See the documentation for more details.

  • Register, if you do not already have an account
  • Log into the Elastic Cloud console
To create a cluster, in Elastic Cloud console:
  • Select Create Cluster, leave the size slider at 4 GB RAM, and click Create
  • Note the Cloud ID as <cloud.id>
  • Note the cluster Password as <password>
  • In Overview >> Endpoints section note Kibana URL as <kibana_url>
  • Wait until cluster plan completes

Download and unpack Packetbeat

In Packetbeat install directory:

Modify packetbeat.yml to set credentials for Elasticsearch output

output.elasticsearch:
  username: "elastic"
  password: "<password>"
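The DNS Tunneling dashboard only has data if Packetbeat is decoding DNS traffic. The default packetbeat.yml already contains a protocol section roughly like the one below; adjust it if your resolvers listen on non-standard ports:

packetbeat.interfaces.device: any   # capture on all interfaces (Linux); name a specific device on other platforms
packetbeat.protocols:
- type: dns
  ports: [53]                       # decode DNS seen on port 53
  include_authorities: true         # add authority resource records to the events
  include_additionals: true         # add additional resource records to the events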
				
Open browser @
http://<kibana_url>:5601 (login: elastic/<password>)
Open dashboard:
"[Packetbeat] DNS Tunneling"
What just happened?

Packetbeat created an index pattern in Kibana with defined fields, searches, visualizations, and dashboards. In a matter of minutes you can start viewing details of your DNS traffic.

Didn't work for you?

Packetbeat makes a set of assumptions around defaults, such as default network ports. See the documentation for more details on how to further configure your deployment.

  • Register, if you do not already have an account
  • Log into the Elastic Cloud console
To create a cluster, in Elastic Cloud console:
  • Select Create Cluster, leave the size slider at 4 GB RAM, and click Create
  • Note the Cloud ID as <cloud.id>
  • Note the cluster Password as <password>
  • In Overview >> Endpoints section note Kibana URL as <kibana_url>
  • Wait until cluster plan completes

Download and unpack Logstash

In Logstash install directory, run:

Modify logstash.yml to set ArcSight module details

modules:
- name: arcsight
  var.inputs: smartconnector
  var.elasticsearch.username: "elastic"
  var.elasticsearch.password: "<password>"
                

Configure Smart Connectors to send CEF events to Logstash via TCP on default port 5000.

Open browser @
http://<kibana_url>:5601 (login: elastic/<password>)
Open dashboard:
"[ArcSight] Network Overview Dashboard"
What just happened?

Logstash created an index pattern in Kibana with defined fields, searches, visualizations, and dashboards. In a matter of minutes you can start viewing ArcSight events.

Didn't work for you?

The Logstash module makes a set of assumptions about the default configuration of the ArcSight solution; you can, however, override the defaults. See the documentation for more details.

Automate anomaly detection and explore suspicious connections

How do you spot anomalies across billions of signals in real time? How do you surface meaning in the connections between millions of IP addresses? Add Elastic machine learning and Graph analytics to quickly detect the cyber threats you expect, and the ones you don't, amid all the noise.

You're in good company

USAA started with a few Elasticsearch nodes in its security lab and now runs a broad deployment that extends its ArcSight SIEM. With its previous log management system, USAA waited minutes (and sometimes hours) for query results during threat analysis. With Elastic, that wait is gone.

USAA isn't the only one using Elastic for security. Check out more stories.

Security analytics is about more than security log data

Got metrics? Infrastructure logs? Piles of text documents? Bring them all into the Elastic Stack to broaden your analysis, streamline your workflows, and simplify your architecture.

Logging

Fast, scalable, real-time logging that doesn't quit.

Learn More

Metrics

Analyze CPU, memory, and a wide range of other metrics.

Learn More

Site Search

Easily build and add a search experience to your site.

Learn More

APM

Gain deeper insight into application performance.

Learn More

App Search

Search across documents, geo data, and more.

Learn More