Security Analytics

Threats don't follow a set pattern, and neither should you. The Elastic Stack gives you the edge you need to keep up with today's and tomorrow's attack methods. Here's how it works.

Discover security analytics with Elastic. Try it out.

New: A closer look at how X-Pack features can help you meet the requirements of the GDPR. Learn more.

Speed of response is everything

With attacks, it's not a question of if, but when. So ask yourself how long you really want an attacker inside your systems.

Elastic is built for speed, indexing your data as it is ingested. That cuts time to detection down to seconds and makes it easy to run ad-hoc queries and build real-time visualizations.

Collect and analyze all of your data

The key to spotting a threat can be anywhere, which is why a complete, real-time picture of what is happening across all of your systems matters so much.

Elasticsearch comfortably handles petabytes of data from firewalls, web proxies, intrusion detection systems, and any other source. What are you waiting for?

[Animation: the same search side by side, "Other" (0 matched | 0 scanned, 0 Unique Request) vs. "Elastic" (hits, Unique Request)]

Keep data around longer so you can investigate it

When did they get in? Where did they go next? What did they do? What else is at risk?

A seven-day history is rarely enough to answer those questions. The average threat can lie dormant for 100 days before it is detected. With Elastic, searching long-term historical data is fast, practical, and simple.

Build something new or augment your SIEM

Start with a blank slate and build your own security solution, as Slack did, or extend an existing SIEM investment, as USAA did. Elastic is flexible. If you don't find exactly what you're looking for, build it yourself or turn to the community. You are not boxed in. That's open source in all its glory.

Try it now

Start small. Grow as you go. It's entirely up to you. Spin up a fresh installation and see for yourself what's possible.
  • Register if you do not already have an account (free 14-day trial available).
  • Log in to the Elastic Cloud console.
To create a cluster in the Elastic Cloud console:
  • Select Create Deployment and specify the Deployment Name.
  • Modify the other deployment options as needed (or leave them; the defaults are great for getting started).
  • Click Create Deployment.
  • Save the Cloud ID and the cluster password for your records; we will refer to these as <cloud.id> and <password> below.
  • Wait until deployment creation completes.

Download and unpack Filebeat

Open a terminal (varies depending on your client OS) and, in the Filebeat install directory, type:
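A minimal sketch of what these commands typically look like for Filebeat with Elastic Cloud (exact flags can vary by Filebeat version; <cloud.id> is your saved Cloud ID and ES_PWD is the keystore key referenced below):

    # store the password for the elastic user securely; the add command prompts for the value
    ./filebeat keystore create
    ./filebeat keystore add ES_PWD

    # enable the system module, load index templates and dashboards, then start shipping logs
    ./filebeat modules enable system
    ./filebeat setup -e -E 'cloud.id=<cloud.id>' -E 'cloud.auth=elastic:${ES_PWD}'
    ./filebeat -e -E 'cloud.id=<cloud.id>' -E 'cloud.auth=elastic:${ES_PWD}'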

Paste in the <password> for the elastic user when prompted

Open Kibana from the Kibana section of the Elastic Cloud console (login: elastic/<password>).
Open dashboard:
"[Filebeat System] SSH login attempts" or "[Filebeat System] Sudo commands"
What just happened?
Filebeat created an index pattern in Kibana with defined fields, searches, visualizations, and dashboards. In a matter of minutes you can start viewing the latest system log messages and reporting on SSH login attempts and other authentication events.
Didn't work for you?

The Filebeat module assumes default log locations, unmodified file formats, and supported versions of the products generating the logs. See the documentation for more details.

  • Register if you do not already have an account (free 14-day trial available).
  • Log in to the Elastic Cloud console.
To create a cluster in the Elastic Cloud console:
  • Select Create Deployment and specify the Deployment Name.
  • Modify the other deployment options as needed (or leave them; the defaults are great for getting started).
  • Click Create Deployment.
  • Save the Cloud ID and the cluster password for your records; we will refer to these as <cloud.id> and <password> below.
  • Wait until deployment creation completes.

Download and unpack Auditbeat

Open a terminal (varies depending on your client OS) and, in the Auditbeat install directory, type:
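A minimal sketch of what these commands typically look like for Auditbeat with Elastic Cloud (exact flags can vary by version; the default auditbeat.yml already enables the file_integrity module):

    # store the password for the elastic user securely; the add command prompts for the value
    ./auditbeat keystore create
    ./auditbeat keystore add ES_PWD

    # load index templates and dashboards, then start auditing (elevated privileges may be required)
    ./auditbeat setup -e -E 'cloud.id=<cloud.id>' -E 'cloud.auth=elastic:${ES_PWD}'
    ./auditbeat -e -E 'cloud.id=<cloud.id>' -E 'cloud.auth=elastic:${ES_PWD}'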

Paste in the <password> for the elastic user when prompted

Open Kibana from the Kibana section of the Elastic Cloud console (login: elastic/<password>).
Open dashboard:
"[Auditbeat File] File Integrity"
What just happened?
Auditbeat created an index pattern in Kibana with defined fields, searches, visualizations, and dashboards. In a matter of minutes you can start viewing the latest system audit information.
Didn't work for you?

The Auditbeat module assumes a default operating system configuration. See the documentation for more details.

  • Register if you do not already have an account (free 14-day trial available).
  • Log in to the Elastic Cloud console.
To create a cluster in the Elastic Cloud console:
  • Select Create Deployment and specify the Deployment Name.
  • Modify the other deployment options as needed (or leave them; the defaults are great for getting started).
  • Click Create Deployment.
  • Save the Cloud ID and the cluster password for your records; we will refer to these as <cloud.id> and <password> below.
  • Wait until deployment creation completes.

Download and unpack Logstash

Open a terminal (varies depending on your client OS) and change to the Logstash install directory.

Modify logstash.yml to set the ArcSight module details:

modules:
  - name: arcsight
    var.inputs: smartconnector
    var.elasticsearch.username: "elastic"
    var.elasticsearch.password: "<password>"

Configure Smart Connectors to send CEF events to Logstash via TCP on default port 5000.
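Once logstash.yml is updated, start Logstash so the module can create its index pattern and load its dashboards. A minimal sketch, assuming current Logstash module flags:

    # --setup loads the module's index pattern and dashboards on first run
    bin/logstash --setup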

Open Kibana from the Kibana section of the Elastic Cloud console (login: elastic/<password>).
Open dashboard:
"[ArcSight] Network Overview Dashboard"
What just happened?

Logstash created an index pattern in Kibana with defined fields, searches, visualizations, and dashboards. In a matter of minutes you can start viewing ArcSight events.

Didn't work for you?

The Logstash module makes a set of assumptions about the default configuration of the ArcSight solution; however, you can override the defaults. See the documentation for more details.

  • Register if you do not already have an account (free 14-day trial available).
  • Log in to the Elastic Cloud console.
To create a cluster in the Elastic Cloud console:
  • Select Create Deployment and specify the Deployment Name.
  • Modify the other deployment options as needed (or leave them; the defaults are great for getting started).
  • Click Create Deployment.
  • Save the Cloud ID and the cluster password for your records; we will refer to these as <cloud.id> and <password> below.
  • Wait until deployment creation completes.

Download and unpack Packetbeat

Open a terminal (varies depending on your client OS) and, in the Packetbeat install directory, type:
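A minimal sketch of what these commands typically look like for Packetbeat with Elastic Cloud (exact flags can vary by version; the default packetbeat.yml already captures DNS traffic):

    # store the password for the elastic user securely; the add command prompts for the value
    ./packetbeat keystore create
    ./packetbeat keystore add ES_PWD

    # load index templates and dashboards, then start capturing (sniffing traffic usually needs elevated privileges)
    ./packetbeat setup -e -E 'cloud.id=<cloud.id>' -E 'cloud.auth=elastic:${ES_PWD}'
    ./packetbeat -e -E 'cloud.id=<cloud.id>' -E 'cloud.auth=elastic:${ES_PWD}'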

Paste in the <password> for the elastic user when prompted

Open Kibana from the Kibana section of the Elastic Cloud console (login: elastic/<password>).
Open dashboard:
"[Packetbeat] DNS Tunneling"
What just happened?

Packetbeat created an index pattern in Kibana with defined fields, searches, visualizations, and dashboards. In a matter of minutes you can start viewing details of your DNS traffic.

Didn't work for you?

Packetbeat makes a set of assumptions about defaults, such as default network ports. See the documentation for more details on how to further configure your deployment.

  • Register if you do not already have an account (free 14-day trial available).
  • Log in to the Elastic Cloud console.
To create a cluster in the Elastic Cloud console:
  • Select Create Deployment and specify the Deployment Name.
  • Modify the other deployment options as needed (or leave them; the defaults are great for getting started).
  • Click Create Deployment.
  • Save the Cloud ID and the cluster password for your records; we will refer to these as <cloud.id> and <password> below.
  • Wait until deployment creation completes.

Download and unpack Logstash

Open a terminal (varies depending on your client OS) and, in the Logstash install directory, type:
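A minimal sketch of storing the elastic password in the Logstash keystore so that ${ES_PWD} in logstash.yml below resolves to it (exact commands can vary by Logstash version):

    # create the keystore and add the password; the add command prompts for the value
    bin/logstash-keystore create
    bin/logstash-keystore add ES_PWD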

Paste in the <password> for the elastic user when prompted

Modify logstash.yml to set the Netflow module details:

cloud.id: <cloud.id>
cloud.auth: elastic:${ES_PWD}
modules:
  - name: netflow
    var.input.udp.port: <netflow_port>

Configure NetFlow to export flow events to Logstash via UDP on default port 2055.
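Once logstash.yml is updated and your devices are exporting flows, start Logstash so the module can create its index pattern and load its dashboards. A minimal sketch, assuming current Logstash module flags:

    # --setup loads the module's index pattern and dashboards on first run
    bin/logstash --setup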

Open Kibana from the Kibana section of the Elastic Cloud console (login: elastic/<password>).
Open dashboard:
"Netflow: Overview"
What just happened?

Logstash created an index pattern in Kibana with defined fields, searches, visualizations, and dashboards. In a matter of minutes you can start viewing Netflow events.

Didn't work for you?

The Logstash module makes a set of assumptions about the default configuration of the Netflow solution; however, you can override the defaults. See the documentation for more details.

In the Elasticsearch install directory: start Elasticsearch.
In the Kibana install directory: start Kibana.
In the Filebeat install directory: enable the system module, load the Filebeat assets, and start Filebeat (see the sketch below).
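A minimal sketch of the local flow, assuming a tar.gz/zip install and default ports (localhost:9200 for Elasticsearch, localhost:5601 for Kibana); run each component in its own terminal:

    # in the Elasticsearch install directory
    bin/elasticsearch

    # in the Kibana install directory
    bin/kibana

    # in the Filebeat install directory
    ./filebeat modules enable system
    ./filebeat setup -e
    ./filebeat -e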
What just happened?

Filebeat created an index pattern in Kibana with defined fields, searches, visualizations, and dashboards. In a matter of minutes you can start viewing the latest system log messages and reporting on SSH login attempts and other authentication events.

Didn't work for you?

The Filebeat module assumes default log locations, unmodified file formats, and supported versions of the products generating the logs. See the documentation for more details.

In the Elasticsearch install directory: start Elasticsearch.
In the Kibana install directory: start Kibana.
In the Auditbeat install directory: load the Auditbeat assets and start Auditbeat (see the sketch below).
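A minimal sketch of the local flow, assuming default ports and the default auditbeat.yml (file_integrity enabled); run each component in its own terminal:

    # in the Elasticsearch install directory
    bin/elasticsearch

    # in the Kibana install directory
    bin/kibana

    # in the Auditbeat install directory (elevated privileges may be required)
    ./auditbeat setup -e
    ./auditbeat -e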
What just happened?
Auditbeat created an index pattern in Kibana with defined fields, searches, visualizations, and dashboards. In a matter of minutes you can start viewing the latest system audit information.
Didn't work for you?

The Auditbeat module assumes a default operating system configuration. See the documentation for more details.

In the Elasticsearch install directory: start Elasticsearch.
In the Kibana install directory: start Kibana.
In the Logstash install directory:

Modify logstash.yml to set the ArcSight module details:

modules:
  - name: arcsight
    var.inputs: smartconnector

Configure Smart Connectors to send CEF events to Logstash via TCP on default port 5000.
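A minimal sketch of the local flow, assuming default ports; run each component in its own terminal (Logstash flags can vary by version):

    # in the Elasticsearch install directory
    bin/elasticsearch

    # in the Kibana install directory
    bin/kibana

    # in the Logstash install directory; --setup loads the module's index pattern and dashboards
    bin/logstash --setup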

What just happened?

Logstash created an index pattern in Kibana with defined fields, searches, visualizations, and dashboards. In a matter of minutes you can start viewing ArcSight events.

Didn't work for you?

The Logstash module makes a set of assumptions about the default configuration of the ArcSight solution; however, you can override the defaults. See the documentation for more details.

In the Elasticsearch install directory: start Elasticsearch.
In the Kibana install directory: start Kibana.
In the Packetbeat install directory: load the Packetbeat assets and start Packetbeat (see the sketch below).
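A minimal sketch of the local flow, assuming default ports and the default packetbeat.yml (DNS capture enabled); run each component in its own terminal:

    # in the Elasticsearch install directory
    bin/elasticsearch

    # in the Kibana install directory
    bin/kibana

    # in the Packetbeat install directory (sniffing traffic usually needs elevated privileges)
    ./packetbeat setup -e
    ./packetbeat -e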
What just happened?

Packetbeat created an index pattern in Kibana with defined fields, searches, visualizations, and dashboards. In a matter of minutes you can start viewing details of your DNS traffic.

Didn't work for you?

Packetbeat makes a set of assumptions about defaults, such as default network ports. See the documentation for more details on how to further configure your deployment.

In the Elasticsearch install directory: start Elasticsearch.
In the Kibana install directory: start Kibana.
In the Logstash install directory:

Modify logstash.yml to set the Netflow module details:

modules:
  - name: netflow
    var.input.udp.port: <netflow_port>

Configure NetFlow to export flow events to Logstash via UDP on default port 2055.
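A minimal sketch of the local flow, assuming default ports; run each component in its own terminal (Logstash flags can vary by version):

    # in the Elasticsearch install directory
    bin/elasticsearch

    # in the Kibana install directory
    bin/kibana

    # in the Logstash install directory; --setup loads the module's index pattern and dashboards
    bin/logstash --setup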

What just happened?

Logstash created an index pattern in Kibana with defined fields, searches, visualizations, and dashboards. In a matter of minutes you can start viewing Netflow events.

Didn't work for you?

The Logstash module makes a set of assumptions about the default configuration of the Netflow solution; however, you can override the defaults. See the documentation for more details.

Automated detection of anomalies and suspicious relationships

How do you keep up with billions of signatures? Or find the relationships that matter among millions of IP addresses? Apply machine learning and graph analytics to your Elastic deployment to quickly spot cyber threats, expected and unexpected, in the mass of data.

You're in good company

USAA started with a few Elasticsearch nodes in its security lab. Today a full production deployment serves as an extension of its ArcSight SIEM. USAA staff used to wait minutes (or even hours) for queries against log management appliances to return results for threat analysis. With Elastic, those days are over.

And they're not the only ones managing security events with Elastic. You'll find more customer stories here.

Security analytics is about more than security events

Got metrics? Infrastructure logs? Documents packed with text? Centralize all of that data in the Elastic Stack alongside your security events for richer analysis, reduced risk, and a simpler architecture.

Logging

Fast, scalable logging that won't quit.

Learn more

Metrics

Get insight into the numbers: CPU, memory, and more.

Learn more

Site Search

Optimize the search experience on your website.

Learn more

APM

Get insight into your application performance.

Learn more

App Search

Search documents, geo data, and more.

Learn more