View and query logs

You can use two Kibana tools to view and query logs that have been ingested into Elasticsearch.

  • Logs UI is a specialized tool for exploring logs. It enables you to search, filter, and tail all the logs you have ingested into Elasticsearch.
  • Discover is a tool that focuses on rapid consumption of Elasticsearch data, including logs, with a standardized query language.

Use data views to view and query logs within Logs UI or Discover. A data view tells Kibana where to find your Elasticsearch data. A data view can point to a specific index, for example, your log data from yesterday, or all indices that contain your log data.

Discover requires you to create a data view before you can view logs. Logs UI can also integrate with data views to configure its indices, because data views are shared among apps in the Kibana space.
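A data view's index pattern uses `*` wildcards to match index names. As a rough local illustration, the sketch below uses Python's fnmatch as a stand-in for Kibana's wildcard matching, with made-up index names:

```python
from fnmatch import fnmatch

# Hypothetical index names in a cluster (for illustration only)
indices = [
    "logs-crawler-default",
    "logs-enterprise_search.api-default",
    "metrics-system-default",
]

# A broad pattern matches every logs index,
# while a more specific pattern targets a single data stream.
logs_indices = [i for i in indices if fnmatch(i, "logs-*")]
crawler_indices = [i for i in indices if fnmatch(i, "logs-crawler-*")]

print(logs_indices)     # the two logs-* indices
print(crawler_indices)  # only the crawler data stream
```

The same trade-off applies when you enter an index pattern in Kibana: a broad pattern gives one data view over all your log data, while a narrow one scopes the view to a single source.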

This guide shows you how to create data views and get started viewing your logs in Kibana:

Create a data view

Follow these steps to create a data view:

  • Navigate to Stack Management > Data Views and create a data view by entering an index pattern that matches one or more data sources.

    For example, logs-crawler-default matches the web crawler data stream and logs-enterprise_search.api-default matches the Enterprise Search API data stream.

  • Under Show advanced settings you can allow hidden and system indices.
  • Select a timestamp field for use with the global time filter.

Your data view is now available for use with Logs UI or Discover.
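Data views can also be created programmatically. Recent Kibana versions expose a data views API (`POST /api/data_views/data_view`); the sketch below only builds the JSON request body, with a hypothetical index pattern and display name, and does not contact a server:

```python
import json

# Hypothetical pattern, name, and timestamp field; adjust for your data.
request_body = {
    "data_view": {
        "title": "logs-enterprise_search*",  # index pattern to match
        "name": "Enterprise Search logs",    # display name in Kibana
        "timeFieldName": "@timestamp",       # field for the global time filter
    }
}

payload = json.dumps(request_body)
```

If you send a payload like this, remember that Kibana requires the `kbn-xsrf` header on state-changing requests; check the API reference for your Kibana version before relying on the exact field names.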

View and query logs using Logs UI

  • In the deployment where your logs are stored, open Kibana. Navigate to the Logs UI, under Observability.
  • In Logs > Settings you have two source configuration options:

    • Use Log indices with index patterns to match indices that contain log data.

      Each log source integrates with Kibana index patterns, which support creating and querying runtime fields. You can continue to use log sources configured to use an index name pattern, such as filebeat-*, instead of a Kibana index pattern. However, some features, such as those that depend on runtime fields, may not be available.

      or

    • Use Data views by selecting a log data view from the dropdown menu.

      The default log columns Timestamp, Field, and Message are already configured. Customize columns with Add columns.

  • Within Logs UI, the Stream page enables you to monitor all log events in a centralized view. Use the search bar to find log entries.
  • Use Stream Live to view a continuous flow of log messages in real time, or click Stop streaming to view historical logs from a specified time range.
  • View the details of a log line by clicking View details, or view it in context by clicking View in context.

Read the Observability guide to Log monitoring to learn more about how to tail log files, categorize log entries, inspect log anomalies, and configure data sources.

View and query logs using Discover

  • Create a data view to make your logs visible in Discover.
  • In the deployment where your logs are stored, open Kibana. In the Analytics sidebar navigate to Discover.
  • Select the data view you created, and you are ready to explore these logs in detail.
  • Kibana Query Language (KQL) is the default syntax option for queries in the Discover search bar. If you turn off KQL, Discover uses Lucene query syntax.
  • Narrow down your queries by adjusting time and date ranges as required.
  • Search for field names and filter by type.
  • Customize and save your searches for future use, or put them in a dashboard.
  • Set up custom columns to ensure your events of interest are available at a glance. To add a new column, select an entry from the list of available fields on the left sidebar.

    Autocomplete will be available on these columns after the initial source configuration has been applied. You can edit these settings at any time.
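The KQL searches you type in Discover are translated into Elasticsearch query DSL before being run. As a rough sketch of what that amounts to (the exact query Kibana generates may differ), a KQL filter on a field plus the global time range corresponds to a bool query like this:

```python
# Approximate query DSL for the KQL search
#   event.dataset : "enterprise_search.crawler"
# combined with a 24-hour time filter. Field names follow the
# Elastic Common Schema; this is an illustration, not Kibana's
# literal output.
query = {
    "bool": {
        "filter": [
            {"term": {"event.dataset": "enterprise_search.crawler"}},
            {"range": {"@timestamp": {"gte": "now-24h", "lte": "now"}}},
        ]
    }
}
```

Seeing the DSL shape can help when you need to reproduce a Discover search in an API call or a watcher, where KQL is not available.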

To learn more, read a detailed guide to Kibana Discover.

When viewing multiple indices or data streams, filter your queries on the event.dataset field to narrow your results down to a relevant subset. For example, the value enterprise_search.crawler will show results from the App Search web crawler logs.
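Conceptually, filtering on event.dataset just partitions a mixed result set by its source data stream. A minimal local sketch with made-up records shows the idea:

```python
from collections import Counter

# Hypothetical log records from several data streams (illustration only)
records = [
    {"event": {"dataset": "enterprise_search.crawler"}, "message": "crawl started"},
    {"event": {"dataset": "enterprise_search.api"}, "message": "GET /api/ws/v1"},
    {"event": {"dataset": "enterprise_search.crawler"}, "message": "crawl finished"},
]

# Count how many records each data stream contributed...
counts = Counter(r["event"]["dataset"] for r in records)

# ...then keep only the web crawler subset.
crawler_logs = [
    r for r in records if r["event"]["dataset"] == "enterprise_search.crawler"
]
```

In Kibana the same narrowing happens server-side, but inspecting the event.dataset values first (for example via the field list in Discover) tells you which subsets are available to filter on.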