View web crawler events logs

App Search stores its web crawler logs in Elasticsearch. Kibana provides two user interfaces to view these logs: Discover and Logs.

For a complete reference of all events, see Web crawler events logs reference.

View web crawler events logs in Discover

To view crawler logs in Discover, you must first configure a Kibana data view. If no data views have been created, Kibana will automatically direct you to the appropriate page. Otherwise, navigate to Stack Management > Data Views via the left sidebar.

Crawler events are logged within the enterprise-search set of system indices, so Kibana does not recognize this data by default.

  1. Select Create data view.
  2. On the Create data view page, select Show advanced settings. Set the data view name to logs-crawler-default, then continue on to settings configuration.
  3. From settings, select @timestamp as the primary time field, and create the data view.
  4. Navigate back to the Discover UI, select your newly created data view, and you are ready to begin exploring your crawl events in detail!
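
If you prefer to script this setup instead of clicking through the steps above, the same configuration can be posted to Kibana's data views API. The following is a minimal sketch, assuming a Kibana 8.x endpoint; the URL and credentials are placeholders you would replace for your deployment.

```python
# A minimal sketch, assuming Kibana 8.x and its data views API.
import requests

KIBANA_URL = "http://localhost:5601"   # assumption: replace with your Kibana URL
AUTH = ("elastic", "changeme")         # assumption: replace with real credentials

response = requests.post(
    f"{KIBANA_URL}/api/data_views/data_view",
    auth=AUTH,
    headers={"kbn-xsrf": "true"},      # Kibana HTTP APIs require this header
    json={
        "data_view": {
            "title": "logs-crawler-default",  # pattern matching the crawler events indices
            "name": "logs-crawler-default",
            "timeFieldName": "@timestamp",    # primary time field, as in step 3
        }
    },
)
response.raise_for_status()
print(response.json())
```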

You will likely want to set up some custom columns in Discover to make crawl events easier to scan at a glance.

To add a new column, select an entry from the list of available fields on the left sidebar. Handy columns for crawler events include crawler.crawl.id, url.domain, url.path, event.action, and http.response.status_code.
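
If you want to pull the same events outside of Discover, they can also be retrieved with the Elasticsearch Python client. The sketch below is a minimal example, assuming an 8.x cluster with placeholder connection details, and it selects the same fields as the suggested columns.

```python
# A minimal sketch, assuming Elasticsearch 8.x and the elasticsearch-py client.
from elasticsearch import Elasticsearch

# assumption: placeholder URL and credentials
es = Elasticsearch("http://localhost:9200", basic_auth=("elastic", "changeme"))

resp = es.search(
    index="logs-crawler-default",
    size=25,
    sort=[{"@timestamp": "desc"}],    # newest crawl events first
    source=[                          # same fields as the suggested Discover columns
        "@timestamp",
        "crawler.crawl.id",
        "url.domain",
        "url.path",
        "event.action",
        "http.response.status_code",
    ],
)

for hit in resp["hits"]["hits"]:
    print(hit["_source"])
```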

View web crawler events logs in Logs

To view crawler logs in Logs, you must first set up a source configuration.

  1. Navigate to Logs UI via the left sidebar, under Observability.
  2. Under Settings, specify logs-crawler-default as the Log indices data view.
  3. Configure any desired custom columns. A nice place to start is by removing the preset event.dataset and message options and adding crawler.crawl.id, url.domain, url.path, event.action, and http.response.status_code.

    Autocomplete will be available on these columns after the initial source configuration has been applied. You can edit these settings at any time.

  4. Apply the configuration, navigate to Stream, and you are ready to begin exploring your crawl events in detail!
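
The Stream view is convenient for ad-hoc filtering, and the same filters can also be expressed as a query against the logs-crawler-default indices. The sketch below is one hedged example, reusing the placeholder connection details from above; the crawl ID is hypothetical, and the query narrows results to responses with a status code of 400 or higher.

```python
# A minimal sketch: error responses from a single crawl, using placeholder connection details.
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200", basic_auth=("elastic", "changeme"))

resp = es.search(
    index="logs-crawler-default",
    size=100,
    sort=[{"@timestamp": "asc"}],
    query={
        "bool": {
            "filter": [
                # hypothetical crawl ID; copy a real one from the crawler.crawl.id column
                {"term": {"crawler.crawl.id": "EXAMPLE_CRAWL_ID"}},
                # keep only 4xx and 5xx responses
                {"range": {"http.response.status_code": {"gte": 400}}},
            ]
        }
    },
)

for hit in resp["hits"]["hits"]:
    print(hit["_source"])
```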