Alerting Troubleshooting

This page describes how to resolve common problems you might encounter with Alerting. If your problem isn't described here, please review the open issues in the relevant GitHub repositories, such as the Kibana repository, which includes Alerting and Actions.

Have a question? Contact us in the discuss forum.

Alerts with small check intervals run late

Problem:

Alerts with a small check interval, such as every two seconds, run later than scheduled.

Resolution:

Alerts run as background tasks at a cadence defined by their check interval. When an alert's check interval is smaller than the Task Manager poll_interval setting, the alert will run late.

Either tweak the Kibana Task Manager settings or increase the check interval of the alerts in question.
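For example, to make a very small check interval feasible, you could lower the Task Manager polling interval in kibana.yml. The fragment below is a minimal sketch, not a recommendation: the 1000 ms value is only an illustration, and a lower poll_interval means each Kibana instance queries Elasticsearch for pending tasks more often.

    # kibana.yml -- illustrative values only
    # Poll for pending tasks every 1000 ms instead of the 3000 ms default,
    # so tasks backing alerts with very small check intervals are claimed sooner.
    xpack.task_manager.poll_interval: 1000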

For more details, see Tasks with small schedule intervals run late.

Alerts run late

Problem:

Scheduled alerts run at an inconsistent cadence, often running late.

Actions run long after the status of an alert changes, sending a notification of the change too late.

Solution:

Alerts and actions run as background tasks by each Kibana instance at a default rate of ten tasks every three seconds.

If many alerts or actions are scheduled to run at the same time, pending tasks queue up in Elasticsearch. Each Kibana instance then polls for pending tasks at a rate of up to ten tasks at a time, at three-second intervals. Because alerts and actions are backed by tasks, it is possible for pending tasks in the queue to exceed this capacity and run late.
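The per-instance capacity comes from the Task Manager max_workers and poll_interval settings (ten workers polled every 3000 ms by default). As a hedged sketch, the kibana.yml fragment below shows how that capacity could be raised; the exact setting names can vary between Kibana versions, the values are illustrative, and higher values increase the load each Kibana instance places on Elasticsearch.

    # kibana.yml -- illustrative values only
    # Run up to 20 tasks concurrently per Kibana instance (default is 10).
    xpack.task_manager.max_workers: 20
    # Keep the default 3000 ms polling interval; lower it to claim tasks sooner.
    xpack.task_manager.poll_interval: 3000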

For details on diagnosing the underlying causes of such delays, see Tasks run late.

Alerting and action tasks are identified by their type.

  • Alert tasks always begin with alerting:. For example, the alerting:.index-threshold tasks back the index threshold stack alert.
  • Action tasks always begin with actions:. For example, the actions:.index tasks back the index action.

When diagnosing issues related to Alerting, focus on the tasks that begin with alerting: and actions:.
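If you want to see how many pending tasks of each type are queued, you can query the Task Manager index directly. The request below is a sketch that assumes the default .kibana_task_manager index, the task.taskType field, and local credentials; adjust all three to match your deployment.

    # Count queued alerting and action tasks by task type (host and credentials are assumptions).
    curl -s -u elastic:changeme "http://localhost:9200/.kibana_task_manager/_search" \
      -H 'Content-Type: application/json' -d'
    {
      "size": 0,
      "query": {
        "bool": {
          "should": [
            { "prefix": { "task.taskType": "alerting:" } },
            { "prefix": { "task.taskType": "actions:" } }
          ],
          "minimum_should_match": 1
        }
      },
      "aggs": {
        "by_task_type": { "terms": { "field": "task.taskType", "size": 50 } }
      }
    }'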

For more details on monitoring and diagnosing task execution in Task Manager, see Health monitoring.
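For a quick view of Task Manager's current workload and drift without querying Elasticsearch directly, you can also call the Task Manager health API exposed by Kibana, which returns the statistics that health monitoring is built on. The example below assumes Kibana runs at localhost:5601 with basic authentication.

    # Fetch Task Manager runtime statistics (host and credentials are assumptions).
    curl -s -u elastic:changeme "http://localhost:5601/api/task_manager/_health"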