Log correlation

Log correlation lets you navigate from a trace to all logs belonging to it, and vice versa: from a specific log line, you can see the context in which it was logged and which parameters the user provided.

Step 1: Ingest your application logs into Elasticsearch

To get started with log correlation, your logs first have to be ingested into Elasticsearch. The easiest way to do that is with Java ECS logging, which provides layouts/encoders for Logback, log4j2, and log4j. These layouts/encoders convert a log event into an ECS-compatible JSON structure. With a minimal Filebeat configuration, you can then ingest these logs into Elasticsearch.
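For example, with Logback you can switch your file appender to the ECS encoder. This is a minimal sketch, assuming the co.elastic.logging:logback-ecs-encoder dependency is on the classpath; the file paths and service name are placeholders:

```xml
<!-- logback.xml: write ECS-formatted JSON logs (paths and serviceName are examples) -->
<configuration>
  <appender name="json" class="ch.qos.logback.core.rolling.RollingFileAppender">
    <file>/var/log/my-service/app.json</file>
    <rollingPolicy class="ch.qos.logback.core.rolling.TimeBasedRollingPolicy">
      <fileNamePattern>/var/log/my-service/app-%d{yyyy-MM-dd}.json</fileNamePattern>
    </rollingPolicy>
    <!-- EcsEncoder emits one ECS-compatible JSON object per log event -->
    <encoder class="co.elastic.logging.logback.EcsEncoder">
      <serviceName>my-service</serviceName>
    </encoder>
  </appender>
  <root level="INFO">
    <appender-ref ref="json"/>
  </root>
</configuration>
```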

Head over to the documentation to learn more: Java ECS logging documentation.
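A Filebeat configuration for shipping these JSON logs could look like the following sketch. The paths and Elasticsearch host are assumptions for illustration:

```yaml
# filebeat.yml: ingest ECS-formatted JSON logs (paths and hosts are examples)
filebeat.inputs:
  - type: filestream
    paths:
      - /var/log/my-service/*.json
    parsers:
      # Decode each line as JSON and lift the fields to the top level
      - ndjson:
          overwrite_keys: true
          add_error_key: true
          expand_keys: true

output.elasticsearch:
  hosts: ["localhost:9200"]
```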

If you’re using plain-text logs, it’s recommended to parse out specific parts of each log line, such as the timestamp, the log level, and the message. One way to do this is with an ingest pipeline that uses a grok processor.
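As a sketch, an ingest pipeline like the following could parse lines such as `2024-05-01T12:00:00,000 INFO Order received`. The pipeline name and the log format are assumptions; adjust the grok pattern to match your actual layout:

```
PUT _ingest/pipeline/plaintext-logs
{
  "description": "Parse timestamp, level, and message from plain-text logs (example format)",
  "processors": [
    {
      "grok": {
        "field": "message",
        "patterns": [
          "%{TIMESTAMP_ISO8601:timestamp} %{LOGLEVEL:log.level} %{GREEDYDATA:message}"
        ]
      }
    },
    {
      "date": {
        "field": "timestamp",
        "formats": ["ISO8601"]
      }
    }
  ]
}
```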

Step 2: Enable log correlation in the agent

To correlate logs from your application with transactions captured by the Elastic APM Java Agent, the agent injects IDs such as trace.id and transaction.id into your MDC when enable_log_correlation is enabled.

Step 3: Extract the ID fields

If you are using Java ECS logging, there’s nothing to do in this step. The IDs in the MDC are automatically added to the right fields.

For plain-text logs, use the pattern layout of your logging implementation to write the MDC values to your log file. If you are using Logback or log4j, add %X to the format to log all MDC values, or %X{trace.id} to log only the trace ID. With the help of Filebeat and Logstash, or an Elasticsearch ingest pipeline, you can then parse the IDs out of the log lines and put them into the transaction.id and trace.id fields.
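For Logback, a pattern layout that appends the IDs could look like this sketch (the rest of the pattern is an example, not a requirement):

```xml
<!-- Example Logback pattern: appends trace and transaction IDs from the MDC -->
<encoder>
  <pattern>%d{ISO8601} [%thread] %-5level %logger{36} - %msg trace.id=%X{trace.id} transaction.id=%X{transaction.id}%n</pattern>
</encoder>
```

A grok pattern matching the trailing key/value pairs, such as `trace.id=%{DATA:trace.id} transaction.id=%{DATA:transaction.id}`, can then extract the IDs into the corresponding fields.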