ECS Logging with Pino

This Node.js package provides a formatter for the pino logger, compatible with Elastic Common Schema (ECS) logging. In combination with the Filebeat shipper, you can monitor all your logs in one place in the Elastic Stack.

Setup

Step 1: Install

$ npm install @elastic/ecs-pino-format

Step 2: Configure

const ecsFormat = require('@elastic/ecs-pino-format')
const pino = require('pino')

const log = pino(ecsFormat()) 
log.info('hi')
log.error({ err: new Error('boom') }, 'oops there is a problem')
// ...

This will configure Pino’s formatters, messageKey, and timestamp options.
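Because ecsFormat() returns a plain Pino options object, it can also be merged with options of your own. This is a minimal sketch (the level value is illustrative; avoid overriding the formatters, messageKey, and timestamp options that the formatter sets):

const ecsFormat = require('@elastic/ecs-pino-format')
const pino = require('pino')

// merge the ECS formatter options with other Pino options
const log = pino({
  ...ecsFormat(),
  level: 'debug' // illustrative: also set Pino's minimum log level
})
log.debug('hi')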

See usage discussion and examples below.

Step 3: Configure Filebeat

The best way to collect the logs once they are ECS-formatted is with Filebeat:

  1. Follow the Filebeat quick start
  2. Add the following configuration to your filebeat.yml file.

filebeat.yml.

filebeat.inputs:
- type: log
  paths:
    - /path/to/logs.json
  json.keys_under_root: true
  json.overwrite_keys: true
  json.add_error_key: true
  json.expand_keys: true

For more information, see the Filebeat reference.
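One way to get ECS-formatted logs into a file for Filebeat to tail is Pino's file destination. This is a minimal sketch; the path matches the illustrative /path/to/logs.json used in the Filebeat configuration above:

const ecsFormat = require('@elastic/ecs-pino-format')
const pino = require('pino')

// write ECS-formatted JSON lines to the file that Filebeat is configured to read
const log = pino(ecsFormat(), pino.destination('/path/to/logs.json'))
log.info('this record will be collected by Filebeat')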

Usage

const ecsFormat = require('@elastic/ecs-pino-format')
const pino = require('pino')

const log = pino(ecsFormat()) 
log.info('Hello world')

const child = log.child({ module: 'foo' })
child.warn('From child')

Running this will produce log output similar to the following:

{"log.level":"info","@timestamp":"2021-01-19T22:51:12.142Z","ecs":{"version":"1.6.0"},"process":{"pid":82240},"host":{"hostname":"pink.local"},"message":"Hello world"}
{"log.level":"warn","@timestamp":"2021-01-19T22:51:12.143Z","ecs":{"version":"1.6.0"},"process":{"pid":82240},"host":{"hostname":"pink.local"},"module":"foo","message":"From child"}

Error Logging

By default, the formatter will convert an err field that is an Error instance to ECS Error fields. For example:

const ecsFormat = require('@elastic/ecs-pino-format')
const pino = require('pino')
const log = pino(ecsFormat())

const myErr = new Error('boom')
log.info({ err: myErr }, 'oops')

will yield (pretty-printed for readability):

% node examples/error.js | jq .
{
  "log.level": "info",
  "@timestamp": "2021-01-26T17:02:23.697Z",
  ...
  "error": {
    "type": "Error",
    "message": "boom",
    "stack_trace": "Error: boom\n    at Object.<anonymous> (..."
  },
  "message": "oops"
}

This is analogous to and overrides Pino’s default err serializer. Special handling of the err field can be disabled via the convertErr: false option:

const log = pino(ecsFormat({ convertErr: false }))
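If you disable the conversion, you may still want errors serialized in some form. One option is to fall back to Pino's standard err serializer, which writes an err object rather than ECS error fields. This is a hypothetical sketch, not something prescribed by the package:

const ecsFormat = require('@elastic/ecs-pino-format')
const pino = require('pino')

// with convertErr disabled, use Pino's standard err serializer instead
const log = pino({
  ...ecsFormat({ convertErr: false }),
  serializers: { err: pino.stdSerializers.err }
})

log.info({ err: new Error('boom') }, 'oops')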

HTTP Request and Response Logging

With the convertReqRes: true option, the formatter will automatically convert Node.js core request and response objects when passed as the req and res fields, respectively. (This option replaces the use of Pino’s req and res serializers.)

const http = require('http')
const ecsFormat = require('@elastic/ecs-pino-format')
const pino = require('pino')

const log = pino(ecsFormat({ convertReqRes: true })) // use the convertReqRes option

const server = http.createServer(function handler (req, res) {
  res.setHeader('Foo', 'Bar')
  res.end('ok')
  log.info({ req, res }, 'handled request') // log with req and/or res fields
})

server.listen(3000, () => {
  log.info('listening at http://localhost:3000')
})

This will produce logs with request and response info using ECS HTTP fields. For example:

% node examples/http.js | jq .    # using jq for pretty printing
...                               # run 'curl http://localhost:3000/'
{
  "log.level": "info",
  "@timestamp": "2021-01-19T22:58:59.649Z",
  "ecs": {
    "version": "1.6.0"
  },
  "process": {
    "pid": 82670
  },
  "host": {
    "hostname": "pink.local"
  },
  "http": {
    "version": "1.1",
    "request": {
      "method": "GET",
      "headers": {
        "host": "localhost:3000",
        "accept": "*/*"
      }
    },
    "response": {
      "status_code": 200,
      "headers": {
        "foo": "Bar"
      }
    }
  },
  "url": {
    "full": "http://localhost:3000/",
    "path": "/"
  },
  "user_agent": {
    "original": "curl/7.64.1"
  },
  "message": "handled request"
}

The examples/ directory shows sample programs using request and response logging: with Express, with the pino-http middleware package, etc.
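For instance, an Express handler could log the request and response in much the same way as the core http example above; this sketch assumes Express’s req and res objects, which extend the Node.js core objects, are handled by the convertReqRes option:

const express = require('express')
const ecsFormat = require('@elastic/ecs-pino-format')
const pino = require('pino')

const log = pino(ecsFormat({ convertReqRes: true }))
const app = express()

app.get('/', (req, res) => {
  res.send('ok')
  log.info({ req, res }, 'handled request') // converted to ECS HTTP fields
})

app.listen(3000, () => {
  log.info('listening at http://localhost:3000')
})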

Integration with APM Tracing

This ECS log formatter integrates with Elastic APM. If your Node.js app is using the Node.js Elastic APM Agent, then log records gain fields for the configured service name ("service.name" and "event.dataset") and, when there is an active trace, the trace identifiers ("trace.id", "transaction.id", "span.id"). These fields allow cross-linking between traces and logs in Kibana and support log anomaly detection.
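Enabling the agent before the rest of the app is loaded might look like the following minimal sketch (the serviceName value is illustrative):

// start the Elastic APM agent first so it can instrument other modules
require('elastic-apm-node').start({ serviceName: 'my-service' })

const ecsFormat = require('@elastic/ecs-pino-format')
const pino = require('pino')

const log = pino(ecsFormat())
// records include service.name and event.dataset; trace, transaction, and span
// IDs are added when logging inside an active APM transaction
log.info('hi')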

For example, running examples/http-with-elastic-apm.js and curl -i localhost:3000/ results in a log record with the following:

% node examples/http-with-elastic-apm.js | jq .
...
  "service": {
    "name": "http-with-elastic-apm"
  },
  "event": {
    "dataset": "http-with-elastic-apm.log"
  },
  "trace": {
    "id": "a1c23e04d7174462f330f5921c8f036d"
  },
  "transaction": {
    "id": "2550c4c267d4dd53"
  },
  "span": {
    "id": "16a8704258dbc328"
  },
...

These IDs match trace data reported by the APM agent.

Integration with Elastic APM can be explicitly disabled via the apmIntegration: false option, for example:

const log = pino(ecsFormat({ apmIntegration: false }))

Limitations and Considerations

The ecs-logging spec suggests that the first three fields in log records should be @timestamp, log.level, and message. Pino does not provide a mechanism to put the message field near the front. Given that the ordering of ecs-logging fields is for human readability only and does not affect interoperability, this is not considered a significant concern.

The hooks that Pino currently provides do not enable this package to convert fields passed to <logger>.child({ ... }). This means that, even with the convertReqRes option, a call to <logger>.child({ req }) will not convert that req to ECS HTTP fields. This is a slight limitation for users of pino-http, which does this.