Create and upload config.yaml to S3 bucket

Elastic Serverless Forwarder requires a config.yaml file to be uploaded to an S3 bucket and referenced by the S3_CONFIG_FILE environment variable.

Save the following YAML content as config.yaml and edit as required before uploading to an S3 bucket. You should remove any inputs or arguments you are not using, and ensure you have entered the correct URLs and credentials as per the inline comments.

inputs:
  - type: "s3-sqs"
    id: "arn:aws:sqs:%REGION%:%ACCOUNT%:%QUEUENAME%"
    outputs:
      - type: "elasticsearch"
        args:
          # either elasticsearch_url or cloud_id, elasticsearch_url takes precedence if both are included
          elasticsearch_url: "http(s)://domain.tld:port"
          cloud_id: "cloud_id:bG9jYWxob3N0OjkyMDAkMA=="
          # either api_key or username/password, username/password takes precedence if both are included
          api_key: "YXBpX2tleV9pZDphcGlfa2V5X3NlY3JldAo="
          username: "username"
          password: "password"
          es_datastream_name: "logs-generic-default"
          batch_max_actions: 500 # optional: default value is 500
          batch_max_bytes: 10485760 # optional: default value is 10485760
      - type: "logstash"
        args:
          logstash_url: "http(s)://host:port"
          username: "username" #optional
          password: "password" #optional
          max_batch_size: 500 #optional
          compression_level: 1 #optional
          ssl_assert_fingerprint: "22:F7:FB:84:1D:43:3E:E7:BB:F9:72:F3:D8:97:AD:7C:86:E3:08:42" #optional
  - type: "sqs"
    id: "arn:aws:sqs:%REGION%:%ACCOUNT%:%QUEUENAME%"
    outputs:
      - type: "elasticsearch"
        args:
          # either elasticsearch_url or cloud_id, elasticsearch_url takes precedence if both are included
          elasticsearch_url: "http(s)://domain.tld:port"
          cloud_id: "cloud_id:bG9jYWxob3N0OjkyMDAkMA=="
          # either api_key or username/password, username/password takes precedence if both are included
          api_key: "YXBpX2tleV9pZDphcGlfa2V5X3NlY3JldAo="
          username: "username"
          password: "password"
          es_datastream_name: "logs-generic-default"
          batch_max_actions: 500 # optional: default value is 500
          batch_max_bytes: 10485760 # optional: default value is 10485760
      - type: "logstash"
        args:
          logstash_url: "http(s)://host:port"
          username: "username" #optional
          password: "password" #optional
          max_batch_size: 500 #optional
          compression_level: 1 #optional
          ssl_assert_fingerprint: "22:F7:FB:84:1D:43:3E:E7:BB:F9:72:F3:D8:97:AD:7C:86:E3:08:42" #optional
  - type: "kinesis-data-stream"
    id: "arn:aws:kinesis:%REGION%:%ACCOUNT%:stream/%STREAMNAME%"
    outputs:
      - type: "elasticsearch"
        args:
          # either elasticsearch_url or cloud_id, elasticsearch_url takes precedence if both are included
          elasticsearch_url: "http(s)://domain.tld:port"
          cloud_id: "cloud_id:bG9jYWxob3N0OjkyMDAkMA=="
          # either api_key or username/password, username/password takes precedence if both are included
          api_key: "YXBpX2tleV9pZDphcGlfa2V5X3NlY3JldAo="
          username: "username"
          password: "password"
          es_datastream_name: "logs-generic-default"
          batch_max_actions: 500 # optional: default value is 500
          batch_max_bytes: 10485760 # optional: default value is 10485760
      - type: "logstash"
        args:
          logstash_url: "http(s)://host:port"
          username: "username" #optional
          password: "password" #optional
          max_batch_size: 500 #optional
          compression_level: 1 #optional
          ssl_assert_fingerprint: "22:F7:FB:84:1D:43:3E:E7:BB:F9:72:F3:D8:97:AD:7C:86:E3:08:42" #optional
  - type: "cloudwatch-logs"
    id: "arn:aws:logs:%AWS_REGION%:%AWS_ACCOUNT_ID%:log-group:%LOG_GROUP_NAME%:*"
    outputs:
      - type: "elasticsearch"
        args:
          # either elasticsearch_url or cloud_id, elasticsearch_url takes precedence if both are included
          elasticsearch_url: "http(s)://domain.tld:port"
          cloud_id: "cloud_id:bG9jYWxob3N0OjkyMDAkMA=="
          # either api_key or username/password, username/password takes precedence if both are included
          api_key: "YXBpX2tleV9pZDphcGlfa2V5X3NlY3JldAo="
          username: "username"
          password: "password"
          es_datastream_name: "logs-generic-default"
          batch_max_actions: 500 # optional: default value is 500
          batch_max_bytes: 10485760 # optional: default value is 10485760
      - type: "logstash"
        args:
          logstash_url: "http(s)://host:port"
          username: "username" #optional
          password: "password" #optional
          max_batch_size: 500 #optional
          compression_level: 1 #optional
          ssl_assert_fingerprint: "22:F7:FB:84:1D:43:3E:E7:BB:F9:72:F3:D8:97:AD:7C:86:E3:08:42" #optional
  - type: "cloudwatch-logs"
    id: "arn:aws:logs:%AWS_REGION%:%AWS_ACCOUNT_ID%:log-group:%LOG_GROUP_NAME%:log-stream:%LOG_STREAM_NAME%"
    outputs:
      - type: "elasticsearch"
        args:
          # either elasticsearch_url or cloud_id, elasticsearch_url takes precedence if both are included
          elasticsearch_url: "http(s)://domain.tld:port"
          cloud_id: "cloud_id:bG9jYWxob3N0OjkyMDAkMA=="
          # either api_key or username/password, username/password takes precedence if both are included
          api_key: "YXBpX2tleV9pZDphcGlfa2V5X3NlY3JldAo="
          username: "username"
          password: "password"
          es_datastream_name: "logs-generic-default"
          batch_max_actions: 500 # optional: default value is 500
          batch_max_bytes: 10485760 # optional: default value is 10485760
      - type: "logstash"
        args:
          logstash_url: "http(s)://host:port"
          username: "username" #optional
          password: "password" #optional
          max_batch_size: 500 #optional
          compression_level: 1 #optional
          ssl_assert_fingerprint: "22:F7:FB:84:1D:43:3E:E7:BB:F9:72:F3:D8:97:AD:7C:86:E3:08:42" #optional
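
Once you have edited the file, upload it to an S3 bucket that the forwarder's execution role can read, and point the S3_CONFIG_FILE environment variable of the Lambda function at the uploaded object. The following is a minimal sketch using Python and boto3, assuming S3_CONFIG_FILE takes the s3://bucket/key form of the uploaded object; the bucket, key, and function names are placeholders.

import boto3

BUCKET = "my-esf-config-bucket"                 # placeholder bucket name
KEY = "config.yaml"                             # placeholder object key
FUNCTION = "my-elastic-serverless-forwarder"    # placeholder Lambda function name

# Upload the edited config.yaml to the S3 bucket.
s3 = boto3.client("s3")
s3.upload_file("config.yaml", BUCKET, KEY)

# Set S3_CONFIG_FILE on the function, preserving any existing environment variables
# (update_function_configuration replaces the whole Variables map).
aws_lambda = boto3.client("lambda")
current = aws_lambda.get_function_configuration(FunctionName=FUNCTION)
variables = current.get("Environment", {}).get("Variables", {})
variables["S3_CONFIG_FILE"] = f"s3://{BUCKET}/{KEY}"
aws_lambda.update_function_configuration(
    FunctionName=FUNCTION,
    Environment={"Variables": variables},
)
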
Fields

inputs.[]:

A list of inputs (i.e. triggers) for the Elastic Serverless Forwarder Lambda function.

inputs.[].type:

The type of trigger input (cloudwatch-logs, kinesis-data-stream, sqs and s3-sqs are currently supported).

inputs.[].id:

The ARN of the trigger input for the given type. Multiple input entries can share the same type, but each must have a unique id. Inputs of type cloudwatch-logs accept both CloudWatch Logs Log Group and CloudWatch Logs Log Stream ARNs.

inputs.[].outputs:

A list of outputs (i.e. forwarding targets) for the Elastic Serverless Forwarder Lambda function. You can have multiple outputs for an input, but only one output can be defined per type.

inputs.[].outputs.[].type:

The type of the forwarding target output. Currently only the following outputs are supported:

  • elasticsearch
  • logstash [preview] This functionality is in technical preview and may be changed or removed in a future release. Elastic will work to fix any issues, but features in technical preview are not subject to the support SLA of official GA features.

If Logstash is chosen as an output, Elastic Serverless Forwarder expects the elastic_serverless_forwarder Logstash input to be installed, enabled, and properly configured. For more information about installing Logstash plugins, refer to the Logstash documentation.

inputs.[].outputs.[].args:

Custom init arguments for the specified forwarding target output.

For elasticsearch the following arguments are supported:

  • args.elasticsearch_url: URL of elasticsearch endpoint in the format http(s)://domain.tld:port. Mandatory when args.cloud_id is not provided. Will take precedence over args.cloud_id if both are defined.
  • args.cloud_id: Cloud ID of elasticsearch endpoint. Mandatory when args.elasticsearch_url is not provided. Will be ignored if args.elasticsearch_url is defined.
  • args.username: Username of the elasticsearch instance to connect to. Mandatory when args.api_key is not provided. Will take precedence over args.api_key if both are defined.
  • args.password: Password of the elasticsearch instance to connect to. Mandatory when args.api_key is not provided. Will take precedence over args.api_key if both are defined.
  • args.api_key: API key of elasticsearch endpoint in the format base64encode(api_key_id:api_key_secret) (see the sketch after this list). Mandatory when args.username and args.password are not provided. Will be ignored if args.username/args.password are defined.
  • args.es_datastream_name: Name of the data stream or index where logs should be forwarded to. Lambda supports automatic routing of various AWS service logs to the corresponding data streams for further processing and storage in the Elasticsearch cluster. It supports automatic routing of aws.cloudtrail, aws.cloudwatch_logs, aws.elb_logs, aws.firewall_logs, aws.vpcflow, and aws.waf logs. For other log types, if using data streams, you can optionally set its value in the configuration file according to the naming convention for data streams and available integrations. If es_datastream_name is not specified and cannot be matched with any of the above AWS services, the value defaults to logs-generic-default. In versions v0.29.1 and below, this configuration parameter was named es_index_or_datastream_name. Rename the configuration parameter to es_datastream_name in your config.yaml file on the S3 bucket to continue using it in future versions. The older name es_index_or_datastream_name is deprecated as of version v0.30.0, and the related backward-compatibility code was removed in version v1.0.0.
  • args.batch_max_actions: (Optional) Maximum number of actions to send in a single bulk request. Default value: 500.
  • args.batch_max_bytes: (Optional) Maximum size in bytes to send in a single bulk request. Default value: 10485760 (10MB).
  • args.ssl_assert_fingerprint: (Optional) SSL fingerprint for self-signed SSL certificate on HTTPS transport.
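
As an illustration only, with made-up credentials, the base64encode(api_key_id:api_key_secret) value expected by args.api_key can be produced with standard-library Python:

import base64

# Hypothetical ID and secret; substitute the values returned when you create
# the API key in Elasticsearch or Kibana.
api_key_id = "api_key_id"
api_key_secret = "api_key_secret"

# args.api_key expects base64encode(api_key_id:api_key_secret).
encoded = base64.b64encode(f"{api_key_id}:{api_key_secret}".encode("utf-8")).decode("ascii")
print(encoded)  # paste this value into args.api_key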

For logstash the following arguments are supported:

  • args.logstash_url: URL of the Logstash endpoint in the format http(s)://host:port.
  • args.username: (Optional) Username of the Logstash instance to connect to. Mandatory if HTTP Basic authentication is enabled in Logstash.
  • args.password: (Optional) Password of the Logstash instance to connect to. Mandatory if HTTP Basic authentication is enabled in Logstash.
  • args.max_batch_size: (Optional) Maximum number of events to send in a single HTTP(s) request. Default value: 500.
  • args.compression_level: (Optional) The GZIP compression level for HTTP(s) requests towards Logstash. It can be any integer value between 1 (minimum compression, best performance, most bytes sent) and 9 (maximum compression, worst performance, fewest bytes sent). Default value: 1.
  • args.ssl_assert_fingerprint: (Optional) SSL fingerprint for self-signed SSL certificate on HTTPS transport (see the sketch below).
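
The example fingerprints in the configuration above consist of 20 colon-separated byte pairs, i.e. a SHA-1-length digest. As a non-authoritative sketch assuming that format (and a hypothetical host and port), the fingerprint of a self-signed certificate can be computed with standard-library Python; confirm the hash algorithm expected by your deployment.

import hashlib
import ssl

def certificate_fingerprint(host: str, port: int, algorithm: str = "sha1") -> str:
    """Return the colon-separated hex fingerprint of the certificate served at host:port."""
    # Fetch the PEM-encoded certificate presented by the server (no validation,
    # so this also works for self-signed certificates).
    pem_cert = ssl.get_server_certificate((host, port))
    # Hash the DER-encoded certificate bytes with the chosen algorithm.
    der_cert = ssl.PEM_cert_to_DER_cert(pem_cert)
    digest = hashlib.new(algorithm, der_cert).hexdigest().upper()
    return ":".join(digest[i:i + 2] for i in range(0, len(digest), 2))

print(certificate_fingerprint("logstash.example.com", 5044))  # hypothetical endpoint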