
Set up a data stream


The process of setting up a data stream in Elastic Stack and Elastic Cloud Serverless is similar, making use of their respective APIs. However, because Serverless provides a built-in data stream lifecycle mechanism and retention settings, you don't need to configure index lifecycle management (ILM) options as you do in an Elastic Stack deployment.

To set up a data stream, follow these steps:

  1. Create an index lifecycle policy (not available in Serverless)
  2. Create component templates
  3. Create an index template
  4. Create the data stream
  5. Secure the data stream

You can also convert an index alias to a data stream.

Important

If you use Fleet, Elastic Agent, or Logstash, skip this tutorial. They all set up data streams for you.

For Fleet and Elastic Agent, refer to Elastic Agent data streams for Fleet. For Logstash, refer to the data streams settings for the elasticsearch output plugin.

Create an index lifecycle policy (not available in Serverless)

While optional, we recommend using the index lifecycle management (ILM) capability in Elastic Stack deployments to automate the management of your data stream’s backing indices. ILM requires an index lifecycle policy.

Simpler lifecycle management in Serverless projects

ILM lets you automatically transition indices through data tiers according to your performance needs and retention requirements. This allows you to balance hardware costs with performance. ILM is not available in Serverless, where performance optimizations are automatic. Instead, data stream lifecycle is available as a data management option.
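For example, in a Serverless project (or any deployment that relies on data stream lifecycle instead of ILM), you can set retention directly on an existing data stream with the data stream lifecycle API. The following is a minimal sketch; the 7d retention value is only an example, and the stream itself is created in a later step:

PUT _data_stream/my-data-stream/_lifecycle
{
  "data_retention": "7d"
}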

To create an index lifecycle policy in Kibana:

  1. Go to the Index Lifecycle Policies management page using the navigation menu or the global search field.
  2. Click Create policy.

You can also use the create lifecycle policy API.

PUT _ilm/policy/my-lifecycle-policy
{
  "policy": {
    "phases": {
      "hot": {
        "actions": {
          "rollover": {
            "max_primary_shard_size": "50gb"
          }
        }
      },
      "warm": {
        "min_age": "30d",
        "actions": {
          "shrink": {
            "number_of_shards": 1
          },
          "forcemerge": {
            "max_num_segments": 1
          }
        }
      },
      "cold": {
        "min_age": "60d",
        "actions": {
          "searchable_snapshot": {
            "snapshot_repository": "found-snapshots"
          }
        }
      },
      "frozen": {
        "min_age": "90d",
        "actions": {
          "searchable_snapshot": {
            "snapshot_repository": "found-snapshots"
          }
        }
      },
      "delete": {
        "min_age": "735d",
        "actions": {
          "delete": {}
        }
      }
    }
  }
}

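Optionally, you can verify that the policy was created as expected by retrieving it:

GET _ilm/policy/my-lifecycle-policy
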
Create component templates

A data stream requires a matching index template. In most cases, you compose this index template using one or more component templates. You typically use separate component templates for mappings and index settings. This lets you reuse the component templates in multiple index templates.

When creating your component templates, include:

  • A date or date_nanos mapping for the @timestamp field. If you don’t specify a mapping, Elasticsearch maps @timestamp as a date field with default options.
  • Your lifecycle policy in the index.lifecycle.name index setting.
Tip

Use the Elastic Common Schema (ECS) when mapping your fields. ECS fields integrate with several Elastic Stack features by default.

If you’re unsure how to map your fields, use runtime fields to extract fields from unstructured content at search time. For example, you can index a log message to a wildcard field and later extract IP addresses and other data from this field during a search.
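For example, the following sketch defines a search-time runtime field that extracts the client IP from each log line with a grok pattern. It assumes the message field is mapped as a wildcard field (as in the component template below); the source.client.ip field name is only illustrative:

GET my-data-stream/_search
{
  "runtime_mappings": {
    "source.client.ip": {
      "type": "ip",
      "script": {
        "source": "String ip = grok('%{COMMONAPACHELOG}').extract(doc['message'].value)?.clientip; if (ip != null) emit(ip);"
      }
    }
  },
  "fields": [ "source.client.ip" ]
}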

To create a component template in Kibana:

  1. Go to the Index Management page using the navigation menu or the global search field.
  2. In the Index Templates tab, click Create component template.

Use an API to create a component template:

To create a component template for mappings, use this request:

PUT _component_template/my-mappings
{
  "template": {
    "mappings": {
      "properties": {
        "@timestamp": {
          "type": "date",
          "format": "date_optional_time||epoch_millis"
        },
        "message": {
          "type": "wildcard"
        }
      }
    }
  },
  "_meta": {
    "description": "Mappings for @timestamp and message fields",
    "my-custom-meta-field": "More arbitrary metadata"
  }
}

To create a component template for index settings, use this request:

PUT _component_template/my-settings
{
  "template": {
    "settings": {
      "index.lifecycle.name": "my-lifecycle-policy"
    }
  },
  "_meta": {
    "description": "Settings for ILM",
    "my-custom-meta-field": "More arbitrary metadata"
  }
}

Create an index template

Use your component templates to create an index template. Specify:

  • One or more index patterns that match the data stream’s name. We recommend using our data stream naming scheme.
  • That the template is data stream enabled.
  • Any component templates that contain your mappings and index settings.
  • A priority higher than 200 to avoid collisions with built-in templates. See Avoid index pattern collisions.

To create an index template in Kibana:

  1. Go to the Index Management page using the navigation menu or the global search field.
  2. In the Index Templates tab, click Create template.

Use an API to create an index template:

Include the data_stream object to enable data streams:

PUT _index_template/my-index-template
{
  "index_patterns": ["my-data-stream*"],
  "data_stream": { },
  "composed_of": [ "my-mappings", "my-settings" ],
  "priority": 500,
  "_meta": {
    "description": "Template for my time series data",
    "my-custom-meta-field": "More arbitrary metadata"
  }
}

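Optionally, before creating the stream, you can preview the combined settings and mappings that a matching stream would receive by simulating the template against a name that matches its index pattern (my-data-stream here is just an example name):

POST _index_template/_simulate_index/my-data-stream
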
Create the data stream

Indexing requests add documents to a data stream. These requests must use an op_type of create. Documents must include a @timestamp field.

To automatically create your data stream, submit an indexing request that targets the stream’s name. This name must match one of your index template’s index patterns.

PUT my-data-stream/_bulk
{ "create":{ } }
{ "@timestamp": "2099-05-06T16:21:15.000Z", "message": "192.0.2.42 - - [06/May/2099:16:21:15 +0000] \"GET /images/bg.jpg HTTP/1.0\" 200 24736" }
{ "create":{ } }
{ "@timestamp": "2099-05-06T16:25:42.000Z", "message": "192.0.2.255 - - [06/May/2099:16:25:42 +0000] \"GET /favicon.ico HTTP/1.0\" 200 3638" }

POST my-data-stream/_doc
{
  "@timestamp": "2099-05-06T16:21:15.000Z",
  "message": "192.0.2.42 - - [06/May/2099:16:21:15 +0000] \"GET /images/bg.jpg HTTP/1.0\" 200 24736"
}

You can also use an API to manually create the data stream:

PUT _data_stream/my-data-stream

After it's been created, you can view and manage this and other data streams from the Index Management view. Refer to Manage a data stream for details.
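For example, one common management task is a manual rollover, which forces new documents into a fresh backing index. Rollover is normally handled automatically by ILM or data stream lifecycle, so the request below is only a sketch of what the management APIs look like:

POST my-data-stream/_rollover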

Secure the data stream

Use index privileges to control access to a data stream. Granting privileges on a data stream grants the same privileges on its backing indices.

For an example, refer to Data stream privileges.
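As a minimal sketch, the following request uses the create or update roles API to define a role that grants read access to the data stream; the role name and the exact privileges are illustrative:

POST _security/role/my-data-stream-reader
{
  "indices": [
    {
      "names": [ "my-data-stream" ],
      "privileges": [ "read", "view_index_metadata" ]
    }
  ]
}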

Convert an index alias to a data stream

Prior to Elasticsearch 7.9, you'd typically use an index alias with a write index to manage time series data. Data streams replace this functionality, require less maintenance, and automatically integrate with data tiers.

During conversion, the alias's indices become hidden backing indices for the stream, and the alias's write index becomes the stream's write index. The stream still requires a matching index template with data streams enabled.

You can convert an index alias with a write index to a data stream with the same name using the migrate to data stream API:

POST _data_stream/_migrate/my-time-series-data

Get information about a data stream

You can review metadata about each data stream using the Kibana UI (visual overview) or the API (raw JSON).

To get information about a data stream in Kibana:

  1. Go to the Index Management page using the navigation menu or the global search field.
  2. In the Data Streams tab, click the data stream’s name.
Tip

Starting with Elasticsearch 9.2, you can use the Streams page to view the details of a data stream. The Streams page provides a centralized interface for managing your data in Kibana. Select a stream to view its details.

You can also use an API to get this information:

GET _data_stream/my-data-stream

Delete a data stream

You can delete a data stream and its backing indices via the Kibana UI or an API. To complete this action, you need the delete_index security privilege for the data stream.

To delete a data stream and its backing indices in Kibana:

  1. Go to the Index Management page using the navigation menu or the global search field.
  2. In the Data Streams view, click the trash can icon. The icon only displays if you have the delete_index security privilege for the data stream.

You can also use an API to delete a data stream:

DELETE _data_stream/my-data-stream