Analysis changes

The delimited_payload_filter is renamed

The delimited_payload_filter is renamed to delimited_payload. The old name is deprecated and will be removed in a future release, so it should be replaced with delimited_payload.
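As a sketch, a custom token filter that previously used delimited_payload_filter only needs its type switched to the new name. The index name, delimiter, and encoding below are illustrative assumptions, not part of the original note:

PUT my-index
{
  "settings": {
    "analysis": {
      "analyzer": {
        "payload_analyzer": {
          "tokenizer": "whitespace",
          "filter": [ "my_payloads" ]
        }
      },
      "filter": {
        "my_payloads": {
          "type": "delimited_payload",
          "delimiter": "|",
          "encoding": "float"
        }
      }
    }
  }
}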

Limiting the number of tokens produced by _analyze

To safeguard against out-of-memory errors, the number of tokens that can be produced using the _analyze endpoint has been limited to 10000. This default limit can be changed for a particular index with the index setting index.analyze.max_token_count.
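For example, the limit can be raised (or lowered) for an index when it is created; the index name and value here are illustrative assumptions:

PUT my-index
{
  "settings": {
    "index.analyze.max_token_count": 20000
  }
}

With this setting in place, an _analyze request against my-index may produce up to 20000 tokens before it is rejected.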

Limiting the length of an analyzed text during highlighting

Highlighting a text that was indexed without offsets or term vectors requires analyzing that text in memory, on the fly, during the search request. For large texts this analysis may take a substantial amount of time and memory. To protect against this, the maximum number of characters that will be analyzed has been limited to 10000. This default limit can be changed for a particular index with the index setting index.highlight.max_analyzed_offset.
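As with the token count limit, this is an index-level setting. A sketch of overriding it at index creation time, with an assumed index name and value:

PUT my-index
{
  "settings": {
    "index.highlight.max_analyzed_offset": 20000
  }
}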