Tokenizers

Tokenizers are used to break a string down into a stream of terms or tokens. A simple tokenizer might split the string into terms wherever it encounters whitespace or punctuation.
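As a rough illustration of this behavior (not Elasticsearch's actual implementation), a minimal whitespace-and-punctuation tokenizer could look like this in Python; the function name `simple_tokenize` is made up for the example:

```python
import re

def simple_tokenize(text):
    # Split on any run of characters that is not a word character
    # (i.e. whitespace or punctuation), dropping empty strings
    # produced at the boundaries of the split.
    return [term for term in re.split(r"[^\w]+", text) if term]

print(simple_tokenize("The QUICK brown-fox, jumped!"))
# → ['The', 'QUICK', 'brown', 'fox', 'jumped']
```

Real tokenizers also record metadata such as each token's position and character offsets, which later stages of analysis rely on.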

Elasticsearch has a number of built-in tokenizers which can be used to build custom analyzers.
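For example, a custom analyzer is defined in the index settings by naming a tokenizer and, optionally, token filters. The sketch below uses the built-in `standard` tokenizer with a `lowercase` filter; the index name `my_index` and analyzer name `my_custom_analyzer` are placeholders:

```json
PUT my_index
{
  "settings": {
    "analysis": {
      "analyzer": {
        "my_custom_analyzer": {
          "type": "custom",
          "tokenizer": "standard",
          "filter": [ "lowercase" ]
        }
      }
    }
  }
}
```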