IMPORTANT: No additional bug fixes or documentation updates
will be released for this version. For the latest information, see the
current release documentation.
Keyword Repeat Token Filter
The keyword_repeat token filter emits each incoming token twice: once
as a keyword and once as a non-keyword. This allows an unstemmed version
of a term to be indexed side by side with the stemmed version of the term.
Given the nature of this filter, each token that isn't transformed by a
subsequent stemmer will be indexed twice. Therefore, consider adding a
unique filter with only_on_same_position set to true to drop
unnecessary duplicates.
Here is an example:
index :
    analysis :
        analyzer :
            myAnalyzer :
                type : custom
                tokenizer : standard
                filter : [lowercase, keyword_repeat, porter_stem, unique_stem]
        filter :
            unique_stem :
                type : unique
                only_on_same_position : true
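
To see the effect of the filter chain, you can run sample text through the
analyzer with the analyze API. The sketch below assumes the settings above
belong to an index named myindex; the index name and the sample text are
illustrative:

    curl -XGET 'localhost:9200/myindex/_analyze?analyzer=myAnalyzer&text=jumping'

For the word "jumping", keyword_repeat emits two copies at the same position:
the keyword copy remains "jumping", while porter_stem reduces the other copy
to "jump", so both tokens are indexed. For a word whose stem is identical to
its original form, the two copies are exact duplicates at the same position,
and the unique_stem filter removes one of them.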