WARNING: Version 0.90 of Elasticsearch has passed its EOL date.
This documentation is no longer being maintained and may be removed. If you are running this version, we strongly advise you to upgrade. For the latest information, see the current release documentation.
There are several token filters available that normalize the special characters of a particular language.
You can currently choose between arabic_normalization and
persian_normalization in your token filter
configuration. For more information, check the Lucene
ArabicNormalizer or PersianNormalizer documentation.
Note: These filters are available since 0.90.2.
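As a minimal sketch of how such a filter is wired into an analyzer, the index settings below define a custom analyzer that applies persian_normalization after lowercasing. The analyzer name my_persian_analyzer is a hypothetical example, not part of the filter's API:

```json
{
  "settings": {
    "analysis": {
      "analyzer": {
        "my_persian_analyzer": {
          "type": "custom",
          "tokenizer": "standard",
          "filter": ["lowercase", "persian_normalization"]
        }
      }
    }
  }
}
```

The same pattern applies to arabic_normalization: list the filter name in the analyzer's filter chain, typically after lowercasing and before any stemming step.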