Hi Piyush, I suppose your end goal is to search special characters too, and I assume you are using this for typeahead.
The keyword tokenizer keeps the complete string as a single token, so a partial search won't match. You could add an n-gram filter after it: the output of the keyword tokenizer will then be broken into the configured grams, which should give you partial matching while still preserving the special characters. Please give it a try and let us know.

Regards,
Aman

On Mon, Oct 8, 2018, 10:24 Rathor, Piyush (US - Philadelphia) <prat...@deloitte.com> wrote:

> Hi All,
>
> I am trying to use "KeywordTokenizerFactory" so that special characters
> are considered in the search.
>
> But partial search does not work well with "KeywordTokenizerFactory".
> The partial-match results are better with "StandardTokenizerFactory".
>
> Field type – text_general
>
> Examples for both scenarios:
>
> Partial search parameter: Nah'
> Expected result on top: Nah’bir
>
> Partial search: shar
> Full name: Sharma
>
> Please let me know if there is something that can be done to cater for
> both special characters and partial matches together.
>
> Thanks & Regards
> Piyush R
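For reference, a minimal sketch of what such a field type could look like in the schema. The field type name and the gram sizes here are illustrative assumptions you would tune for your data; the n-gram filter is applied only at index time, so the query side stays a single lowercased keyword token that is matched against the indexed grams:

```
<!-- Hypothetical field type: keyword tokenizer + index-time n-grams.
     Name and gram sizes (2–15) are example values, not recommendations. -->
<fieldType name="text_keyword_ngram" class="solr.TextField" positionIncrementGap="100">
  <analyzer type="index">
    <!-- Keep the whole input (including punctuation) as one token -->
    <tokenizer class="solr.KeywordTokenizerFactory"/>
    <filter class="solr.LowerCaseFilterFactory"/>
    <!-- Break that single token into substrings so partial queries can match -->
    <filter class="solr.NGramFilterFactory" minGramSize="2" maxGramSize="15"/>
  </analyzer>
  <analyzer type="query">
    <tokenizer class="solr.KeywordTokenizerFactory"/>
    <filter class="solr.LowerCaseFilterFactory"/>
  </analyzer>
</fieldType>
```

With this setup a query like "shar" can match the indexed gram "shar" from "Sharma", and "nah'" can match a gram of "nah'bir", since the keyword tokenizer never splits on the apostrophe. One caveat for your first example: the query uses a straight apostrophe (') while "Nah’bir" contains a curly one (’), so you may also need a char filter (e.g. solr.MappingCharFilterFactory with a mapping file that normalizes quote variants) in both analyzers for those to match.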