Lucene, the underlying search engine library, imposes this limit (32766 bytes)
on each individual term, and the keyword tokenizer emits the entire field value
as a single term, so any value over roughly 32KB is rejected. Use tokenized
text instead; exact phrase queries still work on a tokenized field because
Lucene records term positions.
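
For example, a fieldType along these lines (a sketch; the names "text_phrase"
and "body" are illustrative, not from your schema) keeps every term far below
the limit while still supporting exact phrase matching:

  <fieldType name="text_phrase" class="solr.TextField" positionIncrementGap="100">
    <analyzer>
      <!-- StandardTokenizer splits the text into words, so no single
           term comes anywhere near the 32K byte cap -->
      <tokenizer class="solr.StandardTokenizerFactory"/>
      <!-- optional: remove this filter if matches must be case-sensitive -->
      <filter class="solr.LowerCaseFilterFactory"/>
    </analyzer>
  </fieldType>

  <field name="body" type="text_phrase" indexed="true" stored="true"/>

A query like q=body:"some exact phrase" then matches only documents in which
those words appear adjacent and in order.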

-- Jack Krupansky

On Thu, Jun 25, 2015 at 8:36 PM, Mike Thomsen <[email protected]>
wrote:

> I need to be able to do exact phrase searching on some documents that are a
> few hundred KB when treated as a single block of text. I'm on 4.10.4 and it
> complains when I try to index anything larger than 32KB in a TextField that
> uses the keyword tokenizer. Is there any way I can index, say, a 500KB block
> of text like this?
>
> Thanks,
>
> Mike
>
