WhitespaceTokenizer extends CharTokenizer, which has a hard-coded token length 
limit of 256 chars.  I think adding configurability for this should be fairly 
simple.  Patches welcome!
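
For anyone curious what the limit does in practice: a tokenizer with a length cap typically emits the buffered token as soon as the cap is reached and then starts a new token, so over-long input is split rather than silently truncated. Below is a small standalone sketch of that behavior with the cap made configurable — plain Java, not Lucene's actual CharTokenizer; the class name and the emit-and-continue splitting strategy are assumptions for illustration only:

```java
import java.util.ArrayList;
import java.util.List;

// Standalone illustration of a whitespace tokenizer with a
// configurable maximum token length. NOT Lucene's CharTokenizer;
// the splitting strategy here (emit and start a new token once
// the cap is hit) is an assumption for demonstration purposes.
public class CappedWhitespaceTokenizer {
    private final int maxTokenLen;

    public CappedWhitespaceTokenizer(int maxTokenLen) {
        this.maxTokenLen = maxTokenLen;
    }

    public List<String> tokenize(String input) {
        List<String> tokens = new ArrayList<>();
        StringBuilder current = new StringBuilder();
        for (int i = 0; i < input.length(); i++) {
            char c = input.charAt(i);
            if (Character.isWhitespace(c)) {
                // Whitespace ends the current token, if any.
                if (current.length() > 0) {
                    tokens.add(current.toString());
                    current.setLength(0);
                }
            } else {
                current.append(c);
                // Cap reached: emit this token and continue with a new one,
                // so a long run of characters is split, not truncated.
                if (current.length() == maxTokenLen) {
                    tokens.add(current.toString());
                    current.setLength(0);
                }
            }
        }
        if (current.length() > 0) {
            tokens.add(current.toString());
        }
        return tokens;
    }

    public static void main(String[] args) {
        // With a cap of 5, "extralongtoken" (14 chars) splits into 5 + 5 + 4.
        CappedWhitespaceTokenizer t = new CappedWhitespaceTokenizer(5);
        System.out.println(t.tokenize("hello extralongtoken"));
    }
}
```

With the real 256-char limit the same splitting would only show up on very long unbroken strings (URLs, base64 blobs, concatenated identifiers), which is usually how people first notice it.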

Steve

On Feb 7, 2013, at 8:14 AM, prakash_m16 <prakash_...@yahoo.com> wrote:

> Hi,
> 
> I would like to know how to increase the token length for
> WhitespaceTokenizer. The default length seems to be 256.
> 
> --
> View this message in context: 
> http://lucene.472066.n3.nabble.com/Increase-Token-length-for-white-space-tokenizer-tp4038981.html
> Sent from the Solr - User mailing list archive at Nabble.com.