On Tue, 3/15/11, Otis Gospodnetic wrote:
> From: Otis Gospodnetic
> Subject: Re: Tokenizing Chinese & multi-language search
> To: solr-user@lucene.apache.org
> Date: Tuesday, March 15, 2011, 11:51 PM
> Hi Andy,
>
> Is the "I don't know what language the query is in"
> Sematext :: http://sematext.com/ :: Solr - Lucene - Nutch
> Lucene ecosystem search :: http://search-lucene.com/
> ----- Original Message ----
> From: Andy
> To: solr-user@lucene.apache.org
> Sent: Tue, March 15, 2011 9:07:36 PM
> Subject: Tokenizing Chinese & multi-language search
>
> Hi,
>
> I remember reading on this list a while ago that Solr will only tokenize on
> whitespace even when using CJKAnalyzer. That would make Solr unusable for
> Chinese or any other language that doesn't use whitespace as a separator.
>
> 1) I remember reading about a workaround. Unfortunately I can't find th
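
For reference, CJKAnalyzer does not split only on whitespace; it emits overlapping character bigrams for CJK text. Below is a minimal sketch (not from the original thread) that prints the tokens CJKAnalyzer produces for a Chinese sentence containing no whitespace, assuming the Lucene 3.1-era contrib analyzers API; the field name "body" and the sample sentence are just placeholders.

import java.io.StringReader;

import org.apache.lucene.analysis.TokenStream;
import org.apache.lucene.analysis.cjk.CJKAnalyzer;
import org.apache.lucene.analysis.tokenattributes.CharTermAttribute;
import org.apache.lucene.util.Version;

// Prints the tokens CJKAnalyzer produces for a Chinese sentence with no whitespace.
public class CjkTokenDemo {
    public static void main(String[] args) throws Exception {
        CJKAnalyzer analyzer = new CJKAnalyzer(Version.LUCENE_31);
        TokenStream ts = analyzer.tokenStream("body", new StringReader("我爱北京天安门"));
        CharTermAttribute term = ts.addAttribute(CharTermAttribute.class);
        ts.reset();
        while (ts.incrementToken()) {
            // Expect overlapping bigrams such as 我爱, 爱北, 北京, ...
            System.out.println(term.toString());
        }
        ts.end();
        ts.close();
    }
}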