Hi,

The move is simple: LimitTokenCountAnalyzer is just a wrapper around any
other Analyzer, so the answer to your question is yes - without the
wrapper, all other analyzers are unlimited. If you previously used
myAnalyzer together with myMaxFieldLengthValue, you can change your code
as follows:
 
Before:
new IndexWriter(dir, new IndexWriterConfig(Version.LUCENE_34, myAnalyzer)
    .setFoo().setBar().setMaxFieldLength(myMaxFieldLengthValue));

After:
new IndexWriter(dir, new IndexWriterConfig(Version.LUCENE_34,
    new LimitTokenCountAnalyzer(myAnalyzer, myMaxFieldLengthValue))
    .setFoo().setBar());

You only have to do this on the indexing side; on the query side
(QueryParser), just use myAnalyzer without wrapping. With the new code,
the responsibility for cutting off a field after a specific number of
tokens was moved out of Lucene's indexing code. It is now an analysis
feature, not an indexing feature anymore.
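As a complete sketch (assuming Lucene 3.4 on the classpath; the class name,
the "body" field, StandardAnalyzer, and the limit of 10000 are illustrative
choices, not from your code), the wiring could look like this:

```java
import org.apache.lucene.analysis.Analyzer;
import org.apache.lucene.analysis.LimitTokenCountAnalyzer;
import org.apache.lucene.analysis.standard.StandardAnalyzer;
import org.apache.lucene.index.IndexWriter;
import org.apache.lucene.index.IndexWriterConfig;
import org.apache.lucene.queryParser.QueryParser;
import org.apache.lucene.store.Directory;
import org.apache.lucene.store.RAMDirectory;
import org.apache.lucene.util.Version;

public class LimitDemo {
    public static void main(String[] args) throws Exception {
        // Illustrative stand-ins for myAnalyzer / myMaxFieldLengthValue.
        Analyzer myAnalyzer = new StandardAnalyzer(Version.LUCENE_34);
        int myMaxFieldLengthValue = 10000; // tokens kept per field

        // Indexing side: wrap the analyzer so each field is cut off
        // after myMaxFieldLengthValue tokens during analysis.
        Directory dir = new RAMDirectory();
        IndexWriter writer = new IndexWriter(dir,
            new IndexWriterConfig(Version.LUCENE_34,
                new LimitTokenCountAnalyzer(myAnalyzer, myMaxFieldLengthValue)));

        // Query side: use the unwrapped analyzer, no limit needed here.
        QueryParser parser = new QueryParser(Version.LUCENE_34, "body",
            myAnalyzer);

        writer.close();
    }
}
```

Note that only the IndexWriterConfig sees the wrapped analyzer; the
QueryParser keeps the original one, so queries are analyzed without any
truncation.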

-----
Uwe Schindler
H.-H.-Meier-Allee 63, D-28213 Bremen
http://www.thetaphi.de
eMail: [email protected]

> -----Original Message-----
> From: Joe MA [mailto:[email protected]]
> Sent: Monday, November 28, 2011 8:09 AM
> To: [email protected]
> Subject: MaxFieldLength in Lucene 3.4
> 
> While upgrading to Lucene 3.4, I noticed the MaxFieldLength values on the
> indexers are deprecated.  There appears to be a LimitTokenCountAnalyzer
> that limits the tokens - so does that mean the default for all other
> analyzers is unlimited?
> 
> Thanks in advance -
> JM
