It's not clear what you want to achieve. I don't always create custom
TokenStreams, but when I do, I use Lucene's own implementations as a
prototype to start from.
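
As for the reuse symptom described below: the usual culprit is per-stream state that is not cleared in reset(). Here is a minimal sketch of a custom TokenFilter modeled on Lucene's own filters (FirstTokenOnlyFilter is a hypothetical example, not a Lucene class; the API shown is the Lucene 4.0 TokenStream workflow):

```java
import java.io.IOException;

import org.apache.lucene.analysis.TokenFilter;
import org.apache.lucene.analysis.TokenStream;
import org.apache.lucene.analysis.tokenattributes.CharTermAttribute;

/** Hypothetical filter that passes through only the first token of each stream. */
public final class FirstTokenOnlyFilter extends TokenFilter {
  private final CharTermAttribute termAtt = addAttribute(CharTermAttribute.class);
  // Per-stream state: must be cleared in reset(), or a reused instance
  // will "remember" the previous request and misbehave.
  private boolean emitted;

  public FirstTokenOnlyFilter(TokenStream input) {
    super(input);
  }

  @Override
  public boolean incrementToken() throws IOException {
    if (emitted) {
      return false;
    }
    if (input.incrementToken()) {
      emitted = true;
      return true;
    }
    return false;
  }

  @Override
  public void reset() throws IOException {
    super.reset();   // always delegate so the wrapped stream resets too
    emitted = false; // clear per-stream state for the next reuse
  }
}
```

A filter that "only works on a fresh instance" is almost always missing this kind of reset() override, since Solr calls the factory's create() once and then reuses the resulting chain via reset().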

On Mon, Oct 1, 2012 at 6:07 PM, Em <[email protected]> wrote:

> Hi Mikhail,
>
> thanks for your feedback.
>
> If so, how can I write UnitTests which respect the Reuse strategy?
> What's the recommended way when creating custom Tokenizers and
> TokenFilters?
>
> Kind regards,
> Em
>
> On 01.10.2012 10:54, Mikhail Khludnev wrote:
> > Hello,
> >
> > Analyzers are reused. An Analyzer is a Tokenizer plus several
> > TokenFilters. Check the source of org.apache.lucene.analysis.Analyzer
> > and pay attention to the reuseStrategy.
> >
> > Best regards
> >
> > On Sun, Sep 30, 2012 at 5:37 PM, Em <[email protected]> wrote:
> >
> >> Hello list,
> >>
> >> I saw a bug in a TokenFilter that only works if a fresh instance is
> >> created by the TokenFilterFactory, and it seems that TokenFilters are
> >> somehow reused across more than one request.
> >>
> >> So if your TokenFilterFactory has a logging statement in its
> >> create() method, you see that log only now and again, but not on
> >> every request.
> >>
> >> Is this a bug in Solr 4.0-BETA or is this expected behaviour?
> >> If it is expected, what could be wrong with the TokenFilter?
> >>
> >> Kind regards,
> >> Em
> >>
> >
> >
> >
>
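
P.S. On the unit-test question above: Lucene's test-framework module ships BaseTokenStreamTestCase, whose assertAnalyzesToReuse helper (in 4.0) analyzes through the Analyzer's reuse path, so calling it twice exercises exactly the reused-instance behavior. A sketch, assuming a hypothetical custom filter class MyCustomFilter that passes tokens through unchanged:

```java
import java.io.Reader;

import org.apache.lucene.analysis.Analyzer;
import org.apache.lucene.analysis.BaseTokenStreamTestCase;
import org.apache.lucene.analysis.MockTokenizer;
import org.apache.lucene.analysis.Tokenizer;

public class TestMyCustomFilter extends BaseTokenStreamTestCase {

  public void testReuse() throws Exception {
    Analyzer analyzer = new Analyzer() {
      @Override
      protected TokenStreamComponents createComponents(String fieldName, Reader reader) {
        // MockTokenizer is the test framework's tokenizer; wrap it with the
        // filter under test, just as your factory would in a real chain.
        Tokenizer tokenizer = new MockTokenizer(reader, MockTokenizer.WHITESPACE, false);
        return new TokenStreamComponents(tokenizer, new MyCustomFilter(tokenizer));
      }
    };
    // Two calls against the same Analyzer: the second one runs through the
    // reused, reset() components, which is where reuse bugs show up.
    assertAnalyzesToReuse(analyzer, "foo bar", new String[] { "foo", "bar" });
    assertAnalyzesToReuse(analyzer, "baz qux", new String[] { "baz", "qux" });
  }
}
```

If your Lucene version has no assertAnalyzesToReuse, calling assertAnalyzesTo twice with the same Analyzer instance has a similar effect, since analysis goes through the Analyzer's reuse strategy.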



-- 
Sincerely yours
Mikhail Khludnev
Tech Lead
Grid Dynamics

<http://www.griddynamics.com>
 <[email protected]>
