Ok, I might get what you are looking for. Extend SolrTestCaseJ4 (there are
plenty of samples in the codebase). Obtain a request via req(), obtain the schema
from it via getSchema(), then getAnalyzer() or getQueryAnalyzer(), and ask for
analysis via org.apache.lucene.analysis.Analyzer.tokenStream(String, Reader).
You'll fi
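A minimal sketch of that recipe, assuming a Lucene/Solr 4.x-era setup; the field
name "text" and the test config file names below are assumptions for illustration,
not something prescribed in this thread:

import java.io.StringReader;
import org.apache.lucene.analysis.Analyzer;
import org.apache.lucene.analysis.TokenStream;
import org.apache.lucene.analysis.tokenattributes.CharTermAttribute;
import org.apache.solr.SolrTestCaseJ4;
import org.apache.solr.request.SolrQueryRequest;
import org.apache.solr.schema.IndexSchema;
import org.junit.BeforeClass;
import org.junit.Test;

public class MyFilterAnalysisTest extends SolrTestCaseJ4 {

  @BeforeClass
  public static void beforeClass() throws Exception {
    // test config and schema names are assumptions
    initCore("solrconfig.xml", "schema.xml");
  }

  @Test
  public void testAnalysisOfTextField() throws Exception {
    SolrQueryRequest request = req();
    try {
      IndexSchema schema = request.getSchema();
      Analyzer analyzer = schema.getAnalyzer();  // or schema.getQueryAnalyzer()
      TokenStream ts = analyzer.tokenStream("text", new StringReader("Hello Solr"));
      CharTermAttribute term = ts.addAttribute(CharTermAttribute.class);
      ts.reset();
      while (ts.incrementToken()) {
        System.out.println(term.toString());     // inspect the produced tokens
      }
      ts.end();
      ts.close();
    } finally {
      request.close();
    }
  }
}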
That's exactly the way I do it when I have to write some custom stuff.
My problem is that I do not know how to bring an Analyzer's reuse feature
into a unit test, to see what happens when, for example, a TokenFilter
instance gets reused.
Some TokenFilter prototypes I've seen are state
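One way to exercise the reuse path, sketched under the same assumptions as the test
above (SolrTestCaseJ4 setup, a field named "text"), is simply to pull a token stream
from the same Analyzer instance twice, so the second pass runs on the reused
Tokenizer/TokenFilter chain:

  @Test
  public void testSameAnalyzerTwice() throws Exception {
    SolrQueryRequest request = req();
    try {
      Analyzer analyzer = request.getSchema().getAnalyzer();
      // Two passes over the same Analyzer instance: with the 4.x reuse strategy the
      // second pass runs on the reused components. (With pre-4.0 Analyzers,
      // reusableTokenStream() rather than tokenStream() was the reusing entry point.)
      for (int pass = 1; pass <= 2; pass++) {
        TokenStream ts = analyzer.tokenStream("text", new StringReader("reuse me please"));
        CharTermAttribute term = ts.addAttribute(CharTermAttribute.class);
        ts.reset();
        while (ts.incrementToken()) {
          System.out.println("pass " + pass + ": " + term);
        }
        ts.end();
        ts.close();
      }
    } finally {
      request.close();
    }
  }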
It's not clear what you want to achieve. I don't always create custom
TokenStreams, but if I do, I use Lucene's as a prototype to start from.
On Mon, Oct 1, 2012 at 6:07 PM, Em wrote:
> Hi Mikhail,
>
> thanks for your feedback.
>
> If so, how can I write UnitTests which respect the Reuse strategy?
Hi Mikhail,
thanks for your feedback.
If so, how can I write UnitTests which respect the Reuse strategy?
What's the recommended way when creating custom Tokenizers and TokenFilters?
Kind regards,
Em
On 01.10.2012 10:54, Mikhail Khludnev wrote:
> Hello,
>
> Analyzers are reused. An Analyzer is a Tokenizer plus several TokenFilters.
Hello,
Analyzers are reused. An Analyzer is a Tokenizer plus several TokenFilters. Check
the source of org.apache.lucene.analysis.Analyzer and pay attention to the
reuseStrategy.
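For illustration, this is roughly what that composition looks like in a Lucene
4.x-style custom Analyzer; the tokenizer and filter choices here are arbitrary
examples, not anything from this thread:

import java.io.Reader;
import org.apache.lucene.analysis.Analyzer;
import org.apache.lucene.analysis.TokenStream;
import org.apache.lucene.analysis.Tokenizer;
import org.apache.lucene.analysis.core.LowerCaseFilter;
import org.apache.lucene.analysis.core.WhitespaceTokenizer;
import org.apache.lucene.util.Version;

public class MyAnalyzer extends Analyzer {
  @Override
  protected TokenStreamComponents createComponents(String fieldName, Reader reader) {
    // Called only when the reuse strategy has no cached components for this call.
    Tokenizer source = new WhitespaceTokenizer(Version.LUCENE_40, reader);
    TokenStream sink = new LowerCaseFilter(Version.LUCENE_40, source);
    return new TokenStreamComponents(source, sink);
  }
}

After the first call, the same Tokenizer and TokenFilter instances are reset and fed
a new Reader for each subsequent request, which is why per-instance state has to be
cleared in reset().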
Best regards
On Sun, Sep 30, 2012 at 5:37 PM, Em wrote:
> Hello list,
>
> I saw a bug in a TokenFilter that only works if there is a fresh instance
Hello list,
I saw a bug in a TokenFilter that only works if there is a fresh instance
created by the TokenFilterFactory, and it seems that TokenFilters are
somehow reused for more than one request.
So, if your TokenFilterFactory has a logging statement in its create()
method, you see that log only once.
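For what it's worth, the classic way that symptom shows up is a filter that
initializes per-stream state only in its constructor; a hypothetical sketch (not the
filter from this report) of the pattern and its fix:

import java.io.IOException;
import org.apache.lucene.analysis.TokenFilter;
import org.apache.lucene.analysis.TokenStream;
import org.apache.lucene.analysis.tokenattributes.CharTermAttribute;

public final class MarkFirstTokenFilter extends TokenFilter {
  private final CharTermAttribute termAtt = addAttribute(CharTermAttribute.class);
  private boolean firstToken = true;   // initialized once, in the constructor

  public MarkFirstTokenFilter(TokenStream input) {
    super(input);
  }

  @Override
  public boolean incrementToken() throws IOException {
    if (!input.incrementToken()) {
      return false;
    }
    if (firstToken) {
      termAtt.append("_FIRST");        // marks the first token of the stream
      firstToken = false;
    }
    return true;
  }

  @Override
  public void reset() throws IOException {
    super.reset();
    firstToken = true;                 // without this line the filter "works" only
                                       // for the first stream, exactly the symptom
                                       // described above when instances are reused
  }
}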