Hi,
expungeDeletes (default false) is not done automatically through SolrJ.
Please see: https://issues.apache.org/jira/browse/SOLR-1487
During a segment merge, deleted terms are purged. That's why the problem
solved itself.
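A minimal SolrJ sketch of an explicit commit with expungeDeletes=true
(Solr 4.x API; the core URL is a placeholder):

import org.apache.solr.client.solrj.SolrServer;
import org.apache.solr.client.solrj.impl.HttpSolrServer;
import org.apache.solr.client.solrj.request.AbstractUpdateRequest;
import org.apache.solr.client.solrj.request.UpdateRequest;

public class ExpungeDeletesCommit {
  public static void main(String[] args) throws Exception {
    // Placeholder URL; point this at your own core.
    SolrServer server = new HttpSolrServer("http://localhost:8983/solr/collection1");

    // setAction adds commit=true plus waitFlush/waitSearcher; expungeDeletes
    // is not covered by that helper, so set it as an explicit parameter.
    UpdateRequest req = new UpdateRequest();
    req.setAction(AbstractUpdateRequest.ACTION.COMMIT, true, true);
    req.setParam("expungeDeletes", "true");
    req.process(server);

    server.shutdown();
  }
}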
Ahmet
Hi;
I suggest you look at the source code. NGramTokenizer.java has some
explanations in its comments, and they may help you.
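As a small sketch (Lucene 4.x API, made-up input string), something like
this prints the grams NGramTokenizer emits:

import java.io.StringReader;
import org.apache.lucene.analysis.ngram.NGramTokenizer;
import org.apache.lucene.analysis.tokenattributes.CharTermAttribute;
import org.apache.lucene.util.Version;

public class NGramDemo {
  public static void main(String[] args) throws Exception {
    // 2- and 3-character grams of "solr": so, sol, ol, olr, lr
    NGramTokenizer tokenizer =
        new NGramTokenizer(Version.LUCENE_47, new StringReader("solr"), 2, 3);
    CharTermAttribute term = tokenizer.addAttribute(CharTermAttribute.class);
    tokenizer.reset();
    while (tokenizer.incrementToken()) {
      System.out.println(term.toString());
    }
    tokenizer.end();
    tokenizer.close();
  }
}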
Thanks;
Furkan KAMACI
2014-03-11 16:06 GMT+02:00 epnRui :
> Hi Ahmet,
>
> I think expungeDeletes is done automatically through SolrJ, so I don't
> think it was that.
Hi Ahmet,
I think expungeDeletes is done automatically through SolrJ, so I don't
think it was that.
The problem apparently solved itself. I wonder if it has to do with an
automatic optimization of the Solr indexes?
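If it was background merging, I guess the explicit SolrJ equivalent would
be something like this sketch (the core URL is a placeholder):

import org.apache.solr.client.solrj.SolrServer;
import org.apache.solr.client.solrj.impl.HttpSolrServer;

public class OptimizeSketch {
  public static void main(String[] args) throws Exception {
    // Placeholder URL; point this at your own core.
    SolrServer server = new HttpSolrServer("http://localhost:8983/solr/collection1");

    // An explicit optimize force-merges segments, which also drops
    // documents that were flagged as deleted.
    server.optimize();

    server.shutdown();
  }
}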
Otherwise it was something similar to the XY problem :P
Thanks for the help!
--
Hi,
After you delete your document, did you commit with expungeDeletes=true?
Also please see: https://people.apache.org/~hossman/#xyproblem
Ahmet
On Friday, March 7, 2014 1:16 PM, epnRui wrote:
Hi iorixxx!
Thanks for replying. I managed to get by well enough without needing a
customized tokenizer implementation. That would be a pain in ...
Anyway, now I have another problem, which is related to the following:
- I had previously used replace chars and replace patterns, charfilters and
Hi Rui,
I think the ClassicTokenizerImpl.jflex file is a good start for understanding
tokenizers.
http://svn.apache.org/repos/asf/lucene/dev/trunk/lucene/analysis/common/src/java/org/apache/lucene/analysis/standard/ClassicTokenizerImpl.jflex
Please also see the other *.jflex files in the source tree.
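As a complement to reading the grammar, a rough sketch (Lucene 4.x API,
made-up input) that runs ClassicTokenizer so you can compare its output
with the .jflex rules:

import java.io.StringReader;
import org.apache.lucene.analysis.standard.ClassicTokenizer;
import org.apache.lucene.analysis.tokenattributes.CharTermAttribute;
import org.apache.lucene.util.Version;

public class ClassicTokenizerDemo {
  public static void main(String[] args) throws Exception {
    ClassicTokenizer tokenizer = new ClassicTokenizer(
        Version.LUCENE_47, new StringReader("Contact rui@example.com today"));
    CharTermAttribute term = tokenizer.addAttribute(CharTermAttribute.class);
    tokenizer.reset();
    while (tokenizer.incrementToken()) {
      // Prints: Contact, rui@example.com, today -- the e-mail address is
      // kept as a single token, one of the rules the .jflex grammar defines.
      System.out.println(term.toString());
    }
    tokenizer.end();
    tokenizer.close();
  }
}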
But usuall