Haven't really looked much into that. Here is a snippet from today's GC log,
if you wouldn't mind shedding any light on it:
2017-08-03T11:46:16.265-0400: 3200938.383: [GC (Allocation Failure)
2017-08-03T11:46:16.265-0400: 3200938.383: [ParNew
Desired survivor size 1966060336 bytes, new threshold
How long are your GC pauses? Those affect all queries, so they make the 99th
percentile slow even for queries that should be fast.
The G1 collector has helped our 99th percentile.
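For reference, switching to G1 on Java 8 is typically just a matter of overriding
GC_TUNE in solr.in.sh; a minimal sketch, with an illustrative pause target rather
than a recommendation for any particular heap size:

    # solr.in.sh -- sketch for Java 8; adjust the pause target for your heap
    GC_TUNE="-XX:+UseG1GC \
      -XX:+ParallelRefProcEnabled \
      -XX:MaxGCPauseMillis=250"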
wunder
Walter Underwood
wun...@wunderwood.org
http://observer.wunderwood.org/ (my blog)
> On Aug 3, 2017, at 8:48 AM,
Thanks, that's what I kind of expected. Still debating whether the space
increase is worth it. Right now I'm at 0.7% of searches taking longer than 10
seconds and 6% taking longer than 1 second (a rough way to tally those from the
request logs is sketched below), so when I see things like this in the
morning it bugs me a bit:
2017-08-02 11:50:48 : 58979/1000 secs : ("Rules
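For what it's worth, a rough sketch of counting those slow-query fractions from
the standard "QTime=" entries in a Solr request log (it assumes every matching
line is a search request; your log format may differ):

    import re
    import sys

    # Pull QTime (milliseconds) out of each request-log line,
    # e.g. "... status=0 QTime=123"
    QTIME = re.compile(r"QTime=(\d+)")

    times = []
    with open(sys.argv[1]) as log:
        for line in log:
            m = QTIME.search(line)
            if m:
                times.append(int(m.group(1)))

    if times:
        total = len(times)
        over_1s = sum(1 for t in times if t > 1000)
        over_10s = sum(1 for t in times if t > 10000)
        print("%d requests, %.1f%% over 1s, %.1f%% over 10s"
              % (total, 100.0 * over_1s / total, 100.0 * over_10s / total))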
bq: will that search still return results from the earlier documents
as well as the new ones
In a word, "no". By definition the analysis chain applied at index
time puts tokens in the index and that's all you have to search
against for the doc unless and until you re-index the document.
You reall
Hey all, I have yet to run an experiment to test this, but was wondering if
anyone knows the answer ahead of time.
If I have an index built with documents before implementing the CommonGrams
filter, then enable it, and start adding documents that have the
filter/tokenizer applied, will searches that
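For anyone finding this later, a minimal sketch of the kind of field type in
question (names and word files are illustrative, not from the original thread).
The index-side filter emits both the plain terms and the grams, while the
query-side filter replaces common-word pairs with gram terms only; those gram
terms simply don't exist in documents indexed before the change, which is why
the older documents won't match until they are re-indexed:

    <fieldType name="text_grams" class="solr.TextField" positionIncrementGap="100">
      <analyzer type="index">
        <tokenizer class="solr.StandardTokenizerFactory"/>
        <filter class="solr.LowerCaseFilterFactory"/>
        <filter class="solr.CommonGramsFilterFactory" words="commonwords.txt" ignoreCase="true"/>
      </analyzer>
      <analyzer type="query">
        <tokenizer class="solr.StandardTokenizerFactory"/>
        <filter class="solr.LowerCaseFilterFactory"/>
        <filter class="solr.CommonGramsQueryFilterFactory" words="commonwords.txt" ignoreCase="true"/>
      </analyzer>
    </fieldType>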