Hi,

Just summarizing:
I've experimented with different sizes for the filterCache and documentCache,
after removing any maxRamMB setting. Now the heap behaves as expected:
it grows, then GC (not a full one) kicks in multiple times and keeps the
used heap under control. Eventually a full GC may kick in and the size
goes down a little more.
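For reference, a purely size-bounded cache configuration (no maxRamMB) in
solrconfig.xml looks roughly like this - the size values here are only
illustrative, not the ones I actually use:

```xml
<!-- Caches bounded by entry count only; no maxRamMB attribute.
     Sizes are per core, so total memory scales with core count. -->
<filterCache class="solr.CaffeineCache"
             size="512"
             initialSize="512"
             autowarmCount="0"/>
<documentCache class="solr.CaffeineCache"
               size="1024"
               initialSize="1024"
               autowarmCount="0"/>
```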

Previously, with maxRamMB specified, the heap would grow considerably
(for a search returning about 300K docs) and afterwards it would not go
down again, even though those docs were never requested again. This did
not work well.

I looked at the heap dump and saw all the caches (filter, document - one
instance of each type per core), so if you have multiple shards you have to
be very careful when increasing the cache sizes, because they apply to each
core.
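To make the per-core multiplication concrete, here is a quick
back-of-the-envelope sketch. The cache names and sizes below are
hypothetical placeholders - plug in your own values from solrconfig.xml:

```python
# Hypothetical per-core cache sizes (max entries); substitute your own.
CACHES_PER_CORE = {
    "filterCache": 512,
    "queryResultCache": 512,
    "documentCache": 1024,
}

def totals(num_cores: int) -> tuple[int, int]:
    """Return (cache instances, max cached entries) across all cores.

    Each core gets its own instance of every configured cache, so both
    numbers scale linearly with the core count on the node.
    """
    instances = num_cores * len(CACHES_PER_CORE)
    entries = num_cores * sum(CACHES_PER_CORE.values())
    return instances, entries

instances, entries = totals(num_cores=5)
print(instances, entries)  # 15 instances, 10240 max entries
```

The point is simply that doubling a `size` attribute doubles the budget on
every core at once, not just on one.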

I still think something strange happens when a search returns a large
number of docs: G1GC didn't seem to handle that very well in some cases
(when maxRamMB was specified), but that may be the symptom rather than
the cause.
Thanks for the help.

Reinaldo

On Sat, Jun 27, 2020 at 4:29 AM Zisis T. <zist...@runbox.com> wrote:

> Hi Reinaldo,
>
> Glad that helped. I've had several sleepless nights with Solr clusters
> failing spectacularly in production because of that, but I still cannot say
> that the problem is completely gone.
>
> Did you check in the heap dump if you have cache memory leaks as described
> in https://issues.apache.org/jira/browse/SOLR-12743?
>
> Say you have 4 cache instances (filterCache, documentCache, etc.) per core
> and 5 Solr cores: you should not see more than 20 CaffeineCache instances
> in your dump.
>
> Unfortunately, I still cannot determine exactly what triggers this memory
> leak, although since I removed the maxRAMMB setting I haven't seen similar
> behavior in production for more than a month now.
>
> The weird thing is that I had been running Solr 7.5.0 for quite some time
> without any issues, and it was only at some point that those problems
> started appearing...
>
>
>
> --
> Sent from: https://lucene.472066.n3.nabble.com/Solr-User-f472068.html
>
