Shawn,

On 7/4/22 13:31, Shawn Heisey wrote:
On 7/4/22 03:01, Mike wrote:
My Solr index size is around 500GB and I have 64GB of RAM. Solr eats up all the memory and because of that PHP works very, very slowly. What can I do?

Solr is a Java program.  A Java program will never directly use more memory than you specify for the max heap size.

Uhh....

We cannot make any general recommendations about what heap size you need, because there is a good chance that any recommendation we make would be completely wrong for your install.  I did see that someone recommended not going above 31G ... and this is good advice.  At a heap of 32 GB or more, Java can no longer use compressed (32-bit) object pointers and switches to full 64-bit pointers.  So a heap size of 32 GB actually has LESS usable memory than a heap size of 31 GB.

Well, if you need more than 32GiB, I think the recommendation is to go MUCH HIGHER than 32GiB. If you have a 48GiB machine, maybe restrict to 31GiB of heap, but if you have a TiB, go for it :)
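For anyone who wants to verify the cutoff on their own JVM, here is a small sketch using the HotSpot diagnostic MXBean (assumes a HotSpot-based JVM; run it once with -Xmx31g and once with -Xmx32g to watch the flag flip):

```java
import java.lang.management.ManagementFactory;
import com.sun.management.HotSpotDiagnosticMXBean;

// Prints the max heap and whether compressed oops are in effect.
public class OopsCheck {
    public static void main(String[] args) {
        HotSpotDiagnosticMXBean bean =
            ManagementFactory.getPlatformMXBean(HotSpotDiagnosticMXBean.class);
        String compressed = bean.getVMOption("UseCompressedOops").getValue();
        long maxHeap = Runtime.getRuntime().maxMemory();
        System.out.println("Max heap (MiB): " + (maxHeap >> 20));
        System.out.println("UseCompressedOops: " + compressed);
    }
}
```

With -Xmx31g this should report the flag as true, and with -Xmx32g as false, which is exactly why 31G beats 32G.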

The OS will use additional memory beyond the heap for caching the index data, but that is completely outside of Solr's control.

This is why I said "uhh..." above: the JVM needs more memory than the heap. Sometimes as much as twice that amount, depending upon the workload of the application itself. Measure, measure, measure.
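As a starting point for that measurement, the standard memory MXBean shows heap versus non-heap (metaspace, code cache) usage from inside the process (a sketch; thread stacks and direct buffers are on top of even these numbers, so treat it as a lower bound):

```java
import java.lang.management.ManagementFactory;
import java.lang.management.MemoryMXBean;
import java.lang.management.MemoryUsage;

// The heap is only part of the JVM's footprint: getNonHeapMemoryUsage()
// covers metaspace and the code cache, and thread stacks plus direct
// buffers add still more that this bean does not see.
public class JvmFootprint {
    public static void main(String[] args) {
        MemoryMXBean mem = ManagementFactory.getMemoryMXBean();
        MemoryUsage heap = mem.getHeapMemoryUsage();
        MemoryUsage nonHeap = mem.getNonHeapMemoryUsage();
        System.out.printf("Heap used:     %d MiB%n", heap.getUsed() >> 20);
        System.out.printf("Non-heap used: %d MiB%n", nonHeap.getUsed() >> 20);
    }
}
```

For the full picture you still need to look at the process RSS from the OS side, which is the "measure, measure, measure" part.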

Note that 64GB total memory for a 500GB index is almost certainly not enough memory, ESPECIALLY if the same server is used for things other than Solr.
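A quick back-of-the-napkin check of why that is, using the numbers from this thread (the ~2 GiB set aside for the OS and other processes is my own assumption):

```java
// How much of a 500 GiB index can the page cache hold on a 64 GiB box
// running Solr with a 31 GiB heap? Heap and other overhead come straight
// off the top before any index data can be cached.
public class CacheHeadroom {
    public static void main(String[] args) {
        long totalGiB = 64, heapGiB = 31, osAndOtherGiB = 2, indexGiB = 500;
        long cacheGiB = totalGiB - heapGiB - osAndOtherGiB;
        double fraction = (double) cacheGiB / indexGiB;
        System.out.printf("Page cache available: %d GiB (%.0f%% of the index)%n",
                cacheGiB, fraction * 100);
    }
}
```

Only a single-digit percentage of the index can be resident, so most queries will hit disk, and running PHP on the same box shrinks that slice further.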

I'm interested to know what the relationship is between on-disk index size and in-memory index size. I would imagine that the on-disk artifacts are fairly slim (only storing what is necessary) and the in-memory representation has all kinds of "waste" (like pointers and all that). Has anyone done a back-of-the-napkin calculation to guess at the in-memory size of an index given the on-disk representation?

-chris
