On 4/14/2016 1:25 PM, Betsey Benagh wrote:
> bin/solr status shows the memory usage increasing, as does the admin ui.
>
> I'm running this on a shared machine that is supporting several other
> applications, so I can't be particularly greedy with memory usage.  Is
> there anything out there that gives guidelines on what an appropriate
> amount of heap is based on number of documents or whatever?  We're just
> playing around with it right now, but it sounds like we may need a
> different machine in order to load in all of the data we want to have
> available.

That means you're seeing the memory usage from Java's point of view. 
There will be three numbers in the admin UI.  The first is the actual
amount of memory used by the program right at that instant.  The second
is the highest amount of memory that has ever been allocated since the
program started.  The third is the maximum amount of memory that *can*
be allocated.  It's normal for the last two numbers to be the same and
the first number to fluctuate up and down.

From the operating system's point of view, the program will be using the
amount from the middle number on the admin UI, plus some overhead for
Java itself.
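
If you want to see where numbers like these come from, here's a minimal
sketch (not Solr's own code) that reads the equivalent figures from a
running JVM with the standard java.lang.management API.  The admin UI
may gather its numbers slightly differently, but the meaning is the same:

  import java.lang.management.ManagementFactory;
  import java.lang.management.MemoryUsage;

  public class HeapNumbers {
      public static void main(String[] args) {
          MemoryUsage heap =
              ManagementFactory.getMemoryMXBean().getHeapMemoryUsage();
          // Heap in use by live objects right now (fluctuates up and down):
          System.out.println("used:      " + heap.getUsed());
          // Heap the JVM has actually claimed from the OS -- in practice
          // roughly the peak, since it is rarely given back:
          System.out.println("committed: " + heap.getCommitted());
          // Ceiling set by -Xmx; the JVM will never grow past this:
          System.out.println("max:       " + heap.getMax());
      }
  }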

https://wiki.apache.org/solr/SolrPerformanceProblems#How_much_heap_space_do_I_need.3F

In addition to having enough heap memory, getting good performance will
require that you have additional memory in the system that is not
allocated to ANY program, which the OS can use to cache your index
data.  The total amount of memory that a well-tuned Solr server requires
often surprises people.  Running Solr with other applications on the
same server may not be a problem if your Solr server load is low and
your indexes are very small, but if your indexes are large and/or Solr
is very busy, those other applications might interfere with Solr
performance.
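
If you do need to cap the heap on that shared machine, the bin/solr
script has a -m option that sets both -Xms and -Xmx at startup, e.g.
"bin/solr start -m 1g" -- the 1g there is only an illustration, and the
wiki page above is the place to work out a realistic number for your
index.  Whatever you don't give to the heap stays available for the OS
disk cache.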

Thanks,
Shawn
