You're doing nothing wrong; that particular bit of advice has
always needed some explanation.

Solr (well, actually Lucene) uses MMapDirectory for much of
the index structure, which maps the files into OS memory rather
than loading them onto the JVM heap. See Uwe's excellent post:

http://blog.thetaphi.de/2012/07/use-lucenes-mmapdirectory-on-64bit.html
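For reference, Solr chooses the directory implementation via the
directoryFactory setting in solrconfig.xml. The default
(NRTCachingDirectoryFactory) already delegates to MMapDirectory on
64-bit JVMs, so you normally don't need to touch this; a minimal
fragment if you wanted to force it explicitly:

```xml
<!-- solrconfig.xml: explicitly select the memory-mapped directory.
     Usually unnecessary; the default factory uses mmap on 64-bit
     platforms anyway. -->
<directoryFactory name="DirectoryFactory"
                  class="solr.MMapDirectoryFactory"/>
```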

Plus, the size on disk includes the stored data, which lives in the *.fdt
files in data/index. Very little of the stored data is kept in the JVM
heap, so that's another reason your Java heap may be smaller than
your raw index size on disk.

The advice about fitting your entire index into memory really comes
with the following caveats (at least):
1> "memory" includes the OS memory available to the process
2> The size of the index on disk is misleading; the *.fdt files
     should be subtracted to get a truer picture.
3> Both Solr and Lucene create structures in the Java JVM
     that are _not_ reflected in the size on disk.

<1> and <2> mean the JVM memory necessary is smaller
than the size on disk.

<3> means the JVM memory will be larger than the size on disk
alone would suggest.
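To put a number on <2>, here's a rough sketch that subtracts the
stored-field files from the total on-disk size. The directory path and
the choice of extensions to exclude (*.fdt/*.fdx, the stored-field data
and its index) are assumptions for illustration; point it at your core's
data/index directory.

```python
import os

# Stored-field files: these are mostly read for returning results,
# not for searching, so they matter less for page-cache sizing.
STORED_FIELD_EXTS = (".fdt", ".fdx")

def index_sizes(index_dir):
    """Return (total bytes on disk, bytes excluding stored-field files)."""
    total = stored = 0
    for name in os.listdir(index_dir):
        path = os.path.join(index_dir, name)
        if not os.path.isfile(path):
            continue
        size = os.path.getsize(path)
        total += size
        if name.endswith(STORED_FIELD_EXTS):
            stored += size
    return total, total - stored
```

The second number is a truer estimate of how much of the index competes
for OS page cache during searches.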

So you're doing the right thing: testing and seeing what you
_really_ need. I'd pretty much take your test, add some
padding, and consider it good. You're _not_ doing the
really bad thing of using the same query over and over
again and hoping <G>.
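That last point can be sketched in a few lines: each simulated user
should issue queries that are unique over a window, so the test measures
real index work rather than cache hits. The term list and two-term query
shape here are made up for illustration; in practice you'd draw terms
from your own corpus.

```python
import random

def unique_queries(terms, count, seed=0):
    """Generate `count` distinct queries by pairing random terms.

    A deterministic seed keeps test runs repeatable while still
    avoiding repeated queries within a run.
    """
    rng = random.Random(seed)
    seen = set()
    queries = []
    while len(queries) < count:
        q = " ".join(rng.sample(terms, 2))
        if q not in seen:
            seen.add(q)
            queries.append(q)
    return queries
```

Feeding each load-test user a distinct slice of such a list keeps Solr's
query result cache from flattering your numbers.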

Best,
Erick


On Tue, Dec 8, 2015 at 11:54 AM, Steven White <swhite4...@gmail.com> wrote:
> Hi folks,
>
> My index size on disk (optimized) is 20 GB (single core, single index).  I
> have a system with 64 GB of RAM.  I start Solr with 24 GB of RAM.
>
> I have run load tests (up to 100 concurrent users) for hours where each
> user issues unique searches (the same search is never executed again for
> at least 30 minutes after it was last executed).  In all tests I run, Solr's
> JVM memory never goes over 10 GB (monitoring http://localhost:8983/).
>
> I read over and over, for optimal performance, Solr should be given enough
> RAM to hold the index in memory.  Well, I have done that and some but yet I
> don't see Solr using up that whole RAM.  What am I doing wrong?  Is my test
> at fault?  I doubled the test load (number of users) and didn't see much of
> a difference in RAM usage, but my search performance went down (takes
> about 40% longer now).  I ran my tests again, but this time with only 12 GB
> of RAM given to Solr.  Test results didn't differ much from the 24 GB run,
> and Solr never used more than 10 GB of RAM.
>
> Can someone help me understand this?  I don't want to give Solr RAM that it
> won't use.
>
> PS: These are purely search tests; there are no updates to the index at all.
>
> Thanks in advance.
>
> Steve
