On 3/11/2013 11:14 AM, Shawn Heisey wrote:
On 3/10/2013 8:00 PM, jimtronic wrote:
I'm having trouble tracking down some problems while load testing my
setup. If you saw these numbers on your dashboard, would they worry you?
Physical Memory: 97.6% (14.64 GB of 15.01 GB)
File Descriptor Count: 19.1% (196 of 1024)
JVM-Memory: 95% (1.67 GB dark gray, 1.76 GB med gray, 1.76 GB)
What OS? If it's a unix/linux environment, the full output of the
'free' command will be important. Generally speaking, it's normal for
any computer (client or server, regardless of OS) to use all available
memory when under load.
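On Linux, 'free' just reports what's in /proc/meminfo, and most of the
"used" memory there is normally the OS disk cache, which the kernel
hands back the moment an application asks for it. If it helps, here's a
minimal Java sketch (Linux-only, and just my rough reading of the
/proc/meminfo fields) that pulls out the numbers that matter:

    import java.io.IOException;
    import java.nio.file.Files;
    import java.nio.file.Paths;
    import java.util.HashMap;
    import java.util.Map;

    public class MemInfo {
        public static void main(String[] args) throws IOException {
            // Each line of /proc/meminfo looks like "MemTotal: 15405304 kB"
            Map<String, Long> kb = new HashMap<>();
            for (String line : Files.readAllLines(Paths.get("/proc/meminfo"))) {
                String[] parts = line.split("\\s+");
                kb.put(parts[0].replace(":", ""), Long.parseLong(parts[1]));
            }
            long totalMb = kb.get("MemTotal") / 1024;
            long freeMb = kb.get("MemFree") / 1024;
            // Buffers + Cached is memory the kernel will reclaim under pressure
            long cacheMb = (kb.get("Buffers") + kb.get("Cached")) / 1024;
            System.out.printf(
                    "total: %d MB, truly free: %d MB, reclaimable cache: %d MB%n",
                    totalMb, freeMb, cacheMb);
        }
    }

A box that shows 97% physical memory used but has most of that sitting
in cache is healthy; it's low cache plus heavy swap activity that would
worry me.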
Replying to myself. The cold must be getting to me. :)
If nothing is running on this server except Solr, and your
index is less than 15GB in size, these numbers would not worry me at
all. If your index is less than 30GB in size, you might still be OK,
but at that point your index would exceed available RAM. Chances are
that you would be able to cache enough of it for good performance,
depending on your schema. The reason I say this is that you have about
2GB of RAM given to Solr, leaving about 13-14GB for OS disk caching.
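For what it's worth, the three JVM-Memory figures on the dashboard line
up with the standard JMX heap numbers (used, committed, max); mapping
those onto the dark gray / med gray / outer bar is my assumption, but
you can print the same figures from inside any JVM like this:

    import java.lang.management.ManagementFactory;
    import java.lang.management.MemoryUsage;

    public class HeapNumbers {
        public static void main(String[] args) {
            MemoryUsage heap =
                    ManagementFactory.getMemoryMXBean().getHeapMemoryUsage();
            double gb = 1024.0 * 1024 * 1024;
            System.out.printf("used:      %.2f GB%n", heap.getUsed() / gb);
            System.out.printf("committed: %.2f GB%n", heap.getCommitted() / gb);
            // getMax() can return -1 if no limit was set with -Xmx
            System.out.printf("max:       %.2f GB%n", heap.getMax() / gb);
        }
    }

When you do the RAM math, subtract the max figure rather than used,
since the heap is allowed to grow to that size and will take that
memory away from the OS disk cache when it does.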
If the server is shared with other things, particularly a busy database
or busy web server, then the above paragraph might not apply - you may
not have enough resources for Solr to work effectively.
Thanks,
Shawn