Thanks Shawn,

It's more inquisitiveness now than anything.

http://web.lavoco.com/top.png

(forgot to mention MariaDB is on there too  :)



On 26/05/17 16:20, Shawn Heisey wrote:
On 5/26/2017 11:01 AM, Robert Brown wrote:
Let's assume I can't get more RAM - why would an index of no more than
1MB (on disk) need so much?

(without getting into why I'm using Solr on such a small index in the
first place  :)

My docs consist of 3 text fields for searching, all others are
strings/ints for facets and filtering, about 20 fields in total.

Currently just 500 docs.

If Solr were the only thing on the server, I feel fairly confident that
you would not be having any problems.  Although your heap is only at
256MB, Java itself requires memory to run, and that memory may be even
larger than 256MB.
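To see how much the JVM actually uses beyond the heap, you can compare Solr's resident memory against the -Xmx setting. A minimal sketch; the "pgrep -f solr" pattern is an assumption and may need adjusting to match however Solr is started on your system:

```shell
# Find the Solr process (pattern is an assumption -- adjust as needed)
PID=$(pgrep -f solr | head -n 1)

# Resident (RSS) and virtual (VSZ) memory in KB; RSS will typically be
# well above the 256MB heap because of JVM overhead (metaspace, threads,
# code cache, direct buffers, etc.)
[ -n "$PID" ] && ps -o rss=,vsz= -p "$PID"
```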

Webservers, particularly if they are running in a forked-process
paradigm rather than a multi-threaded paradigm (common with Apache),
tend to be VERY memory hungry.  I assume that nginx is threaded;
although a threaded webserver uses less memory than a forked-process
one, a busy site is still going to eat up a lot of memory.  With
only 2GB of memory, you should be limiting the number of idle
threads/processes the webserver will keep around, and you might want to
limit the number of simultaneous connections the webserver allows.
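As an illustration, the nginx limits look like this (the values are made-up examples for a small host, not tuned recommendations):

```nginx
# Illustrative settings for a 2GB host -- example values only
worker_processes  2;          # nginx is event-driven; a few workers suffice
events {
    worker_connections  512;  # caps simultaneous connections per worker
}
```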

Your Perl webapp is a complete unknown where memory usage is concerned.

If you run top, press shift-M to sort by memory, grab a screenshot, and
put that screenshot somewhere we can access it by URL, I'll be able to
see the overall memory usage of the server and at least tell you what's
happening.
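If a screenshot is awkward, the same memory-sorted view can be captured as plain text with ps (a one-shot sketch using procps; the line count is arbitrary):

```shell
# Equivalent to top sorted by memory (shift-M), as a one-shot command
# whose output is easy to paste into an email
ps aux --sort=-%mem | head -n 15
```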

The best recommendation I can make, even without that top screenshot, is
to add memory to the server, or to get a second server and dedicate it
to Solr.

Thanks,
Shawn

