On May 6, 2008, at 4:00 AM, Mike Klaas wrote:
On 3-May-08, at 10:06 AM, Daniel Andersson wrote:
How do I optimize Solr to better use all the RAM? I'm using java6,
64bit version, and start Solr using:
java -Xmx7500M -Xms4096M -jar start.jar
But according to top, it only seems to be using 7.7% of the memory
(around 600 MB).
Don't try to give Solr _all_ the memory on the system. Solr depends
on the index living in the OS's disk cache (this shows up as
"cached" in top). You should leave at least 2 GB of memory free for
a 3.5 GB index, depending on how much of the index is stored (best,
of course, is to have the full 3.5 GB available so the index can be
cached completely).
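As a rough sketch (assuming an 8 GB machine and the 3.5 GB index
above; the exact heap size is something to tune), that means
starting Solr with a smaller heap and leaving the rest to the OS:

# give the JVM ~2 GB; the remaining ~5-6 GB stays free for the
# OS disk cache
java -Xmx2048M -Xms2048M -jar start.jar

# watch the "cached" figure grow toward the index size as it is read
free -m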
Solr requires a wide distribution of queries to "warm up" (get the
index into the OS disk cache). This automatically prioritizes the
"hot spots" in the index. If you want to force the whole thing into
the cache, 'cd datadir; cat * > /dev/null' works, but I don't
recommend relying on that.
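If you'd rather warm up with real traffic, replaying a handful of
representative queries against the server does the same job and also
fills Solr's own caches. A sketch (the hostname, port, and field
values here are placeholders for your setup):

# fire a typical query; repeat with a range of make_id/model_id
# values so different parts of the index get touched
curl 'http://localhost:8983/solr/select?q=make_id:42+AND+model_id:7&sort=datetime_found+desc&rows=10'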
Ah. I've given it 4 GB of RAM now (Xmx=4 GB, Xms=2 GB).
Most queries are for make_id + model_id or city + state and almost
all of the queries are ordered by datetime_found (newest -> oldest).
How many documents match, typically? How many documents are
returned, typically? How often do you commit() [I suspect
frequently, based on the problems you are having]?
Average documents matched/found: 6,427.
We only return 10 documents per page.
We commit every 10,000 documents. I tried committing every 100,000
with 2 GB of RAM (1 GB dedicated to Solr) and it gave me an
OutOfMemoryError every time. I haven't tried increasing it since
moving to this new server.
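For reference, an explicit commit can be issued like this (a sketch,
assuming the stock Jetty example setup on port 8983; we may of
course be committing through a client library instead):

# issue an explicit commit after a batch of adds
curl http://localhost:8983/solr/update -H 'Content-Type: text/xml' --data-binary '<commit/>'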
Cheers,
Daniel