On 2/25/2013 4:06 AM, zqzuk wrote:
Hi

I am really frustrated by this problem.

I have built an index of 1.5 billion data records, about 170GB in size. It
has been optimised and consists of 12 files in the index directory, which
look like this:

_2.fdt            --- 58 GB
_2.fdx            --- 80 MB
_2.fnm            --- 900 bytes
_2.si             --- 380 bytes
_2_Lucene41_0.doc --- 46 GB
_2_Lucene41_0.pos --- 22 GB
_2_Lucene41_0.tim --- 37 GB
_2_Lucene41_0.tip --- 766 MB
_2_nrm.cfe        --- 139 bytes
_2_nrm.cfs        --- 5.7 GB
segments.gen      --- 20 bytes
segments_1        --- 68 bytes


It sits on a single server with 32GB of memory allocated to it, using the
default Solr settings that come with the Solr example in the distribution.

I started the server fine with 32GB of memory, but any query other than
"q=*:*" fails with the following out-of-memory exception:

When you say 32GB, are you saying that you have allocated 32GB to the Java heap, or that the machine has 32GB of total system RAM? 32GB total RAM would not be enough for this index.

If you are saying that 32GB is the Java heap, I would expect that to be OK - as long as the machine has at least 96GB of total system RAM (256GB would be ideal), and your Solr caches are not enormous.
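For what it's worth, with the stock Solr 4.x example the heap is simply whatever you pass to the JVM when you start it. A minimal sketch, assuming you start the example with start.jar (the 32g figure is only an illustration, not a recommendation):

  cd example
  java -Xms32g -Xmx32g -jar start.jar

Whatever RAM is left over above the heap is what the OS can use for its disk cache, and that cache is what keeps a 170GB index responsive.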

Is this a 64-bit operating system with a 64-bit Java? If it's not, then the largest Java heap you can allocate would be about 2GB, which would definitely not be enough.
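If you're not sure, a quick way to check (assuming java is on your PATH) is to run java -version and look for the 64-Bit marker in the last line:

  java -version
  java version "1.7.0_xx"
  Java(TM) SE Runtime Environment (build ...)
  Java HotSpot(TM) 64-Bit Server VM (build ..., mixed mode)

A 32-bit JVM reports "Client VM" or "Server VM" without the 64-Bit marker.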

The other reply that just came in from Timothy Potter is also correct. Put more memory on each machine and use more machines for better results.
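If you do spread the index across machines on Solr 4.x, the simplest form is manual sharding: put a slice of the data on each machine and fan the query out with the shards parameter. A rough sketch, with hypothetical host and core names:

  http://host1:8983/solr/core1/select?q=something
      &shards=host1:8983/solr/core1,host2:8983/solr/core1

SolrCloud automates the same idea, but either way each machine only has to cache its own slice of the index, which is where the memory relief comes from.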

Thanks,
Shawn
