On 6/4/2015 7:38 AM, Midas A wrote:
> On Thu, Jun 4, 2015 at 6:48 PM, Shawn Heisey <apa...@elyograg.org> wrote:
>
>> On 6/4/2015 5:15 AM, Midas A wrote:
>>> I have some indexing issue . While indexing IOwait is high in solr server
>>> and load also.
>> My first suspect here is that you don't have enough RAM for your index
>> size.
>>
>> * How many total docs is Solr handling (all cores)?
>>
>      --30,00000 dos
>
>> * What is the total size on disk of all your cores?
>>
>      --  600 GB
>
>> * How much RAM does the machine have?
>>
>      --48 GB
>
>> * What is the java max heap?
>      --30 GB (jvm)

Is that 3 million docs or 30 million docs?  The digits you typed work
out to 3 million, but the single comma right after the 30 makes me
wonder whether you meant 30 million, so I'm not sure which it is.
Either way, those documents must be quite large to produce a 600GB
index -- 30 million docs in my own index would only be about 30GB.

With 48GB of RAM, 30GB allocated to the Solr heap, and a 600GB index,
you don't have anywhere close to enough memory to cache your index
effectively.  Only 18GB is left over for the OS disk cache, which means
only about 3 percent of the index data can fit in it.  My rough guess
is that you need to fit somewhere between 25 and 50 percent of the
index into RAM, which for a 600GB index means something like 256GB of
total RAM; 128GB *might* be enough.  Alternatively, you could work on
making your index smaller -- but be aware that to improve performance
with low memory, you need to reduce the *indexed* part of the data;
shrinking the *stored* part makes little difference.
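
To illustrate that last point with a concrete (and entirely
hypothetical) example: if you have a large field that is only ever
returned in results and never searched, sorted, or faceted on, you can
take it out of the inverted index in schema.xml while keeping it
stored.  The field name and type below are made up, and a full reindex
is required after a change like this:

    <!-- Returned with results but never queried directly, so it does
         not need to be in the inverted index.  indexed="false" shrinks
         the part of the index that competes for OS disk cache, while
         stored="true" keeps the original text retrievable. -->
    <field name="raw_body" type="text_general" indexed="false" stored="true"/>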

Another potential problem with a 30GB heap is garbage collection
tuning.  If you haven't tuned GC at all, performance will be terrible
with a heap that large, especially while you are indexing.  The wiki
page I linked in my previous reply contains a link to my personal page,
which covers GC tuning:

https://wiki.apache.org/solr/ShawnHeisey
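
To give an idea of what that tuning looks like, it boils down to a set
of JVM options added to the command that starts Solr (or to
solr.in.sh, if you start Solr with the bin/solr script).  The lines
below are only a sketch using G1 -- the values are illustrative, not
the exact settings from that page, and any change should be checked
against your own GC logs:

    -Xms30g -Xmx30g
    -XX:+UseG1GC
    -XX:+ParallelRefProcEnabled
    -XX:G1HeapRegionSize=8m
    -XX:MaxGCPauseMillis=250
    -XX:+PrintGCDetails -Xloggc:solr_gc.log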

Thanks,
Shawn
