On 6/5/2013 3:08 AM, Raheel Hasan wrote:
> Hi,
> 
> I am trying to index a heavy dataset in which one particular field is
> very large...
> 
> However, as soon as I start indexing, I get a memory warning and a
> rollback (OutOfMemoryError).  I have learned that we can use the
> -Xmx1024m option on the java command that starts Solr to allocate more
> memory to the heap.
> 
> My question is: since this could also become insufficient later, is the
> issue related to caching?
> 
> Here is my cache block in solrconfig.xml:
> 
> <filterCache class="solr.FastLRUCache"
>                  size="512"
>                  initialSize="512"
>                  autowarmCount="0"/>
> 
> <queryResultCache class="solr.LRUCache"
>                      size="512"
>                      initialSize="512"
>                      autowarmCount="0"/>
> 
> <documentCache class="solr.LRUCache"
>                    size="512"
>                    initialSize="512"
>                    autowarmCount="0"/>
> 
> I am thinking that maybe I need to turn off the documentCache.
> Does anyone have a better idea?  Or perhaps there is another issue here?
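
For reference, the -Xmx option goes on the java command that starts
Solr.  With the stock Jetty example that ships with Solr 4.x, that
would look something like this (1024m is just the value you mentioned,
not a recommendation):

  cd example
  java -Xmx1024m -jar start.jar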

Exactly how big is this field?  Do you need this giant field returned
with your results, or is it just there for searching?
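
If the field is only there for searching, one option is to stop storing
it, so it is never returned in results and never held in the
documentCache.  A sketch of what that would look like in schema.xml
(the field name and type here are made up for illustration):

  <field name="bigtext" type="text_general" indexed="true"
         stored="false"/>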

Caches of size 512, especially with autowarming disabled, are probably
not a major cause for concern, unless the big field is large enough
that 512 cached copies of it add up to a huge amount of memory.  If
that's the case, I would reduce the size of your documentCache, not
turn it off.
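
A reduced documentCache would look like this in solrconfig.xml (64 is
purely illustrative -- pick a number based on how large your documents
actually are):

  <documentCache class="solr.LRUCache"
                 size="64"
                 initialSize="64"
                 autowarmCount="0"/>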

The value of ramBufferSizeMB elsewhere in your config is more likely to
affect how much RAM gets used during indexing.  The default for this
setting as of Solr 4.1.0 is 100.  Most people can safely reduce it.
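
That setting lives in the indexConfig section of solrconfig.xml; a
lower value would look something like this (32 is just an example, not
a tuned recommendation):

  <indexConfig>
    <ramBufferSizeMB>32</ramBufferSizeMB>
  </indexConfig>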

I'm writing a reply to another thread in which you are participating,
with info that will likely be useful to you too.  Look for that.

Thanks,
Shawn
