Yes, I am using TIKA for content extraction. The xlsx file size is 25MB. Is there any other option to resolve the OOM issue rather than increasing the RAM?
Can we change some other configuration param of Solr to avoid the OOM issue?

Are you using Tika to do the extraction of content? You might be getting OOM because of a huge xlsx file. Try having bigger RAM and you might not get the issue.

On Mon, Sep 12, 2011 at 12:44 PM, abhijit bashetti <abhijitbashe...@gmail.com> wrote:
> Hi,
>
> I am getting the OOM error.
>
> I am working with multi-core for Solr. I am using DIH for indexing. I have
> also integrated TIKA for content extraction.
>
> I am using ORACLE 10g DB.
>
> In the solrconfig.xml, I have added:
>
> <filterCache class="solr.FastLRUCache"
>              size="512"
>              initialSize="512"
>              autowarmCount="0"/>
>
> <queryResultCache class="solr.LRUCache"
>                   size="512"
>                   initialSize="512"
>                   autowarmCount="0"/>
>
> <documentCache class="solr.LRUCache"
>                size="512"
>                initialSize="512"
>                autowarmCount="0"/>
>
> <lockType>native</lockType>
>
> My indexing server is on Linux with 8GB of RAM.
> I am indexing a huge document set. There are 10 cores, and every core has
> 300,000 documents.
>
> I got the OOM error for an xlsx document which is 25MB in size.
>
> On the indexing server, I am doing indexing (first-time indexing for a
> newly added core), re-indexing, and searching also.
>
> Do I need to create multiple Solr webapps to resolve the issues?
>
> Or do I need to add more RAM to the system so as to avoid OOM?
>
> Regards,
> Abhijit
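For what it's worth, besides the cache sizes already shown above, the main indexing-side memory knob in solrconfig.xml is ramBufferSizeMB, which controls how much index data Solr buffers in the heap before flushing a segment to disk. A minimal sketch of a conservative setting (assuming a Solr 3.x-style config section; the exact value of 32 here is illustrative, not a recommendation) might look like:

```xml
<indexDefaults>
  <!-- Flush in-memory index segments to disk once the buffer reaches this
       size. Smaller values trade some indexing speed for a lower steady
       heap footprint during bulk indexing. -->
  <ramBufferSizeMB>32</ramBufferSizeMB>
</indexDefaults>
```

Note the hedge: this only bounds the index-writer buffer. If the OOM happens inside Tika while parsing the 25MB xlsx (the POI-based parser tends to hold the expanded spreadsheet in memory, which can be many times the on-disk size), shrinking caches or the RAM buffer may not be enough on its own, and a larger JVM heap (-Xmx) may still be needed for that one document.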