...the finished document to Solr.
Best
Erick
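
A minimal sketch of that last step (pushing an already-built document to Solr from client code) with SolrJ; the URL and field names here are invented, and the API shown is the Solr 3.x-era SolrJ client:

    import org.apache.solr.client.solrj.SolrServer;
    import org.apache.solr.client.solrj.impl.CommonsHttpSolrServer;
    import org.apache.solr.common.SolrInputDocument;

    public class SendFinishedDoc {
        public static void main(String[] args) throws Exception {
            // Placeholder core URL
            SolrServer server = new CommonsHttpSolrServer("http://localhost:8983/solr/core0");
            SolrInputDocument doc = new SolrInputDocument();
            doc.addField("id", "doc-1");   // placeholder schema fields
            doc.addField("content", "text extracted earlier, e.g. with Tika");
            server.add(doc);
            server.commit();
        }
    }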
On Mon, Sep 12, 2011 at 5:42 AM, Manish Bafna wrote:
Reducing the cache sizes is definitely going to reduce heap usage.
Can you run those xlsx files separately with Tika and see if you are getting
the OOM issue?
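
If it helps, a minimal sketch of such a standalone test using Tika's facade class (the class name is made up; it needs the tika-app jar on the classpath, and is best run with the same -Xmx as the Solr JVM so the comparison is fair):

    import java.io.File;
    import org.apache.tika.Tika;

    public class TikaXlsxTest {
        public static void main(String[] args) throws Exception {
            // Parse the spreadsheet on its own, outside Solr/DIH,
            // to see whether extraction alone exhausts the heap.
            Tika tika = new Tika();
            String text = tika.parseToString(new File(args[0]));
            System.out.println("extracted " + text.length() + " chars");
        }
    }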
On Mon, Sep 12, 2011 at 3:09 PM, abhijit bashetti wrote:
I am facing the OOM issue.
Other than increasing the RAM, can we change some other parameters to
avoid the OOM issue, such as minimizing the filter cache size, document
cache size, etc. (see the solrconfig.xml sketch after this message)?
Can you suggest some other options to avoid the OOM issue?
Thanks in advance!
Regards,
Abhijit
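
For reference, the caches asked about above are configured in solrconfig.xml; a sketch with purely illustrative sizes (the right numbers depend on the query load), noting that these caches hold search-time data and do nothing for Tika's parse-time allocations:

    <!-- Smaller size/autowarmCount values keep less on the heap;
         all numbers below are placeholders. -->
    <filterCache class="solr.FastLRUCache" size="64" initialSize="64" autowarmCount="0"/>
    <queryResultCache class="solr.LRUCache" size="64" initialSize="64" autowarmCount="0"/>
    <documentCache class="solr.LRUCache" size="64" initialSize="64"/>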
Are you using Tika to do the extraction of content?
You might be getting OOM because of a huge xlsx file.
Try having bigger RAM and you might not get the issue.
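
In JVM terms, "bigger RAM" means a larger heap set with the -Xmx flag; a sketch for the stock Jetty start, where the 4g figure is purely illustrative:

    java -Xmx4g -jar start.jar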
On Mon, Sep 12, 2011 at 12:44 PM, abhijit bashetti <abhijitbashe...@gmail.com> wrote:
Hi,
I am getting the OOM error.
I am working with multi-core Solr. I am using DIH for indexing. I have
also integrated Tika for content extraction.
I am using an Oracle 10g DB.
In the solrconfig.xml, I have added <lockType>native</lockType>.
My indexing server is on Linux with 8GB of RAM.
I am indexing ...
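
From the setup described (DIH pulling rows from Oracle, Tika extracting file content), the data-config.xml presumably chains a JDBC entity to a TikaEntityProcessor; a sketch under that assumption, in which every name, query, and path is invented for illustration:

    <dataConfig>
      <dataSource name="db"  type="JdbcDataSource"
                  driver="oracle.jdbc.driver.OracleDriver"
                  url="jdbc:oracle:thin:@//dbhost:1521/orcl"
                  user="user" password="pass"/>
      <dataSource name="bin" type="BinFileDataSource"/>
      <document>
        <entity name="item" dataSource="db"
                query="SELECT id, file_path FROM documents">
          <field column="id" name="id"/>
          <!-- hand each row's file to Tika for text extraction -->
          <entity name="tika" processor="TikaEntityProcessor"
                  dataSource="bin" url="${item.file_path}" format="text">
            <field column="text" name="content"/>
          </entity>
        </entity>
      </document>
    </dataConfig>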