Re: Core failure when a lot of processes are indexing

2014-05-06 Thread Hakim Benoudjit
Thanks Erick for the explanation. I'll set my autocommit max time to 30 seconds then. But I can leave the soft commit max time at a quarter of an hour, since it's an ads platform which needs to be updated regularly.
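For reference, here is a minimal solrconfig.xml sketch of the settings described above (30-second hard commit, 15-minute soft commit). The openSearcher=false flag is an assumption taken from the article Erick linked, not something stated in this message:

    <!-- Hard commit: flushes index data to disk and truncates the
         transaction log; with openSearcher=false it does not change
         what searches can see. -->
    <autoCommit>
      <maxTime>30000</maxTime>            <!-- 30 seconds -->
      <openSearcher>false</openSearcher>
    </autoCommit>

    <!-- Soft commit: opens a new searcher so recently indexed ads
         become visible. -->
    <autoSoftCommit>
      <maxTime>900000</maxTime>           <!-- 15 minutes -->
    </autoSoftCommit>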

Re: Core failure when a lot of processes are indexing

2014-05-05 Thread Erick Erickson
Take a look through the article I linked; 5 minutes may be an issue, since the transaction log will hold all 5 minutes' worth of input. In batch processes this can be quite a bit of data. Worse, when a Solr instance terminates unexpectedly, the entire transaction log can be replayed. Consider setting it shorter.
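To make the transaction-log point concrete: everything indexed since the last hard commit stays in the tlog, and an unclean shutdown replays all of it on startup, so a 5-minute hard-commit interval can mean up to 5 minutes of replay. A hedged sketch of a tighter bound follows; maxDocs is a standard autoCommit option, but the specific values here are illustrative assumptions, not a recommendation made in this thread:

    <autoCommit>
      <maxTime>60000</maxTime>    <!-- hard commit at least every 60 seconds... -->
      <maxDocs>10000</maxDocs>    <!-- ...or every 10,000 documents, whichever comes first -->
      <openSearcher>false</openSearcher>
    </autoCommit>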

Re: Core failure when a lot of processes are indexing

2014-05-05 Thread Hakim Benoudjit
I've tried it & it worked: I let Solr do the commit instead of my Solr client. In solrconfig.xml, the autoCommit maxTime has been set to 5 minutes & the autoSoftCommit maxTime to something bigger. Thanks a lot guys!

Re: Core failure when a lot of processes are indexing

2014-05-05 Thread Erick Erickson
You should not be committing from the client by and large; use the <autoCommit> and <autoSoftCommit> options in solrconfig.xml. See: http://searchhub.org/2013/08/23/understanding-transaction-logs-softcommit-and-commit-in-sorlcloud/ Best, Erick
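A sketch of the two elements referred to above, with placeholder intervals (the right values depend on the application and are not specified in this message); the comments summarize the division of labor explained in the linked article:

    <!-- Hard commit: durability. Writes segments to disk and truncates
         the transaction log without opening a new searcher. -->
    <autoCommit>
      <maxTime>${solr.autoCommit.maxTime:15000}</maxTime>
      <openSearcher>false</openSearcher>
    </autoCommit>

    <!-- Soft commit: visibility. Opens a new searcher so newly added
         documents become searchable. -->
    <autoSoftCommit>
      <maxTime>${solr.autoSoftCommit.maxTime:60000}</maxTime>
    </autoSoftCommit>

With these in place, the indexing scripts only add documents and never issue explicit commits.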

Re: Core failure when a lot of processes are indexing

2014-05-05 Thread Hakim Benoudjit
Is there an option in Solr (solrconfig.xml or somewhere else) to regularize commits to the index? I mean something like a 'sleep' between each commit to the index, while the data to be indexed waits in a stack.

Re: Core failure when a lot of processes are indexing

2014-05-05 Thread Hakim Benoudjit
The index was built with the same version of Solr that is searching it (4.6.0), and the config files (solrconfig.xml & schema.xml) are the same too. The only way I've found to solve this issue is to let only one process index at a time. Wouldn't a message queue layer resolve this issue?

Re: Core failure when a lot of processes are indexing

2014-05-04 Thread Shawn Heisey
On 5/4/2014 9:30 AM, Hakim Benoudjit wrote: > Ok. These files contain what you've requested: > > First (the xml error): http://pastebin.com/ZcagK3T7 > Second (java params): http://pastebin.com/JtWQpp6s > Third (Solr version): http://pastebin.com/wYdpdsAW Are you running with an index originally built by a different version of Solr?

Re: Core failure when a lot of processes are indexing

2014-05-04 Thread Hakim Benoudjit
Ok. These files contain what you've requested: First (the xml error): http://pastebin.com/ZcagK3T7 Second (java params): http://pastebin.com/JtWQpp6s Third (Solr version): http://pastebin.com/wYdpdsAW

Re: Core failure when a lot of processes are indexing

2014-05-04 Thread Shawn Heisey
On 5/4/2014 6:06 AM, Hakim Benoudjit wrote: > I have a lot of scripts running concurrently & indexing into the same index > (collection). This throws the following error: > > SolrCore 'collection2' is not available due to init failure: Error opening > new searcher: org.apache.solr.common.SolrException