We are using Solr out of the box, with only a couple of changes in the
solrconfig.xml file.

We are running a MapReduce job to import the data into Solr. Each map creates
one document and originally added and committed it to Solr itself. That gave us
org.apache.solr.common.SolrException:
Error_opening_new_searcher_exceeded_limit_of_maxWarmingSearchers4_try_again_later,
which we solved by removing the commit call from the MR job and enabling
auto-commit in solrconfig instead.
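
For reference, the auto-commit section we added to solrconfig looks roughly
like the following (the maxDocs/maxTime thresholds shown here are only
illustrative values, not necessarily the ones we settled on):

  <updateHandler class="solr.DirectUpdateHandler2">
    <!-- commit on the server instead of from each mapper -->
    <autoCommit>
      <maxDocs>10000</maxDocs>  <!-- commit after this many pending docs -->
      <maxTime>60000</maxTime>  <!-- or after this many milliseconds -->
    </autoCommit>
  </updateHandler>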

We reran the job and got another exception:

java.io.IOException: cannot read directory
org.apache.lucene.store.FSDirectory@/home/solr/src/apache-solr-nightly/example/solr/data/index:
list() returned null

followed by:

SEVERE: org.apache.lucene.store.LockObtainFailedException: Lock obtain timed
out: SingleInstanceLock: write.lock

This happened when six mappers were writing to Solr at the same time; once we
lowered the number of mappers to three, everything worked fine.
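
For context, each mapper does roughly the following (a minimal sketch of our
setup; the class name, field names and Solr URL are made up for illustration,
and the exact signatures depend on the Hadoop and SolrJ versions in use):

  import java.io.IOException;

  import org.apache.hadoop.io.LongWritable;
  import org.apache.hadoop.io.NullWritable;
  import org.apache.hadoop.io.Text;
  import org.apache.hadoop.mapreduce.Mapper;
  import org.apache.solr.client.solrj.SolrServerException;
  import org.apache.solr.client.solrj.impl.CommonsHttpSolrServer;
  import org.apache.solr.common.SolrInputDocument;

  public class SolrImportMapper
      extends Mapper<LongWritable, Text, NullWritable, NullWritable> {

    private CommonsHttpSolrServer solr;

    @Override
    protected void setup(Context context) throws IOException {
      // One SolrJ client per mapper, all pointing at the same Solr instance.
      solr = new CommonsHttpSolrServer("http://solrhost:8983/solr");
    }

    @Override
    protected void map(LongWritable key, Text value, Context context)
        throws IOException, InterruptedException {
      // Build one document per input record and send it to Solr over HTTP.
      SolrInputDocument doc = new SolrInputDocument();
      doc.addField("id", key.get());
      doc.addField("text", value.toString());
      try {
        solr.add(doc);
        // No commit here any more; the autoCommit settings in solrconfig
        // take care of committing.
      } catch (SolrServerException e) {
        throw new IOException(e);
      }
    }
  }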

Does anyone know what is going on, and how we can run more than three mappers
against Solr at the same time?

Regards,
Erik
