Setting:
<mergeFactor>2</mergeFactor>

may help. Have you tried it? Indexing will be a bit slower, but optimizing
will be faster, and the index will hold fewer segments (and therefore fewer
open files) at any given time.
You can check with lsof how many files jetty/tomcat (or whichever server
you are using) is holding.
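For reference, a minimal sketch of where that setting lives in solrconfig.xml (Solr 1.x layout; the surrounding maxBufferedDocs value here is just an illustrative assumption, tune it for your setup):

```xml
<!-- sketch: solrconfig.xml, inside the <indexDefaults> section -->
<indexDefaults>
  <!-- merge segments aggressively: fewer segments means fewer open files -->
  <mergeFactor>2</mergeFactor>
  <!-- buffering more docs in RAM can offset some of the indexing slowdown -->
  <maxBufferedDocs>1000</maxBufferedDocs>
</indexDefaults>
```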
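Something along these lines (SOLR_PID is a placeholder; substitute the real jetty/tomcat pid, e.g. from `pgrep -f jetty` - the current shell's pid is used below just so the sketch runs):

```shell
# Sketch: count the open file descriptors held by the servlet container.
SOLR_PID=$$                     # placeholder: use your jetty/tomcat pid here
# with lsof (one line per open file, plus a header):
#   lsof -p "$SOLR_PID" | wc -l
# on Linux, /proc gives the same count without lsof:
OPEN_FILES=$(ls "/proc/$SOLR_PID/fd" | wc -l)
echo "pid $SOLR_PID holds $OPEN_FILES open file descriptors"
# and the per-process limit that the exception is running up against:
ulimit -n
```

If the count is creeping toward the `ulimit -n` value while indexing, the segment files (multiplied by the searches holding readers open on the second core) are the likely culprit.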


Bruno Aranda wrote:
> 
> Hi,
> 
> We are having a TooManyOpenFiles exception in our indexing process. We
> are reading data from a database and indexing this data into one of
> the two cores of our solr instance. Each of the cores has a different
> schema, as they are used for different purposes. While we index into the
> first core, we do many searches in the second core, as it contains data
> to "enrich" what we index (the second core is never modified - read
> only). After indexing about 50,000 documents (about 300 fields each)
> we get the exception. If we run the same process, but without the
> "enrichment" (not doing queries in the second core), everything goes
> all right.
> We are using spring batch, and we only commit+optimize at the very
> end, as we don't need to search anything in the data that is being
> indexed.
> 
> I have seen recommendations ranging from committing+optimizing more
> often to lowering the merge factor. How does the merge factor affect
> this scenario?
> 
> Thanks,
> 
> Bruno
> 
> 

-- 
View this message in context: 
http://www.nabble.com/TooManyOpenFiles%3A-indexing-in-one-core%2C-doing-many-searches-at-the--same-time-in-another-tp24478812p24479144.html
Sent from the Solr - User mailing list archive at Nabble.com.