Hi,

We are getting a TooManyOpenFiles exception in our indexing process. We
read data from a database and index it into one of the two cores of our
Solr instance. Each core has a different schema, as they are used for
different purposes. While we index into the first core, we run many
searches against the second core, as it contains data used to "enrich"
what we index (the second core is never modified - read only). After
indexing about 50,000 documents (about 300 fields each) we get the
exception. If we run the same process without the "enrichment" (no
queries against the second core), everything works fine.
We are using Spring Batch, and we only commit+optimize once, at the very
end, since we don't need to search the data while it is being indexed.
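
In case it clarifies the setup, here is roughly what one pass of the job
does, stripped of the Spring Batch wiring (SolrJ; the core URLs, field
names and the enrichment query are made up):

    import org.apache.solr.client.solrj.SolrQuery;
    import org.apache.solr.client.solrj.SolrServer;
    import org.apache.solr.client.solrj.impl.CommonsHttpSolrServer;
    import org.apache.solr.client.solrj.response.QueryResponse;
    import org.apache.solr.common.SolrInputDocument;

    public class IndexingJobSketch {
        public static void main(String[] args) throws Exception {
            // Core we write to, and the read-only core used for enrichment
            SolrServer indexCore  = new CommonsHttpSolrServer("http://localhost:8983/solr/core1");
            SolrServer enrichCore = new CommonsHttpSolrServer("http://localhost:8983/solr/core2");

            for (int i = 0; i < 50000; i++) {
                SolrInputDocument doc = new SolrInputDocument();
                doc.addField("id", i);
                // ... ~300 fields read from the database ...

                // "Enrichment": a lookup against the second, read-only core
                QueryResponse rsp = enrichCore.query(new SolrQuery("ref_id:" + i));
                if (rsp.getResults().getNumFound() > 0) {
                    doc.addField("enriched_field",
                                 rsp.getResults().get(0).getFieldValue("some_value"));
                }

                indexCore.add(doc);
            }

            // Single commit+optimize at the very end of the job
            indexCore.commit();
            indexCore.optimize();
        }
    }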

I have seen recommendations ranging from committing+optimizing more
often to lowering the merge factor. How does the merge factor affect
this scenario?
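
If "committing more often" is indeed the way to go, I assume the change
would look something like this - same SolrJ classes as above, and the
batch size is just a guess, not a recommendation:

    import org.apache.solr.client.solrj.SolrServer;
    import org.apache.solr.common.SolrInputDocument;

    public class PeriodicCommitSketch {
        /** Commit every 'batch' documents instead of only once at the end. */
        static void indexWithPeriodicCommits(SolrServer indexCore,
                                             Iterable<SolrInputDocument> docs)
                throws Exception {
            final int batch = 10000; // arbitrary value
            int added = 0;
            for (SolrInputDocument doc : docs) {
                indexCore.add(doc);
                if (++added % batch == 0) {
                    // A periodic commit flushes the RAM buffer and lets Lucene
                    // merge and close segment files, bounding the open-file count
                    indexCore.commit();
                }
            }
            indexCore.commit();   // commit the tail of the batch
            indexCore.optimize(); // still a single optimize, at the very end
        }
    }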

Thanks,

Bruno
