I'm indexing a set of 500,000 small documents, adding them in batches of 1000. At first I had a setup that optimized the index every 10,000 documents, but I quickly had to switch to optimizing after each batch. Unfortunately, I'm still getting the "Too many open files" IO error on optimize. I went from a mergeFactor of 25 down to 10, but I'm still unable to optimize the index.
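
For context, my indexing loop looks roughly like the sketch below (a minimal SolrJ illustration only; the URL, field names and document contents are placeholders, not my actual code):

    import java.util.ArrayList;
    import java.util.List;
    import org.apache.solr.client.solrj.SolrServer;
    import org.apache.solr.client.solrj.impl.CommonsHttpSolrServer;
    import org.apache.solr.common.SolrInputDocument;

    public class BatchIndexer {
        public static void main(String[] args) throws Exception {
            // Assumed Solr URL; the real instance may differ.
            SolrServer server = new CommonsHttpSolrServer("http://localhost:8983/solr");
            List<SolrInputDocument> batch = new ArrayList<SolrInputDocument>();
            for (int i = 0; i < 500000; i++) {
                SolrInputDocument doc = new SolrInputDocument();
                doc.addField("id", Integer.toString(i));   // assumed schema fields
                doc.addField("text", "small document " + i);
                batch.add(doc);
                if (batch.size() == 1000) {                // batches of 1000
                    server.add(batch);
                    server.commit();
                    server.optimize();                     // optimize after each batch
                    batch.clear();
                }
            }
        }
    }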

I have this configuration:
    <useCompoundFile>false</useCompoundFile>
    <ramBufferSizeMB>256</ramBufferSizeMB>
    <mergeFactor>2</mergeFactor>
    <maxMergeDocs>2147483647</maxMergeDocs>
    <maxFieldLength>10000</maxFieldLength>
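
(For reference, my understanding is that these settings map roughly to the following Lucene IndexWriter calls; this is only a sketch of the mapping, Solr applies them internally, and the index path is a placeholder:)

    import org.apache.lucene.analysis.standard.StandardAnalyzer;
    import org.apache.lucene.index.IndexWriter;
    import org.apache.lucene.store.FSDirectory;

    public class WriterSettings {
        public static void main(String[] args) throws Exception {
            // Rough Lucene-level equivalent of the solrconfig.xml values above.
            IndexWriter writer = new IndexWriter(
                FSDirectory.getDirectory("/path/to/index"),   // assumed path
                new StandardAnalyzer(),
                true,
                IndexWriter.MaxFieldLength.LIMITED);
            writer.setUseCompoundFile(false);   // each segment kept as many separate files
            writer.setRAMBufferSizeMB(256);
            writer.setMergeFactor(2);
            writer.setMaxMergeDocs(Integer.MAX_VALUE);
            writer.setMaxFieldLength(10000);
            writer.close();
        }
    }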

The machine (2-core AMD64, 4 GB RAM) is running Debian Linux, Java is 1.6.0_11 64-bit, and Solr is a nightly build (2009-04-02). And no, I cannot raise the file descriptor limit (currently 1024). What else can I do?
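
For what it's worth, a quick way to see how many files the index directory holds before an optimize (a sketch; the data directory path is an assumption):

    import java.io.File;

    public class CountIndexFiles {
        public static void main(String[] args) {
            // Assumed Solr data directory; adjust to the real location.
            File indexDir = new File("/var/lib/solr/data/index");
            File[] files = indexDir.listFiles();
            int count = (files == null) ? 0 : files.length;
            // With useCompoundFile=false every segment contributes several files,
            // and optimize has old and new segments open at once, so this number
            // gives a feel for how close the process gets to the 1024 limit.
            System.out.println("files in index dir: " + count);
        }
    }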

--
We read Knuth so you don't have to. - Tim Peters

Jarek Zgoda, R&D, Redefine
jarek.zg...@redefine.pl
