Hi,

I am getting a "too many open files" error.

Usually I test on a server that has 4GB RAM, with 1GB assigned to
Tomcat (set JAVA_OPTS=-Xms256m -Xmx1024m). ulimit -n is 256 on this
server, and solrconfig.xml has the following settings:

    <useCompoundFile>true</useCompoundFile>
    <ramBufferSizeMB>1024</ramBufferSizeMB>
    <mergeFactor>100</mergeFactor>
    <maxMergeDocs>2147483647</maxMergeDocs>
    <maxFieldLength>10000</maxFieldLength>

In my case, 200,000 documents come to about 1024MB, and in this test I
am indexing a total of one million documents (roughly 5GB at that
rate). We use these high settings because we expect to index 10+
million records in production. It works fine on this server.

When I deploy the same Solr configuration on a server with 32GB RAM, I
get the "too many open files" error. ulimit -n is 1024 on that server.
Any idea why? Is it because the second server has 32GB RAM? Is an
open-files limit of 1024 too low? Also, I can't find any documentation
for <ramBufferSizeMB>; I checked the 'Solr 1.4 Enterprise Search
Server' book, the wiki, etc. I am using Solr 1.3.
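
One thought: could our mergeFactor of 100 be part of the problem? As I
understand it, a higher mergeFactor lets more segments accumulate
before a merge, and each segment keeps files open, so the open-file
count grows with it. If that is right, a more conservative variant of
our settings might look like the following (10 is just the default I
have seen in example solrconfig.xml files, not something I have
tested):

    <useCompoundFile>true</useCompoundFile>
    <!-- fewer segments accumulate before merging, so fewer open files -->
    <mergeFactor>10</mergeFactor>

Please correct me if I have the mechanics of mergeFactor wrong.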


Is it a good idea to use ramBufferSizeMB, versus maxBufferedDocs? What
exactly does ramBufferSizeMB mean? My understanding is that newly added
documents are buffered in memory, and once that buffer reaches 1024MB
(ramBufferSizeMB), the data is flushed to disk. Or is it that the data
is flushed when total memory used (by Tomcat, etc.) reaches 1024MB?
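
In other words, I am trying to decide between two flush triggers in
solrconfig.xml, something like the following (the values here are only
illustrative, not what we run):

    <!-- flush once buffered documents take roughly this much RAM -->
    <ramBufferSizeMB>32</ramBufferSizeMB>

    <!-- or: flush after this many buffered documents, regardless of size -->
    <maxBufferedDocs>1000</maxBufferedDocs>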


Thanks,

Sharmila