: Are the two problems related? Looking through the mailing list it seems : that changing the setting for useCompoundFile from false to true could : help, but before I do that I would like to understand if there are : undesirable side effects. Why isn't this param set to true by : default?
"Too Many Open Files" can result from lots of different possible causes. One is that you have so many indexed fields with norms that the number of files in your index is too big -- that's the use case where useCompoundFile=true can help you -- but it's not set that way by default because it can make searching slower.

The other reason you can get too many open files is if you are getting more concurrent requests than you can handle -- or if the clients initiating those requests aren't closing them properly (sockets count as files too).

Understanding why you are getting these errors requires that you look at what your hard and soft file limits are (ulimit -aH and ulimit -aS on my system) and what files are in use by Solr when these errors occur (lsof -p _solrpid_).

To answer your earlier question: i *think* you may be getting the lock timeout errors because Solr can't access the lock file, because it can't open any more files ... i'm not 100% sure.

-Hoss
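For reference, the compound file setting lives in solrconfig.xml. A rough sketch of what toggling it looks like (element names match stock Solr 1.x configs, but check your own solrconfig.xml, since section names have varied between releases):

```xml
<!-- in solrconfig.xml -->
<indexDefaults>
  <!-- true = pack the many per-segment index files into a single
       .cfs file per segment, cutting the open-file count at some
       cost in search speed; false is the shipped default -->
  <useCompoundFile>true</useCompoundFile>
</indexDefaults>
```

Note that the setting only affects newly written segments; existing segments keep their format until they are merged or optimized.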
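The ulimit/lsof checks above can be wrapped in a small script. This is just a convenience sketch (the pid argument and the /proc fallback are my additions; `lsof -p` is what Hoss suggests where lsof is installed):

```shell
#!/bin/sh
# Compare a process's open-file count against the shell's file limits.
# Usage: sh checkfds.sh <solr_pid>   (defaults to this shell's pid)
PID=${1:-$$}

# per-process open-file limits (ulimit -aS / -aH print all limits)
echo "soft open-file limit: $(ulimit -Sn)"
echo "hard open-file limit: $(ulimit -Hn)"

# count descriptors via /proc; on systems with lsof,
# `lsof -p $PID | wc -l` gives a similar (slightly larger) number
OPEN=$(ls "/proc/$PID/fd" 2>/dev/null | wc -l)
echo "open files for pid $PID: $OPEN"
```

If the open count is creeping toward the soft limit under load, the problem is more likely unclosed client connections than index file count.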