On 3/4/2014 2:23 AM, Angel Tchorbadjiiski wrote:
in the last couple of weeks one of my machines has been experiencing OutOfMemoryError: Java heap space errors. Within a couple of hours after starting the SOLR instance, queries with execution times of under 100ms need more than 10s to execute, and many Java heap space errors appear in the logs. I would be grateful for any hints on how to localize/fix the problem.


The system is a single-shard SOLR 4.6.1 instance with two cores in the default Jetty container. CORE1 has ~45M documents (32GB on disk) with 40 fields (all stored and indexed). CORE2 has ~20M documents (18GB on disk) with 60 fields (also all stored and indexed). All fields are relatively short, with a maximum length of 100 characters. In both cores, 20 of the fields are used for faceting, have a cache-populating query on the "newSearcher" event, and the following cache configurations:
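For reference, that kind of "newSearcher" warming setup is typically configured in solrconfig.xml along these lines (an illustrative sketch only; the query, facet field names, and cache sizes below are assumptions, not the poster's actual configuration):

```xml
<!-- solrconfig.xml: fire warming queries whenever a new searcher opens -->
<listener event="newSearcher" class="solr.QuerySenderListener">
  <arr name="queries">
    <lst>
      <str name="q">*:*</str>
      <str name="facet">true</str>
      <!-- hypothetical facet field names -->
      <str name="facet.field">category</str>
      <str name="facet.field">brand</str>
    </lst>
  </arr>
</listener>

<!-- example filterCache sizing; the values here are placeholders -->
<filterCache class="solr.FastLRUCache"
             size="512" initialSize="512" autowarmCount="128"/>
```

With 20 such facet fields warmed per core, each commit re-populates a large amount of cache and FieldCache data, which is relevant to the heap pressure discussed below.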

It may be your facets that are killing you here. As Toke mentioned, you have not indicated what your max heap is. 20 separate facet fields across millions of documents will use a lot of FieldCache memory if you use the standard facet.method, fc.

Try adding facet.method=enum to all your facet queries, or you can put it in the defaults section of each request handler definition. Alternatively, you can add docValues to all your facet fields, but that will require a full reindex.
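Concretely, the two options look roughly like this (a hedged sketch; the handler and field names are hypothetical). The first fragment sets facet.method=enum in a request handler's defaults in solrconfig.xml; the second enables docValues on a facet field in schema.xml, which takes effect only after a full reindex:

```xml
<!-- solrconfig.xml: make enum the default facet method for this handler -->
<requestHandler name="/select" class="solr.SearchHandler">
  <lst name="defaults">
    <str name="facet.method">enum</str>
  </lst>
</requestHandler>

<!-- schema.xml: docValues on a facet field (hypothetical field name);
     requires a full reindex to take effect -->
<field name="category" type="string" indexed="true" stored="true"
       docValues="true"/>
```

The enum method can also be passed per request, e.g. &facet=true&facet.field=category&facet.method=enum, which avoids building an uninverted FieldCache entry for that field at the cost of iterating term filters.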

http://wiki.apache.org/solr/SolrPerformanceProblems#Java_Heap

Thanks,
Shawn
