We occasionally get out-of-memory errors during the day. I know data imports 
can cause this, but none are running. The wiki notes that document adds have 
some quirks, and we aren't doing that either. I don't know what to expect 
for memory use, though.

We have Solr running under Tomcat with the heap set to 2 GB. I presume cache 
size affects memory use; ours is set to 30,000 entries each for the filter, 
document, and queryResult caches. We have experimented with different sizes 
for a while, and these limits are all lower than we used to have them. So I'm 
hoping there is no memory leak involved.
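For context, these cache settings live in solrconfig.xml. A sketch of what ours looks like (only the size of 30,000 matches what we actually use; the cache classes and autowarm counts shown here are illustrative):

```xml
<!-- solrconfig.xml (excerpt) - sizes match the 30,000 mentioned above;
     class and autowarmCount values are illustrative, not our exact config -->
<filterCache      class="solr.FastLRUCache" size="30000" initialSize="30000" autowarmCount="0"/>
<queryResultCache class="solr.LRUCache"     size="30000" initialSize="30000" autowarmCount="0"/>
<documentCache    class="solr.LRUCache"     size="30000" initialSize="30000" autowarmCount="0"/>
```

Note that the filterCache in particular can be heavy: each entry can hold a bitset sized to the index, so 30,000 entries may account for a large share of the heap on a big index.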

In any case, some of the messages are:

Exception in thread "http-8080-21" java.lang.OutOfMemoryError: Java heap space


Some look like this:

Exception in thread "http-8080-22" java.lang.NullPointerException
        at 
java.util.concurrent.ConcurrentLinkedQueue.offer(ConcurrentLinkedQueue.java:273)
...

I presume the NullPointerException is a consequence of the JVM already being 
out of memory.

Could Solr legitimately need more than 2 GB? What else can we tune to reduce 
memory usage?
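For reference, the 2 GB heap is set through Tomcat's JVM options, roughly like the sketch below (the file location varies by install, and the heap-dump flags are just something we could add to diagnose what fills the heap, not something we run today):

```shell
# setenv.sh (or JAVA_OPTS in catalina.sh / the init script, depending on install)
# -Xmx2048m matches the 2 GB heap mentioned above; the dump flags are optional
# additions that write a heap dump for analysis when an OutOfMemoryError occurs
JAVA_OPTS="$JAVA_OPTS -Xms512m -Xmx2048m"
JAVA_OPTS="$JAVA_OPTS -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/var/log/tomcat"
export JAVA_OPTS
```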
