Hi,

Without knowing the details, I suspect 1.5 GB of heap is simply not enough.
Yes, sorting will use your heap, as will the various Solr caches.  Norms eat
heap too, so double-check your schema and use field types like string instead
of text wherever you can.  If you sort on timestamp-like fields, reduce their
granularity as much as possible; Lucene's sort cache holds every unique value
in the field, so a day-granularity date is far cheaper to sort on than a
millisecond one.
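
For example, in schema.xml (an untested sketch; the field names are made up,
syntax as in Solr 1.3):

  <!-- string fields are not tokenized and omit norms, so they cost much less
       heap than text fields when you don't need full-text matching -->
  <fieldType name="string" class="solr.StrField" sortMissingLast="true" omitNorms="true"/>
  <fieldType name="date"   class="solr.DateField" sortMissingLast="true" omitNorms="true"/>

  <!-- index timestamps rounded to day granularity, e.g. 2008-10-21T00:00:00Z,
       so the sort cache holds at most one unique term per day -->
  <field name="created_day" type="date" indexed="true" stored="true"/>

Since you mention cache sizes below, shrinking them in solrconfig.xml is
another lever (again just a sketch -- tune the numbers against your own load):

  <filterCache      class="solr.LRUCache" size="256" initialSize="256" autowarmCount="32"/>
  <queryResultCache class="solr.LRUCache" size="256" initialSize="256" autowarmCount="32"/>
  <documentCache    class="solr.LRUCache" size="256" initialSize="256" autowarmCount="0"/>
  <queryResultMaxDocsCached>50</queryResultMaxDocsCached>

If that's still not enough, give the JVM more heap (e.g. java -Xmx2g -jar
start.jar) or start thinking about sharding the index.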


Otis
--
Sematext -- http://sematext.com/ -- Lucene - Solr - Nutch



----- Original Message ----
> From: Willie Wong <[EMAIL PROTECTED]>
> To: solr-user@lucene.apache.org
> Sent: Tuesday, October 21, 2008 9:48:14 PM
> Subject: Out of Memory Errors
> 
> Hello,
> 
> I've been having issues with out of memory errors on searches in Solr.  I
> was wondering if I'm hitting a limit with Solr or if I've configured
> something seriously wrong.
> 
> Solr Setup
> - 3 cores 
> - 3,163,615 documents each
> - 10 GB index size
> - approx 10 fields
> - document sizes vary from a few KB to a few MB
> - no faceting is used; however, search queries can be fairly complex, with
> 8 or more fields searched at once
> 
> Environment:
> - windows 2003
> - 2.8 GHz Xeon processor
> - 1.5 GB memory assigned to solr
> - Jetty 6 server
> 
> Once we get to more than a few concurrent users, OOM errors start occurring
> and Jetty restarts.  Would this just be a case of needing more memory, or
> are there configuration settings that should be tuned?  We're using an
> out-of-the-box Solr 1.3 beta.
> 
> A few of the things we considered that might help:
> - Removing sorts on the result sets (result sets are approx 40,000+
> documents)
> - Reducing cache sizes, e.g. queryResultMaxDocsCached, documentCache,
> queryResultCache, filterCache, etc.
> 
> Am I missing anything else that should be looked at, or is it time to simply
> increase the memory or start distributing the indexes?  Any help would be
> much appreciated.
> 
> 
> Regards,
> 
> WW
