I am using Apache Solr 3.5.0.
My index is 4.2 GB on disk and holds about 30 M records of small
string and numeric fields:
categ | name | age | sex | addr | balance | min balance | interest | tax |
customertype
I am starting the Solr server with:
jdk1.6.0_21_64/bin/java -Xms512m -Xmx2048M -jar start.jar
I am running this query (SolrJ):

String queryString = "categ:000007";
SolrQuery query = new SolrQuery();
query.setQuery(queryString);
query.setRows(0);
query.setParam("stats", "on");
query.setParam("stats.facet", "customertype");
query.setParam("stats.field", "balance");
System.out.println("running query " + query + " with stats for balance");

QueryResponse rsp = server.query(query);
Map<String, FieldStatsInfo> statsInfoMap = rsp.getFieldStatsInfo();
FieldStatsInfo stats = statsInfoMap.get("balance");
System.out.println("sum " + stats.getSum() + " for " + stats.getCount()
        + " records.");
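For reference, the SolrJ calls above just serialize to an HTTP request. A minimal stdlib-only sketch of the URL they produce (the host, port, and /solr/select path are assumptions for a default standalone Solr; a real client would also URL-encode the colon in the query value):

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class StatsQueryUrl {
    // Builds the request URL equivalent to the SolrJ query above.
    // http://localhost:8983/solr/select is an assumed default endpoint.
    public static String buildUrl() {
        Map<String, String> params = new LinkedHashMap<String, String>();
        params.put("q", "categ:000007");          // the filter on categ
        params.put("rows", "0");                  // stats only, no documents
        params.put("stats", "on");                // enable the stats component
        params.put("stats.facet", "customertype");
        params.put("stats.field", "balance");
        StringBuilder url = new StringBuilder("http://localhost:8983/solr/select");
        char sep = '?';
        for (Map.Entry<String, String> e : params.entrySet()) {
            url.append(sep).append(e.getKey()).append('=').append(e.getValue());
            sep = '&';
        }
        return url.toString();
    }

    public static void main(String[] args) {
        // prints: http://localhost:8983/solr/select?q=categ:000007&rows=0&stats=on&stats.facet=customertype&stats.field=balance
        System.out.println(buildUrl());
    }
}
```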
These are the cache settings in solrconfig.xml:
<filterCache class="solr.FastLRUCache"
size="512"
initialSize="512"
autowarmCount="0"/>
<queryResultCache class="solr.LRUCache"
size="512"
initialSize="512"
autowarmCount="0"/>
<documentCache class="solr.LRUCache"
size="512"
initialSize="512"
autowarmCount="0"/>
The server throws an OutOfMemoryError and works only when Xmx is greater
than 4.2 GB, so Solr appears to load the entire index into memory. Only
40 records match categ:000007. What am I doing wrong?
--
View this message in context:
http://lucene.472066.n3.nabble.com/Solr-loads-entire-Index-into-Memory-tp3829178p3829178.html
Sent from the Solr - User mailing list archive at Nabble.com.