I have been running load tests using JMeter against a Solr 1.4 index with ~4
million docs. I notice a steady JVM heap increase as I iterate 100 query
terms a number of times against the index. The GC does not seem to reclaim
the heap after the test run completes, and the JVM runs into an
OutOfMemoryError as I repeat the test or increase the number of
threads/users.

The date facet queries are specified as follows (in the "appends" section of
the request handler):
    <lst name="appends">
     <str name="facet.query">{!ex=last_modified}last_modified:[NOW-30DAY TO *]</str>
     <str name="facet.query">{!ex=last_modified}last_modified:[NOW-90DAY TO NOW-30DAY]</str>
     <str name="facet.query">{!ex=last_modified}last_modified:[NOW-180DAY TO NOW-90DAY]</str>
     <str name="facet.query">{!ex=last_modified}last_modified:[NOW-365DAY TO NOW-180DAY]</str>
     <str name="facet.query">{!ex=last_modified}last_modified:[NOW-730DAY TO NOW-365DAY]</str>
     <str name="facet.query">{!ex=last_modified}last_modified:[* TO NOW-730DAY]</str>
    </lst>

The last_modified field is a TrieDateField with a precisionStep of 6.
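For reference, the relevant schema.xml entries look roughly like this (paraphrasing from memory; the type name and attributes other than precisionStep may differ in my actual schema):

    <fieldType name="tdate" class="solr.TrieDateField"
               precisionStep="6" omitNorms="true"/>
    <field name="last_modified" type="tdate"
           indexed="true" stored="true"/>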

I have played with the filterCache settings, but they do not have any effect,
as the date field cache seems to be managed by the Lucene FieldCache.

Please help, as I have been struggling with this for days. Thanks in advance.
-- 
View this message in context: 
http://lucene.472066.n3.nabble.com/Date-faceting-and-memory-leaks-tp824372p824372.html
Sent from the Solr - User mailing list archive at Nabble.com.
