Thanks! We enlarged the max heap size and it looks ok so far.
On Fri, Apr 9, 2010 at 4:23 AM, Lance Norskog wrote:
> Since the facet "cache" is hard-allocated and has no eviction policy,
> you could do a facet query on each core as part of the warm-up. This
> way, the facets will not fail. At that point, you can tune the Solr
> cache sizes.
Since the facet "cache" is hard-allocated and has no eviction policy,
you could do a facet query on each core as part of the warm-up. This
way, the facets will not fail. At that point, you can tune the Solr
cache sizes.
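A sketch of what such a warm-up entry could look like in solrconfig.xml on a
Solr 1.4 setup; the facet field name "category" is only a placeholder for one
of your real facet fields:

  <listener event="firstSearcher" class="solr.QuerySenderListener">
    <arr name="queries">
      <lst>
        <str name="q">*:*</str>
        <str name="rows">0</str>
        <str name="facet">true</str>
        <str name="facet.field">category</str>
      </lst>
    </arr>
  </listener>

The same block can be repeated for the "newSearcher" event so the facets stay
warm after commits.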
Solr caches documents, searches, and filter queries. Filter queries
are sets of document IDs.
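Those cache sizes are set in solrconfig.xml; a sketch with purely illustrative
numbers, not recommendations:

  <filterCache      class="solr.FastLRUCache" size="512" initialSize="512" autowarmCount="128"/>
  <queryResultCache class="solr.LRUCache"     size="512" initialSize="512" autowarmCount="32"/>
  <documentCache    class="solr.LRUCache"     size="512" initialSize="512"/>

On a ~90M-document core, a filter cached as a bitset is roughly
90,000,000 / 8 ≈ 11 MB, so filterCache sizing matters more than on a small index.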
I noticed now that the OutOfMemory exception occurs upon faceting queries.
Queries without facets do return successfully.
There are two types of log entries upon the exception. The queries causing
them differ only in the q parameter; the faceting and sorting parameters are
the same. I guess this has something to do with …
The queries do require sorting (on an int field) and faceting. They fetch the
first 200 docs.
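Roughly, the request shape is something like the line below; the host, core,
and field names here are placeholders, not our real schema:

  http://localhost:8983/solr/core0/select?q=...&start=0&rows=200&sort=price+asc&facet=true&facet.field=category&facet.field=brand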
The current problematic core has 10 entries in fieldCache and 5 entries in
filterCache. The other caches are empty. Is there any way to know how much
memory a specific cache takes?
The problem is that one core …
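The per-core cache statistics (entry counts, hit ratios, evictions) are on the
admin stats page, e.g. a URL like the one below with placeholder host and core
names; it does not report bytes, so memory use still has to be estimated from
the entry counts:

  http://localhost:8983/solr/core0/admin/stats.jsp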
Sorting takes memory. What data types are the fields sorted on? If
they're strings, that could be a space-eater. If they are ints or
dates, not a problem.
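A rough back-of-the-envelope for a ~90M-document core, assuming the Lucene
FieldCache that backs sorting (numbers are only approximate):

  int sort field:    90,000,000 docs x 4 bytes  ≈ 360 MB per sorted field
  string sort field: 90,000,000 x 4-byte ords plus every unique term string,
                     so noticeably more than that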
Do the queries pull all of the documents found? Or do they just fetch,
for example, the first 10 documents?
What are the cache statistics like?
Hi,
We are using Solr 1.4 with 2 cores, each containing ~90M documents. Each
core's index is ~120 GB on disk.
The machine is a 64-bit quad-core with 64 GB RAM, running Windows Server 2008.
The max heap size is set to 9 GB for the Tomcat process. Default caches are used.
Our queries are complex and in …
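For reference, the heap ceiling is raised through the JVM options of the
Tomcat process; the values below are only illustrative, not a recommendation:

  -Xms12g -Xmx12g

(On Windows, Tomcat running as a service usually takes these from the service
configuration rather than a startup script.)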