This search has up to 8000 records. Does this require a query cache of
8000 records? When is the query cache filled?

This answers a second question: the filter design is intended for small
result sets. I'm interested in selecting maybe 1/10 of a few million
records as a search limiter. Is it possible to create a similar feature
that caches low-level data areas for a query? Let's say that if the query
selects 1/10 of the document space, only 40% of the total memory area
contains data for that 1/10. Is there a cheap way to record this? Would
it be a filter-like feature that records a much lower-level data
structure, such as disk blocks?
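
For a sense of the cost involved: Solr's filterCache stores each cached filter as a bitset over the whole index, so the memory per filter depends on the total document count, not on how many documents match. A back-of-the-envelope sketch (the 3-million-document index size is an assumption for illustration, not a Solr internal):

```java
import java.util.BitSet;

public class FilterCost {
    public static void main(String[] args) {
        int maxDoc = 3_000_000;          // assumed total index size

        // A cached filter is conceptually one bit per document in the index.
        BitSet filter = new BitSet(maxDoc);

        // Mark roughly every 10th document as matching (~1/10 selectivity);
        // note the bitset's footprint does not shrink for sparse filters.
        for (int doc = 0; doc < maxDoc; doc += 10) {
            filter.set(doc);
        }

        long bytes = maxDoc / 8;         // one bit per document
        System.out.println("Approx. memory per cached filter: "
                + (bytes / 1024) + " KB");
    }
}
```

So even at 1/10 selectivity, a cached filter over a few million documents is only a few hundred kilobytes, which is why bitset filters stay practical on large indexes.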

Thanks,

Lance Norskog

-----Original Message-----
From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED]] On Behalf Of Yonik Seeley
Sent: Wednesday, October 24, 2007 8:24 PM
To: solr-user@lucene.apache.org
Subject: Re: My filters are not used

On 10/24/07, Norskog, Lance <[EMAIL PROTECTED]> wrote:
> I am creating a filter that is never used. Here is the query sequence:
>
> q=*:*&fq=contentid:00*&start=0&rows=200
>
> q=*:*&fq=contentid:00*&start=200&rows=200
>
> q=*:*&fq=contentid:00*&start=400&rows=200
>
> q=*:*&fq=contentid:00*&start=600&rows=200
>
> q=*:*&fq=contentid:00*&start=700&rows=200
>
> According to the statistics, here is my filter cache usage:
>
> lookups : 1
[...]
>
> I'm completely confused. I thought this should be 1 insert, 4 lookups,
> 4 hits, and a hit ratio of 100%.

Solr has a query cache too... the query result cache is checked first,
there's a hit, and the rest of the query processing is short-circuited,
so the filter cache is never consulted.
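
For reference, both caches are configured in solrconfig.xml. A minimal sketch of the relevant section (the sizes and autowarm counts here are illustrative, not recommendations):

```xml
<!-- solrconfig.xml (excerpt) -->
<query>
  <!-- filterCache: caches the document sets produced by fq parameters -->
  <filterCache class="solr.LRUCache"
               size="512" initialSize="512" autowarmCount="128"/>

  <!-- queryResultCache: caches ordered doc-id lists keyed on
       (query, filters, sort); it is checked before the query is
       executed, which is why a hit here short-circuits everything -->
  <queryResultCache class="solr.LRUCache"
                    size="512" initialSize="512" autowarmCount="32"/>

  <!-- each cached result stores a window of doc ids, so paging with
       start/rows inside that window is served from the cache -->
  <queryResultWindowSize>50</queryResultWindowSize>
</query>
```

With a queryResultWindowSize larger than the paged range, repeated pages of the same query can be served from the queryResultCache without touching the filterCache at all, matching the statistics above.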

-Yonik
