1> I don't know; where is it coming from? It looks like you've done a
stats call on a freshly opened server.
2> 512 entries (i.e. results for 512 queries). Each entry is a list of
doc IDs.
Best
Erick
On Fri, Aug 19, 2011 at 5:33 AM, jame vaalet wrote:
> 1 .what does this specify ?
>
> size="*${queryResultCa
1. What does this specify?
2. When I say queryResultCacheSize: 512, does it mean 512 queries can be
cached, or that 512 bytes are reserved for caching?
Can someone please give me an answer?
On 14 August 2011 21:41, Erick Erickson wrote:
Yep.
ResultWindowSize in solrconfig.xml
Best
Erick
my queryResultCache size = 0 and queryResultWindowSize = 50;
does this mean that I am not caching any results?
On 14 August 2011 18:27, Erick Erickson wrote:
As many results will be cached as you ask. See solrconfig.xml,
the queryResultCache. This cache is essentially a map of queries
and result document IDs. The number of doc IDs cached for
each query is controlled by queryResultWindowSize in
solrconfig.xml
Best
Erick
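Erick's point can be sketched as a solrconfig.xml fragment (the value 50 is just an example):

```xml
<!-- Round doc-ID collection up to a window of 50: a request for
     rows 0-9 also caches IDs 0-49, so later pages inside that
     window are answered from the queryResultCache. -->
<queryResultWindowSize>50</queryResultWindowSize>
```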
On Sun, Aug 14, 2011 at 8:35 AM, jame vaalet wrote:
thanks erick ... that means it depends upon the memory allocated to the
JVM.
going back to the queryResultCache factor, i have got this doubt ..
say, i have got 10 threads with 10 different queries .. and each of them
in parallel is searching the same index with millions of docs in it
(multisharded).
There isn't an "optimum" page size that I know of, it'll vary with lots of
stuff, not the least of which is whatever servlet container limits there are.
But I suspect you can get quite a few (1000s) without
too much problem, and you can always use the JSON response
writer to pack in more pages wit
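The JSON suggestion above amounts to switching the response writer on the request; a hypothetical query (host, core, and parameter values made up) would look like:

```
http://localhost:8983/solr/select?q=*:*&start=0&rows=1000&wt=json
```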
speaking about page sizes, what is the optimum page size that should be
retrieved each time?
i understand it depends upon the data you are fetching back from each hit
document ... but let's say whenever a document is hit i am fetching back 100
bytes worth of data from each of those docs in the indexes (alon
Jame:
You control the number via settings in solrconfig.xml, so it's
up to you.
Jonathan:
Hmmm, that seems right; after all, the "deep paging" penalty is really
about keeping a large sorted array in memory, but at least you only
pay it once per 10,000, rather than 100 times (assuming page siz
when you say queryResultCache, does it only cache n results for the
last query, or for more than one query?
On 10 August 2011 20:14, simon wrote:
To: solr-user@lucene.apache.org
Subject: Re: paging size in SOLR
Worth remembering there are some performance penalties with deep
paging, if you use the page-by-page approach. may not be too much of a
problem if you really are only looking to retrieve 10K docs.
-Simon
On Wed, Aug 10, 2011 at 10:32 AM, Erick Erickson wrote:
Well, if you really want to you can specify start=0 and rows=1 and
get them all back at once.
You can do page-by-page by incrementing the "start" parameter as you
indicated.
You can keep from re-executing the search by setting your queryResultCache
appropriately, but this affects all searches.
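Erick's page-by-page scheme keeps rows fixed and steps the start offset; a hypothetical request sequence (host and query made up) would be:

```
http://localhost:8983/solr/select?q=foo&start=0&rows=100
http://localhost:8983/solr/select?q=foo&start=100&rows=100
http://localhost:8983/solr/select?q=foo&start=200&rows=100
```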