I would imagine the performance penalties with deep paging will ALSO be there
if you just ask for 10,000 rows all at once, though, instead of in, say,
100-row paged batches. Yes? No?
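
A minimal sketch of the single big request, for comparison. It assumes a
local Solr with the stock select handler and the JSON response writer; the
URL and the match-all query are placeholders, not anything from this thread:

    import json
    import urllib.parse
    import urllib.request

    # Hypothetical endpoint; adjust host/core/query for a real deployment.
    SOLR = "http://localhost:8983/solr/select"

    # One big request: Solr still has to collect and rank the top 10,000
    # docs, but it pays that cost once. With page-by-page access, the
    # request at a given offset has to collect all (start + rows) docs
    # again, so the total work grows with page depth.
    params = urllib.parse.urlencode(
        {"q": "*:*", "start": 0, "rows": 10000, "wt": "json"})
    with urllib.request.urlopen(SOLR + "?" + params) as resp:
        data = json.load(resp)
    print(len(data["response"]["docs"]))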

-----Original Message-----
From: simon [mailto:mtnes...@gmail.com] 
Sent: Wednesday, August 10, 2011 10:44 AM
To: solr-user@lucene.apache.org
Subject: Re: paging size in SOLR

Worth remembering that there are some performance penalties with deep
paging if you use the page-by-page approach: each request has to collect
and rank all start + rows documents, not just the page it returns. May
not be too much of a problem if you really are only looking to retrieve
10K docs.

-Simon

On Wed, Aug 10, 2011 at 10:32 AM, Erick Erickson
<erickerick...@gmail.com> wrote:
> Well, if you really want to, you can specify start=0 and rows=10000 and
> get them all back at once.
>
> You can do page-by-page by incrementing the "start" parameter as you
> indicated.
>
> You can keep from re-executing the search by setting your queryResultCache
> appropriately, but this affects all searches, so it might be an issue.
>
> Best
> Erick
>
> On Wed, Aug 10, 2011 at 9:09 AM, jame vaalet <jamevaa...@gmail.com> wrote:
>> Hi,
>> I want to retrieve all the data from Solr (say 10,000 ids) and my page size
>> is 1,000.
>> How do I get the data back one page after another? Do I have to increment
>> the "start" value each time by the page size, beginning from 0, and iterate?
>> In that case, am I querying the index 10 times instead of once, or will the
>> result be cached somewhere after the first query for the subsequent pages?
>>
>>
>> JAME VAALET
>>
>
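
For reference, a minimal sketch of the page-by-page loop jame and Erick
describe above: increment "start" by the page size, beginning at 0. The
endpoint and query are placeholders, and the queryResultCache remarks in
the comments assume the standard solrconfig.xml settings:

    import json
    import urllib.parse
    import urllib.request

    SOLR = "http://localhost:8983/solr/select"  # hypothetical endpoint
    PAGE_SIZE = 1000
    TOTAL = 10000

    def fetch_page(start):
        # Each call is a separate query against the index. Whether later
        # pages are served from the queryResultCache depends on
        # queryResultWindowSize in solrconfig.xml: the cache stores roughly
        # a window's worth of doc ids per query, so a window smaller than
        # "start + rows" means the query is re-executed for that page.
        params = urllib.parse.urlencode(
            {"q": "*:*", "start": start, "rows": PAGE_SIZE, "wt": "json"})
        with urllib.request.urlopen(SOLR + "?" + params) as resp:
            return json.load(resp)["response"]["docs"]

    docs = []
    for start in range(0, TOTAL, PAGE_SIZE):  # 10 requests of 1,000 docs
        docs.extend(fetch_page(start))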
