On Dec 24, 2009, at 1:51 PM, Walter Underwood wrote:

> Some bots will do that, too. Maybe badly written ones, but we saw that at 
> Netflix. It was causing search timeouts just before a peak traffic period, so 
> we set a page limit in the front end, something like 200 pages.
> 
> It makes sense for that to be very slow, because a request for hit 28838540 
> means that Solr has to calculate the relevance for 28838540 + 10 documents.
> 
> Fuad: Why are you benchmarking this? What user is looking at 20M documents? 
> 

20M may be a bit much, but 500K - 1M is not out of the realm for clients that
do downstream analysis of the result set rather than paging interactively.
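The cost Walter describes can be sketched roughly: to serve the page of hits [start, start+rows), a result collector has to track the top start+rows scored documents, not just the rows it returns. This is a minimal, hypothetical Python illustration of that principle (a heap-based top-k collector, not Solr's actual implementation):

```python
import heapq
import random

def page_of_hits(scores, start, rows):
    # To return hits [start, start + rows), the collector must keep
    # the top (start + rows) documents by score -- deep offsets make
    # this buffer huge, which is why start=28838540 is so expensive.
    top = heapq.nlargest(start + rows, scores)
    return top[start:start + rows]

random.seed(42)
scores = [random.random() for _ in range(100_000)]

# Page 1 needs a buffer of only 10 entries.
page1 = page_of_hits(scores, start=0, rows=10)

# A deep page needs a buffer of start + rows entries -- 5,000x larger
# here for the same 10 results; at 28.8M hits it is correspondingly worse.
deep = page_of_hits(scores, start=50_000, rows=10)

assert page1 == sorted(scores, reverse=True)[:10]
assert len(deep) == 10
```

A front-end page cap like the 200-page limit mentioned above avoids this by bounding start before the query ever reaches the search tier.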
