What exactly is the common practice here - is there a free, downloadable search component that enforces such a limit, or at least a "blueprint" for recommended best practice? What limit is common? (I know Google limits you to the top 1,000 results.)

-- Jack Krupansky

-----Original Message-----
From: Otis Gospodnetic
Sent: Saturday, December 08, 2012 7:25 AM
To: solr-user@lucene.apache.org
Subject: Re: star searches with high page number requests taking long times

Hi Robert,

You should just prevent deep paging. Humans with wallets don't page that
deep, so you won't lose anything by blocking it. It's common practice.
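
For illustration, one common way to enforce this is a cap in the web
layer, before the request ever reaches Solr. A minimal Java sketch (the
class name and the Google-style 1,000-result window are assumptions, not
anything Solr ships with):

    /** Sketch: cap deep paging in the application layer. */
    public final class PagingGuard {

        // Google-style window: never serve results past position 1,000.
        private static final int MAX_RESULT_WINDOW = 1000;

        private PagingGuard() {}

        /**
         * Clamps the requested start offset so that start + rows never
         * exceeds the window; pass the result to Solr's start parameter.
         */
        public static int clampStart(int requestedStart, int rows) {
            int maxStart = Math.max(0, MAX_RESULT_WINDOW - rows);
            return Math.min(Math.max(0, requestedStart), maxStart);
        }
    }

A request for page 9000 then falls back to the last page inside the
window (you could equally reject it outright), so a bot can crawl at
most 1,000 results.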

Otis
--
SOLR Performance Monitoring - http://sematext.com/spm
On Dec 7, 2012 8:10 PM, "Petersen, Robert" <rober...@buy.com> wrote:

Hi guys,


Sometimes we get a bot crawling the search function on our retail web
site. The eBay crawler loves to do this (Request.UserAgent: Terapeakbot):
it just runs a star search and then iterates through page after page. I've
noticed that when it gets to higher page numbers, like page 9000, the
searches take more than 20 seconds. Is this expected behavior? We're
requesting standard facets with the search, as well as incorporating
boosting by function query. Our index is almost 15 million docs now and
we're on Solr 3.6.1. This isn't causing any errors at the Solr layer, but
our web layer times out the search after 20 seconds and logs the
exception.
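
For context, deep paging is expensive by design: to serve page 9000,
Lucene has to score every matching document and keep a priority queue of
start + rows entries (here roughly 90,010), and the facet counts are
recomputed over the full match set on every request. A minimal SolrJ 3.x
sketch of the kind of query involved (the server URL and facet field are
assumptions for illustration):

    import java.net.MalformedURLException;

    import org.apache.solr.client.solrj.SolrQuery;
    import org.apache.solr.client.solrj.SolrServerException;
    import org.apache.solr.client.solrj.impl.CommonsHttpSolrServer;
    import org.apache.solr.client.solrj.response.QueryResponse;

    public class DeepPageDemo {
        public static void main(String[] args)
                throws MalformedURLException, SolrServerException {
            // Assumed server URL and facet field; substitute your own.
            CommonsHttpSolrServer server =
                new CommonsHttpSolrServer("http://localhost:8983/solr");

            SolrQuery q = new SolrQuery("*:*"); // the bot's star search
            q.setStart(90000);                  // page 9000 at 10 rows/page
            q.setRows(10);
            q.setFacet(true);
            q.addFacetField("category");

            // Solr must collect and sort the top 90,010 hits for this.
            QueryResponse rsp = server.query(q);
            System.out.println("QTime: " + rsp.getQTime() + " ms");
        }
    }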



Thanks

Robi


