I have put a limit in the front end at a couple of sites: nobody gets more
than 50 pages of results. If someone asks for anything past that, show page 50.
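
Something like this in the request path, before the query ever reaches Solr
(a minimal sketch; the 20-row page size, the 50-page cap, and the class and
method names are just illustrative):

// Front-end clamp: never ask Solr for anything past page 50.
public final class PagingClamp {

  private static final int ROWS_PER_PAGE = 20;
  private static final int MAX_PAGE = 50;

  /** Clamp a user-requested page number into [1, MAX_PAGE]. */
  public static int clampPage(int requestedPage) {
    if (requestedPage < 1) {
      return 1;
    }
    return Math.min(requestedPage, MAX_PAGE);
  }

  /** Translate the clamped page into the start offset sent to Solr. */
  public static int startFor(int requestedPage) {
    return (clampPage(requestedPage) - 1) * ROWS_PER_PAGE;
  }
}

A request for page 7500 quietly becomes start=980&rows=20.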

First got hit by this at Netflix, years ago.

Solr 4 is much better about deep paging, but here at Chegg we were getting
deep paging plus a stupidly long query, and that was using too much CPU.
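
For what it is worth, the 4.x line also added cursor-based deep paging
(cursorMark, SolrJ 4.7+), which sidesteps big start offsets for legitimate
full-result walks. A rough SolrJ sketch, assuming a core at the usual
localhost URL and a uniqueKey field named "id":

// Walk the full result set with cursorMark instead of growing start offsets.
// Core name, field names, and URL are assumptions for the sketch.
import org.apache.solr.client.solrj.SolrQuery;
import org.apache.solr.client.solrj.SolrServerException;
import org.apache.solr.client.solrj.impl.HttpSolrServer;
import org.apache.solr.client.solrj.response.QueryResponse;
import org.apache.solr.common.params.CursorMarkParams;

public class CursorWalk {
  public static void main(String[] args) throws SolrServerException {
    HttpSolrServer solr = new HttpSolrServer("http://localhost:8983/solr/collection1");
    SolrQuery q = new SolrQuery("*:*");
    q.setRows(20);
    q.addSort("id", SolrQuery.ORDER.asc); // cursorMark needs a sort that includes the uniqueKey

    String cursor = CursorMarkParams.CURSOR_MARK_START; // "*"
    while (true) {
      q.set(CursorMarkParams.CURSOR_MARK_PARAM, cursor);
      QueryResponse rsp = solr.query(q);
      String next = rsp.getNextCursorMark();
      // ... process rsp.getResults() here ...
      if (cursor.equals(next)) {
        break; // cursor did not move, so we are done
      }
      cursor = next;
    }
  }
}

That does not help against hostile clients, of course, but it keeps your own
crawl and export jobs from looking like this attack.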

Right now, block the IPs. Those are hostile.
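
If you want to enforce it inside Solr instead, the query component Paul
suggests below only needs to reject oversized offsets in prepare(). A rough
sketch against the 4.x plugin API (class name, limits, and config wiring are
mine, not from any released plugin):

// Reject requests that page too deep before any searching happens.
package com.example.solr;

import java.io.IOException;

import org.apache.solr.common.SolrException;
import org.apache.solr.common.params.CommonParams;
import org.apache.solr.common.params.SolrParams;
import org.apache.solr.handler.component.ResponseBuilder;
import org.apache.solr.handler.component.SearchComponent;

public class PagingLimitComponent extends SearchComponent {

  // Hypothetical limits: 50 pages of 20 rows.
  private static final int MAX_START = 1000;
  private static final int MAX_ROWS = 100;

  @Override
  public void prepare(ResponseBuilder rb) throws IOException {
    SolrParams params = rb.req.getParams();
    int start = params.getInt(CommonParams.START, 0);
    int rows = params.getInt(CommonParams.ROWS, 10);
    if (start > MAX_START || rows > MAX_ROWS) {
      // Send a 400 back instead of burning CPU on the deep page.
      throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,
          "start/rows too large: start=" + start + ", rows=" + rows);
    }
  }

  @Override
  public void process(ResponseBuilder rb) throws IOException {
    // Nothing to do here; the check in prepare() is the whole component.
  }

  @Override
  public String getDescription() {
    return "Rejects requests that page too deep";
  }

  @Override
  public String getSource() {
    return null; // required by the 4.x SolrInfoMBean contract
  }
}

Register it as a searchComponent in solrconfig.xml, list it in the handler's
first-components, and the bad requests fail fast with a 400.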

wunder
Walter Underwood
wun...@wunderwood.org
http://observer.wunderwood.org/  (my blog)


> On Sep 21, 2015, at 10:31 AM, Paul Libbrecht <p...@hoplahup.net> wrote:
> 
> Writing a query component would be pretty easy, wouldn't it?
> It would throw an exception if crazy numbers are requested...
> 
> I can provide a simple example of a maven project for a query component.
> 
> Paul
> 
> 
> William Bell wrote:
>> We have some denial-of-service attacks on our web site. Solr threads are
>> going crazy.
>> 
>> Basically someone is hitting start=150000 and up, with rows=20. The start is
>> crazy large.
>> 
>> And then they jump around: start=150000, then start=213030, etc.
>> 
>> Any ideas for how to stop this besides blocking these IPs?
>> 
>> Sometimes it is Google doing it, even though these search result pages are
>> set to noindex and nofollow.
>> 
>> Thoughts? Ideas?
> 
