Hi Grant

I have a number of sites that are quite image-heavy and employ a number of 
plugins, so during my testing of rate limiting I actually kept finding myself 
being limited. This happened especially when working on a WordPress site in the 
backend and then testing it on the front end, which normally involves refreshing 
just one page over and over until you get a CSS change or something the way you 
want it. 

So over a period of hours I arrived at the rates I have below, which never lock 
me out and allow me or anyone else to author away and test their WordPress site 
as much as they want without ever being limited. The same applies to multiple 
people viewing your site through a proxy / single IP.

While my rates may seem high, they work exceptionally well for WordPress sites, 
and they really do block bad bots immediately. 

I also have another level of blocking bots and referers in Nginx using a 
different rate limiting zone. You can use that script to rate limit specific 
bots and user agents, i.e. a search engine like Baidu / Yandex which you want to 
allow to index your sites but don't want to go crazy. You can check the 
script out here - 
https://github.com/mitchellkrogza/nginx-ultimate-bad-bot-blocker (I've been 
working on that for 4 months; it was built from the ground up and I only 
released the first public version a few days ago. You will see from the commits 
that it is updated almost daily.)
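The idea of a separate zone for throttling specific crawlers can be sketched roughly like this (the zone name, size, and 10r/m rate here are illustrative assumptions, not the script's actual values):

```nginx
# In the http block: map known crawler user agents to a rate-limit key.
# $slow_bot stays empty for everyone else, and nginx never counts
# requests with an empty key against the zone.
map $http_user_agent $slow_bot {
    default         "";
    ~*baiduspider   $binary_remote_addr;
    ~*yandexbot     $binary_remote_addr;
}

limit_req_zone $slow_bot zone=slowbots:10m rate=10r/m;

# In the relevant server/location block: allow a small burst, then throttle.
limit_req zone=slowbots burst=5;
```

Because only the mapped user agents get a non-empty key, normal visitors pass through this zone untouched while the named crawlers are slowed to a crawl-friendly rate.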

The rate limiting zones you and I have been discussing here are all included in 
that script.

I also do a daily "grep -E 'limiting requests' /var/log/nginx/*.log" to keep a 
check on things, and haven't had one true visitor blocked out yet, only the 
naughty ones.
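If you want that daily check to also tally which IPs are being limited, the grep can be extended into a small pipeline. The log format below is my assumption based on stock nginx error-log output, and the sample log is only there to make the sketch self-contained; on a real server you would point it at /var/log/nginx/*.log instead:

```shell
#!/bin/sh
# Build a small sample error log so this sketch runs anywhere.
LOG=$(mktemp)
cat > "$LOG" <<'EOF'
2016/09/12 10:00:01 [error] 123#123: *1 limiting requests, excess: 200.5 by zone "flood", client: 203.0.113.9, server: example.com
2016/09/12 10:00:02 [error] 123#123: *2 limiting requests, excess: 201.0 by zone "flood", client: 203.0.113.9, server: example.com
2016/09/12 10:05:00 [error] 123#123: *3 open() failed, client: 198.51.100.7, server: example.com
EOF

# Count how many times each client IP was rate limited:
# keep only "limiting requests" lines, pull out the client IP,
# then count and sort by frequency.
grep -E 'limiting requests' "$LOG" \
  | grep -oE 'client: [0-9.]+' \
  | sort | uniq -c | sort -rn

rm -f "$LOG"
```

That prints one line per limited IP with its hit count, which makes it easy to spot a naughty bot hammering one zone.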

KR
Mitchell




From: Grant <[email protected]>
Date: 12 September 2016 at 10:32:30 PM
To: Mitchell Krog Photography <[email protected]>
Cc: [email protected] <[email protected]>
Subject:  Re: [Fail2ban-users] nginx-limit-req config  

> IN SITE CONF FILE:  
> -----------------------------  
> limit_conn addr 200;  
> limit_req zone=flood burst=200 nodelay;  
>  
> IN NGINX CONF FILE:  
> -----------------------------  
> limit_req_zone $ratelimited zone=flood:50m rate=90r/s;  
> limit_conn_zone $ratelimited zone=addr:50m;  


I see that your limit_req and limit_conn config matches. That makes  
sense to me. Why would someone configure them differently?  

Your allowed rate and burst values are much higher than mine. Is this  
because each page request on your site includes a lot of extraneous  
requests (images, etc)? I'm concerned that my low limits work fine  
for when there is a 1:1 correlation between IP addresses and humans,  
but won't when multiple people are simultaneously browsing my site  
behind a single IP.  

- Grant  
------------------------------------------------------------------------------
_______________________________________________
Fail2ban-users mailing list
[email protected]
https://lists.sourceforge.net/lists/listinfo/fail2ban-users
