Many people make big mistakes with rate limiting in Nginx and end up blocking themselves and legitimate users. Most of the time it is because the rate they set is far too low for WordPress or other CMS-driven sites.
That rate rule you have below is recommended in all sorts of tutorials and guides on configuring Nginx. The problem is that none of the people who wrote those tutorials ever tested that rate; they simply copied and pasted the "example" straight from the Nginx docs. If they had actually tested it, they would quickly have realized that a rate like that will ensure you get no visitors unless all your sites are plain HTML with few images.

Just one person requesting a WordPress site from your server can tally up 50+ requests for the front page alone: images, resources, style sheets, plugin files and so on. Then that person clicks through 2-5 pages within your site and he will get blocked with the rate you have below. Think about how many requests it takes to open a single WordPress page, then add 20 plugins to that installation and 25 images on the front page, and do the math.

You must test things like this thoroughly and monitor your logs extensively. Finding this sweet spot took me a dedicated few hours, and then a few more days to make sure it was working 100%. These rate rules:

    limit_req_zone  $ratelimited zone=flood:50m rate=90r/s;
    limit_conn_zone $ratelimited zone=addr:50m;

have been thoroughly tested and monitored over several months. They have proven reliable for WordPress sites, and the moment a true DDoS attempt comes along the limit kicks in and does its job. I have this rate limiting rule in effect on a live server running 37 WordPress sites, one of them a very busy site that gets 60,000 - 80,000 visitors a day, and not one visitor gets rate limited. Only about once a week some bot comes along and starts hammering a site, and I can see it immediately in my Munin stats; Nginx just keeps running happily and rate limits the offender. Normal visitors, crawlers and spiders are not affected at all.
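Mitchell shows the zone definitions but not how the $ratelimited variable is built, nor where the zones are applied. A minimal sketch of that kind of setup, assuming the common geo/map whitelist pattern, with illustrative (untested) burst and connection values:

```nginx
# Hypothetical sketch -- the geo/map blocks and the burst/connection
# numbers are assumptions, not Mitchell's actual values.

# Addresses listed here map to 0 (whitelisted); everyone else to 1.
geo $limited {
    default        1;
    127.0.0.1      0;   # the server's own IP, excluded from limiting
    203.0.113.10   0;   # placeholder for your public server IP
}

# Whitelisted clients get an empty key; nginx does not count requests
# with an empty key, so they are never rate limited.
map $limited $ratelimited {
    0  "";
    1  $binary_remote_addr;
}

# The tested zones from the message above:
limit_req_zone  $ratelimited zone=flood:50m rate=90r/s;
limit_conn_zone $ratelimited zone=addr:50m;

server {
    # ... listen, server_name, etc. ...

    # Apply the zones; burst sized to absorb the asset storm of a
    # single WordPress page load (illustrative value).
    limit_req  zone=flood burst=100 nodelay;
    limit_conn addr 50;
}
```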
Also make sure you have an effective whitelist rule for the rate limiting, with your own server IP specifically excluded from any limiting, because otherwise you will quickly find yourself in hot water with your own IP being rate limited. Nginx just does what it is told to do.

KR
Mitchell

From: Grant <[email protected]>
Date: 08 September 2016 at 11:13:48 PM
To: [email protected] <[email protected]>
Subject: [Fail2ban-users] nginx-limit-req config

I set up limit-req on nginx but I ended up dropping requests from a lot of legitimate users, although very few were banned. I was using this config, which is shown in the example in the config file distributed by Gentoo with fail2ban:

    limit_req_zone $binary_remote_addr zone=one:10m rate=1r/s;
    limit_req zone=one burst=1 nodelay;

I'm going to try increasing the burst value to 5, but has anyone found a config for this that seems to work well? I'm only limiting requests for pages.

- Grant

------------------------------------------------------------------------------
_______________________________________________
Fail2ban-users mailing list
[email protected]
https://lists.sourceforge.net/lists/listinfo/fail2ban-users
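To see why Grant's settings drop legitimate users, it helps to run the numbers Mitchell alludes to. This assumes the limit is applied to every request (Grant limits only page requests, but the same arithmetic then applies per click-through):

```nginx
# Grant's config from the Gentoo fail2ban example:
limit_req_zone $binary_remote_addr zone=one:10m rate=1r/s;
limit_req zone=one burst=1 nodelay;

# At rate=1r/s with burst=1, the instantaneous capacity per client
# is roughly 2 requests: one filling the rate slot, one filling the
# single burst slot. A WordPress page pulling ~50 assets arrives as
# a near-simultaneous burst, so the remaining ~48 requests are
# rejected with 503 before the page has even rendered.
```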
