On 2016-07-14 8:07 PM, Mohd Zainal Abidin wrote:
> How to block this kind of attack?
>
> 27.111.213.117 - - [15/Jul/2016:10:03:32 +0800] "GET /2014/07/ HTTP/1.1" 200 70977 "-" "Mozilla/4.0 (compatible;)"
> 27.111.213.117 - - [15/Jul/2016:10:03:27 +0800] "GET /2007/05/ HTTP/1.1" 200 62797 "-" "Mozilla/4.0 (compatible;)"
> 27.111.213.117 - - [15/Jul/2016:10:03:33 +0800] "GET /2014/06/ HTTP/1.1" 200 72461 "-" "Mozilla/4.0 (compatible;)"
> ...
> We are getting this kind of attack from different IPs since last night. Our website load goes to 100 and it becomes slow to respond.
What would you propose the fail2ban rules be? What makes this access pattern something to ban, and how is it different from valid usage? Are those paths invalid? Your web server says they are 200 OK. Is "Mozilla/4.0 (compatible;)" not a legitimate user agent? Are they requesting too fast, so you want to rate-limit them? I don't see a bunch of failed authentication attempts or requests for some known PHP or IIS path to exploit. I see a remote system downloading a bunch of valid (200) URLs as fast as it can (mirroring/web-sucking? archiving?). While it may feel like an attack when a client sucks your bandwidth (or CPU, if those paths are dynamically generated) dry, I'd be hesitant to call it one, definitely not a flood, and I wouldn't turn to fail2ban first for protection.

I think fail2ban is a great tool, but it's a reactive log-scanning tool that looks for patterns of badness. For proactive solutions I look to the attacked service or to firewall rules. For example, if the issue is how much and how fast, look into an Apache module such as mod_ratelimit, mod_security, mod_evasive, mod_limitipconn, mod_qos, or one of many others. You'd have to look at them and decide which meets your needs. You could also configure Apache to deny requests based on user agent. You can use iptables rules to rate-limit new connections, or to ban a specific IP outright for being a bad actor. If you have iptables rules for rate limiting and Apache settings that limit how many requests can be made in one connection (MaxKeepAliveRequests), you will give fail2ban a little breathing room.

Back to your load issue. If /2006/12/ isn't dynamically generated, I'd take a hard look at why requesting it causes so much load. Static content should go out the door with little load. Maybe your MaxClients and other settings are eating up all your memory and you're bogged down swapping. Maybe /2006/12/ is dynamic, in which case, if this were valid traffic, you would be looking into caching.
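To make the Apache side concrete, here is a minimal sketch for Apache 2.4, assuming the mod_setenvif and mod_authz_core modules are loaded. The User-Agent pattern is taken from the log excerpt above and may be too broad for a real site (plenty of legitimate old clients send similar strings), so treat it as an illustration, not a recommendation:

```apache
# Sketch only -- adjust for your own setup before using.

# Cap how many requests one connection can make before it must reconnect,
# so per-connection abuse also shows up as new connections that iptables
# rate limits (and fail2ban) can see.
KeepAlive On
MaxKeepAliveRequests 50

# Mark requests whose User-Agent exactly matches the suspicious client
# from the logs, then deny them.
SetEnvIf User-Agent "^Mozilla/4\.0 \(compatible;\)$" bad_bot
<Location "/">
    <RequireAll>
        Require all granted
        Require not env bad_bot
    </RequireAll>
</Location>
```

Denied requests would then show up as 403s in your access log, which is also an easy pattern for a fail2ban filter to match later if you want belt and suspenders.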
Even if you fix the load issue, though, a bad actor can use up all of MaxClients so that your website isn't available to other users unless you have some kind of protection against that. Often just banning the bad actor(s) for a while is enough to give them the message to go away, but an Apache module designed for this kind of abuse protection would help with the current issue and be ready for the next time.

Best of luck,

-- 
Jacob Anawalt
Gecko Software, Inc.
[email protected]
435-752-8026

_______________________________________________
Fail2ban-users mailing list
[email protected]
https://lists.sourceforge.net/lists/listinfo/fail2ban-users
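If you do still want fail2ban to handle the repeat offenders, a sketch of what a filter and jail might look like follows. The filter name, regex, thresholds, and log path are all assumptions for illustration — this is not a stock fail2ban filter, and you should verify the regex against your real access log with fail2ban-regex before enabling it:

```ini
# Hypothetical /etc/fail2ban/filter.d/apache-badbot-ua.conf
# Matches successful GETs from the suspicious User-Agent seen in the logs.
[Definition]
failregex = ^<HOST> - - \[[^\]]+\] "GET [^"]* HTTP/1\.[01]" 200 \d+ "-" "Mozilla/4\.0 \(compatible;\)"$
ignoreregex =

# Addition to /etc/fail2ban/jail.local (thresholds are guesses: more than
# 30 matching hits in 60 seconds earns a one-hour ban; tune to taste).
[apache-badbot-ua]
enabled  = true
port     = http,https
filter   = apache-badbot-ua
logpath  = /var/log/apache2/access.log
maxretry = 30
findtime = 60
bantime  = 3600
```

Something like `fail2ban-regex /var/log/apache2/access.log /etc/fail2ban/filter.d/apache-badbot-ua.conf` will show you how many lines the filter would have matched, which is a good sanity check that it catches the abuser without catching your regular visitors.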
