While your solution is feasible, it would still consume processor time and memory because you are doing this at a very high level. You would be better off solving it at a lower level with proper use of a firewall. What you have described sounds like a script-kiddie attempt at denial of service or brute-force cracking.
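As a sketch of that firewall approach (assuming iptables on Linux; the chain name, port, and thresholds below are illustrative, not prescriptive), the "recent" match module can throttle repeated new connections from a single source:

```shell
# Track each new connection to port 80 by source address,
# then drop any source that makes more than 4 new connections
# within 60 seconds. Run as root; adjust thresholds to taste.
iptables -A INPUT -p tcp --dport 80 -m state --state NEW \
         -m recent --set --name HTTP_LIMIT
iptables -A INPUT -p tcp --dport 80 -m state --state NEW \
         -m recent --update --seconds 60 --hitcount 5 \
         --name HTTP_LIMIT -j DROP
```

This works regardless of whether the client honours cookies or sessions, which is exactly why it beats an application-level check for this kind of abuse.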
Andre Dubuc wrote:
Hi,
Recently, a 'user' repeatedly attempted to access a restricted area of my site, requesting the same URL over a span of five hours (probably by script). A massive log file was generated. I would like to ban such behavior by limiting the number of successive GETs a user can make (say, 4 attempts) before appropriate action is taken.
As a temporary measure (until I can figure out a better way), the URL in question was disabled.
What I'd like to do, on a per-file basis using $_SESSION, is combine the IP address with a counter that records the number of times the file was accessed, and limit the number of successive GETs that can be made before the file is no longer accessible.
In a script that checks for bad words, I have used:
<?php
// Note: == (comparison), not = (assignment), so the condition is tested.
if ($_SESSION['text'] == "badwords") {
    $_SESSION['attempt'] = 1;
    header("Location: unwanted.php");
}
?>
[In the file unwanted.php I checked for $_SESSION['attempt'] == 1 and booted the user if the condition was met.]
However, using this approach I cannot increment that counter without resorting to a file get/put scheme. Is there a way around this? Is there a better approach?
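A session counter can in fact be incremented across requests without any file I/O, since PHP persists $_SESSION between page loads. A minimal sketch, assuming a threshold of 4 attempts (the helper name, limit, and response below are illustrative; note that a scripted attacker that discards cookies gets a fresh session on every request, so pairing this with the IP address, as suggested above, is still advisable):

```php
<?php
// Hypothetical helper: has this session exceeded the allowed attempts?
function too_many_attempts(int $count, int $limit = 4): bool
{
    return $count > $limit;
}

session_start();

// $_SESSION survives between requests, so the counter can be
// incremented directly -- no file get/put schema needed.
if (!isset($_SESSION['attempt'])) {
    $_SESSION['attempt'] = 0;
}
$_SESSION['attempt']++;

if (too_many_attempts($_SESSION['attempt'])) {
    header('HTTP/1.0 403 Forbidden');
    exit('Too many attempts.');
}
?>
```

The same counter could be keyed on $_SERVER['REMOTE_ADDR'] instead of (or in addition to) the session, but as noted above, a dynamic address makes any IP-based scheme leaky.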
I've tried .htaccess, but the user in question has a dynamic IP address.
Any help appreciated. Tia, Andre
--
Raditha Dissanayake.
------------------------------------------------------------------------
http://www.radinks.com/sftp/ | http://www.raditha.com/megaupload
Lean and mean Secure FTP applet with Graphical User Interface. Just 150 KB.
Mega Upload - PHP file uploader with progress bar.
--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php