Hi all,
using an ACL like
acl wordsex urlpath_regex sex
will most likely annoy your users, as a lot of URLs contain the word "sex" but no pornographic content, e.g. www.sextant.com, www.essex.ac.uk,
and so on.
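One way to reduce (though not eliminate) such false positives is to require delimiters around the word. This is only a sketch, and the exact pattern is an assumption; it will still misfire in some cases:

```
# Match "sex" only when surrounded by common URL separators
# (-i makes the match case-insensitive). This still has false
# positives/negatives; a maintained blacklist is more reliable.
acl wordsex urlpath_regex -i (^|[-./_])sex([-./_]|$)
http_access deny wordsex
```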
IMHO, ACLs with large blacklists may slow Squid down; I prefer squidGuard.
We do content filtering with squidGuard, using combined sets of URL lists (blacklists/whitelists) from the squidGuard project, the chastity list, our own robot, AND a self-made whitelist, since all blacklists may contain false positives (e.g. mass-hosting services). Redirector service time is not measurable (0 ms) on a 2 GHz machine.
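For reference, a minimal sketch of such a setup. All file paths and list names here are assumptions, not our exact configuration:

```
# squid.conf: hand each request to squidGuard for checking
# (older Squid versions use "redirect_program" instead)
url_rewrite_program /usr/bin/squidGuard -c /etc/squid/squidGuard.conf
url_rewrite_children 8
```

```
# /etc/squid/squidGuard.conf (sketch)
dbhome /var/lib/squidGuard/db
logdir /var/log/squidGuard

dest porn {
    domainlist porn/domains
    urllist    porn/urls
}

dest whitelist {
    domainlist whitelist/domains
}

acl {
    default {
        # whitelist wins over the blacklist, then everything else passes
        pass whitelist !porn all
        redirect http://localhost/blocked.html
    }
}
```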
Regards, Hendrik
Muthukumar wrote:
I want to block some web sites with sexual content.
Do you have a fixed set of URLs of that kind? If so, the task is easy.
For example, if any user tries to load www.sexy.com, Squid should not let them see this site. I want to filter using regular expressions.
If you don't want to allow users to access URLs whose path contains the word "sex", then set up ACLs like this:
acl wordsex urlpath_regex sex
acl blockuser src <ip-address>/<netmask>
http_access deny blockuser wordsex
If you want to block a specific URL for a user:
acl sexurl url_regex ^http://www\.sexy\.com
or
acl sexurl dstdomain .sexy.com
http_access deny blockuser sexurl
It is good to keep the set of URLs you want to block in one place, and to apply the blocking either to specific users or to everyone.
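As a sketch of that approach (the file path and domains below are assumptions), you can keep the blocked domains in an external file and reference it from squid.conf, which is easier to maintain than many regex ACLs:

```
# /etc/squid/blocked_domains.txt -- one domain per line;
# a leading dot also matches all subdomains:
#   .sexy.com
#   .example-blocked.com

# squid.conf
acl blockedsites dstdomain "/etc/squid/blocked_domains.txt"
http_access deny blockedsites
```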
Regards, Muthukumar.
