On Aug 11, 2005, at 4:06 PM, Evert | Collab wrote:

First hit on Google:
http://www.searchengineworld.com/robots/robots_tutorial.htm
Search engines check for a robots.txt file on your site; in it you can specify that some or all search engines shouldn't index your site.

I know what robots.txt is; I meant how you would use it to cloak the site. Do you mean serving robots.txt from a PHP script that logs the IP of each request to a db, and then using that db to decide whether or not to cloak the rest of the site?
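Something along these lines, maybe? A minimal sketch, not tested against any real crawler: an Apache RewriteRule (e.g. RewriteRule ^robots\.txt$ robots.php [L]) sends robots.txt requests to a PHP script that records the requesting IP, and every other page checks that table before deciding what to serve. The SQLite file path, table name and include filenames below are just placeholders.

<?php
// robots.php -- served in place of robots.txt via the RewriteRule above.
// Logs the requesting IP, then emits a normal robots.txt body.
$db = new PDO('sqlite:/path/to/crawlers.sqlite');
$db->exec('CREATE TABLE IF NOT EXISTS crawler_ips (
               ip       TEXT PRIMARY KEY,
               last_hit INTEGER
           )');

// Remember (or refresh) this IP as a likely crawler.
$stmt = $db->prepare('REPLACE INTO crawler_ips (ip, last_hit) VALUES (?, ?)');
$stmt->execute(array($_SERVER['REMOTE_ADDR'], time()));

header('Content-Type: text/plain');
echo "User-agent: *\n";
echo "Disallow:\n";
?>

<?php
// In any page you want to cloak: has this IP ever fetched robots.txt?
$db   = new PDO('sqlite:/path/to/crawlers.sqlite');
$stmt = $db->prepare('SELECT 1 FROM crawler_ips WHERE ip = ?');
$stmt->execute(array($_SERVER['REMOTE_ADDR']));

if ($stmt->fetchColumn()) {
    // Looks like a crawler -- serve the search-engine version.
    include 'page_for_robots.php';
} else {
    include 'page_for_humans.php';
}
?>

Obvious caveats: only crawlers that actually fetch robots.txt get caught, crawler IPs change, and browsers behind the same proxy as a crawler would get the cloaked page too.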

--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php
