---- tedd <[EMAIL PROTECTED]> wrote: 
> Hi gang:
> 
> How do you spider a remote web site in php?
> 
> I get the general idea, which is to take the root page, strip out the 
> links and repeat the process on those links. But, what's the code? 
> Does anyone have an example they can share or a direction for me to 
> take?
> 
> Also, is there a way to spider through a remote web site gathering 
> directory permissions?
> 
> I know there are applications, such as SiteSucker, that will travel 
> a remote web site looking for anything that it can download and if 
> found, do so. But is there a way to determine what the permissions 
> are for those directories?
> 
> If not, can one attempt to write a file and record the 
> failures/successes (0777 directories)?
> 
> What I am trying to do is to develop a way to test if a web site is 
> secure or not. I'm not trying to develop evil code, but if it can be 
> done then I want to know how.
> 
> Thanks and Cheers,
> 
> tedd
> 
> -- 
> -------
> http://sperling.com  http://ancientstones.com  http://earthstones.com

In one word:  CURL

In another word: WGET

Both are pretty effective and give much the same results. The advantage of 
cURL is that you can drive it from inside your PHP script and pass extra 
things along with each request (user:pass credentials, custom headers, 
cookies), which is clumsier to arrange with wget since it runs as an 
external program.
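
Here's a rough sketch of the fetch-strip-links-repeat loop using cURL
and DOMDocument. Treat it as a starting point rather than production
code: the start URL is a placeholder, and it doesn't resolve relative
URLs or stay on one host, which you'd want to add before pointing it
at a real site.

<?php
// Fetch one page; returns the HTML as a string, or false on failure.
function fetchPage($url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
    // If the site needs credentials, cURL will pass them along:
    // curl_setopt($ch, CURLOPT_USERPWD, 'user:pass');
    $html = curl_exec($ch);
    curl_close($ch);
    return $html;
}

// Strip the links out of a page.
function extractLinks($html) {
    $links = array();
    $dom = new DOMDocument();
    @$dom->loadHTML($html);   // @ silences warnings from sloppy markup
    foreach ($dom->getElementsByTagName('a') as $a) {
        $href = $a->getAttribute('href');
        if ($href != '') {
            $links[] = $href; // real code: resolve relative URLs here
        }
    }
    return $links;
}

// Breadth-first crawl: fetch, harvest links, repeat on unseen URLs.
$queue    = array('http://example.com/'); // placeholder start page
$seen     = array();
$maxPages = 50;  // safety valve so a test run can't wander off
while (!empty($queue) && count($seen) < $maxPages) {
    $url = array_shift($queue);
    if (isset($seen[$url])) continue;
    $seen[$url] = true;
    $html = fetchPage($url);
    if ($html === false) continue;
    foreach (extractLinks($html) as $link) {
        if (!isset($seen[$link])) {
            $queue[] = $link;
        }
    }
}
?>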
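
As for the directory permissions question: HTTP won't show you
filesystem modes (0777 and friends) directly. The closest you can get
from outside is what you suggested yourself: attempt a write and
record the result. Below is a sketch along those lines with a made-up
test filename; only run it against a site you have permission to test.

<?php
// Try a harmless PUT into a directory and report the HTTP status.
// $dirUrl should end with a trailing slash.
function probeWrite($dirUrl) {
    $ch = curl_init($dirUrl . 'spider-test.txt'); // made-up filename
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_CUSTOMREQUEST, 'PUT');
    curl_setopt($ch, CURLOPT_POSTFIELDS, 'test');
    curl_exec($ch);
    $code = curl_getinfo($ch, CURLINFO_HTTP_CODE);
    curl_close($ch);
    return $code;
}
?>

A 2xx answer means the server accepted the write, which is the red
flag you're looking for; 403 or 405 is the normal, secure refusal.
Most servers refuse PUT outright, so a clean result here still isn't
proof that the directories aren't 0777 on disk.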

HTH,
Wolf
