On Mon, Feb 14, 2000 at 10:20:03AM -0600, Zaigui Wang wrote:
>
> I would like to use a tool that can go through an entire web site and
> figure out those broken links. Does anybody have a recommendation?
wget can do this using the "--spider" option:
    `--spider'
         When invoked with this option, Wget will behave as a Web "spider",
         which means that it will not download the pages, just check that
         they are there.  You can use it to check your bookmarks, e.g. with:

              wget --spider --force-html -i bookmarks.html

         This feature needs much more work for Wget to get close to the
         functionality of real WWW spiders.

(from the wget texinfo documentation)
To check a whole site you'll have to experiment a bit with other options
(e.g. --recursive), I suppose; see the sketch below for a starting point.
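I haven't tried this on a full site myself, so treat it only as a rough
sketch - replace http://www.example.com/ with your own site and adjust
the recursion depth to taste:

    # crawl the site recursively, only checking links instead of saving pages,
    # and keep a terse log of what wget finds
    wget --spider --recursive --level=5 --no-verbose -o spider.log http://www.example.com/

    # broken links should then show up in the log as 404 / "Not Found" entries
    grep -i 'not found' spider.log

Given the caveat in the manual quoted above, how well --spider combines
with --recursive may depend on your wget version, so expect to tinker.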
HTH,
Thomas
--
"Look, Ma, no obsolete quotes and plain text only!"
Thomas Ribbrock | http://www.bigfoot.com/~kaytan | ICQ#: 15839919
"You have to live on the edge of reality - to make your dreams come true!"