On Sat, Aug 16, 2003 at 05:42:23PM -0400, Reaz Baksh wrote:
> Hello,
>
> Can someone help me with some information?
>
> I'm looking for a script that will allow me to get web pages, copy some
> of the information from them, and then create a new page with the
> information I pulled.
>
> I saw one that used, I think, RSS, but I lost it.
You might want to take a look at wget. It will recursively fetch whole
pages for you, including links, images, whatever, any way you like. Then
you can edit the source by hand, or maybe try Amaya, a browser/editor
(one that doesn't tolerate crappy code). I'm not sure if this is what
you're looking for, though.

David

-- 
Happy Birthday, Debian! August 16, 1993
http://www.linuxplanet.com/linuxplanet/print/4959/
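P.S. If you'd rather script the "pull some information out and build a
new page" step instead of editing by hand, here is a rough sketch using
Python's standard html.parser. This is just one way to do it, not the RSS
tool you remember; the inline SAMPLE string stands in for a file you'd
have saved with something like "wget -p http://example.org/":

```python
#!/usr/bin/env python
# Hedged sketch: pull the title and links out of a downloaded page and
# build a new page from them. SAMPLE stands in for a wget-saved file.
from html.parser import HTMLParser

SAMPLE = """<html><head><title>Debian News</title></head>
<body><a href="http://www.debian.org/">Debian</a>
<a href="http://lists.debian.org/">Lists</a></body></html>"""

class Extractor(HTMLParser):
    """Collects the <title> text and every href on the page."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.links = []
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self._in_title = True
        elif tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

parser = Extractor()
parser.feed(SAMPLE)

# Create a new page from the information we pulled.
new_page = (
    "<html><head><title>%s (digest)</title></head><body>%s</body></html>"
    % (
        parser.title,
        "".join('<a href="%s">%s</a><br>' % (u, u) for u in parser.links),
    )
)
print(new_page)
```

In real use you'd open() each file wget saved and feed() its contents to
the parser instead of the SAMPLE string.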