Emil Pedersen wrote:
> Joerg Johannes wrote:
> > Hi list
> >
> > How do I make wget download images? I used
> >   wget -r -l1 -k http://www.somesite.anywhere/the/directory/iwant/index.htm
> > This copies all HTML files that are linked from this index file to my
> > local computer. I'm only missing the images (this is important, because
> > it is a graphics tutorial...)
>
> Hmm. I've used wget to fetch picture collections as well as other
> sites. The line above _should_ work (fetch the images too). Have you
> checked that the URLs for the images are "real" URLs and not CGI links
> or similar? wget does not work with these things. JavaScript will also
> confuse it, afaik.
>
> There are also some sites that take explicit action to prevent you
> from saving the images; some of them make it really hard to do so. The
> worst I've seen made me watch all the pictures in Netscape (Opera, ...)
> and grep the images out of .netscape/cache/* . Bloody hell getting the
> right filenames then...
>
> Good luck,
> Emil
Obviously, wget did not fetch the images. What I did instead was load the
site into Netscape Composer and save it locally. Netscape saves the images
too; only the links between e.g. chapter1 <-> chapter2 will not work (well,
I can survive this...)

Thank you anyway
joerg

PS: the mirror option "-m" to wget makes it copy the whole site
(www.somesite.anywhere), not only the selected page plus level-1 link
depth... I killed that process very soon.

-- 
Did you know that if you play a Windows 2000 CD backwards, you will hear
the voice of Satan? That's nothing! If you play it forward, it'll install
Windows 2000.
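PPS: For the archives: if your wget build has the -p (--page-requisites)
option, that is the flag meant for exactly this problem; it tells wget to
also fetch the inline images, stylesheets, etc. needed to display each
page. A sketch (the URL is just the placeholder from the original mail,
not a real site):

```shell
# -r -l1 : recurse one level of links from index.htm
# -k     : convert links so the local copy is browsable
# -p     : also download page requisites (inline images, CSS, ...)
wget -r -l1 -k -p http://www.somesite.anywhere/the/directory/iwant/index.htm
```

This still will not help with CGI-generated image URLs or JavaScript
links, as Emil noted.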