Issue 3 can be handled if you remember to run script(1) first, then do your
complex ftp session, and exit out of both at the end. All you need do then
is edit the typescript file that script makes for you, turn it into a short
shell script, and make it executable. I use the name surf for my scripts
when I need to do this, so a command line interface can work easily enough
with a little forethought and planning. You can also run wget with the -b
switch to have it download in the background, and tail the wget-log file
every so often to check on progress.
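For what it's worth, here's roughly how both approaches look in practice;
the host, login, and file names below are placeholders, and the exact ftp
commands depend on your client:

    # 1. Record an interactive session; script(1) logs it to 'typescript'
    script
    ftp ftp.example.invalid
    # ... log in, cd around, get files, quit ftp ...
    exit                # ends the script(1) recording

    # 2. Edit the typescript down into a reusable script, e.g. 'surf':
    #!/bin/sh
    ftp -n ftp.example.invalid <<'EOF'
    user anonymous me@example.invalid
    binary
    cd /pub/iso
    get big.iso
    bye
    EOF

    # 3. Or let wget do the work: background it and watch its log
    wget -b -c -i wget.lst
    tail -f wget-log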
On Sat, 28 Jul 2007, Jeff D wrote:
On Sat, 28 Jul 2007, Douglas Allan Tutty wrote:
I'm on dialup and often access the internet via a slow computer by
sshing into my fast computer (which has the modem).
Right now, if I want to download something like an iso file via ftp
(there being no rsync mirror available), I put the URL in a file, e.g.
wget.lst, and from a shell run $ wget -c -i wget.lst.
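For reference, the whole setup is just this (the URL is a placeholder):

    # wget.lst holds one URL per line
    echo 'ftp://ftp.example.invalid/pub/big.iso' >> wget.lst

    # -c resumes partial files, -i reads the URL list
    wget -c -i wget.lst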
I let it run until we need the phone line or it's time to turn off the
computer for the night (don't ask), interrupt it, and then start again
later.
It works reasonably well, but there are some issues:
1. After so many interruptions, an error has often crept in and the
md5sum doesn't match. Without rsync, I don't know how to fix a file that
is the correct length but doesn't match.
2. It would be nice to have a queue that is persistent over
reboots.
3. It would be nice to have a curses interface, like mc, that lets me
browse to the correct file, then tag it for downloading, which puts it
into the above queue.
4. Have something that runs from /etc/ppp/ip-up.d to start the
download, but using only spare bandwidth, and something in ip-down.d to
stop the download; something like the sketch below.
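(The script name and user here are made up, and --limit-rate is only a
crude stand-in for "use spare bandwidth":

    #!/bin/sh
    # /etc/ppp/ip-up.d/90dl : resume the queued download when the link comes up
    su - doug -c 'wget -b -c --limit-rate=2k -i ~/wget.lst'

    #!/bin/sh
    # /etc/ppp/ip-down.d/90dl : stop the download when the link drops
    pkill -u doug -x wget
)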
Issues 2, 3, and 4 are only simple programming, and I wonder if such an
app already exists (I couldn't find anything in aptitude; most hits are for X).
Issue 1 is tricky. I haven't come across any ftp client that claims to
do it, so it may not be possible given the nature of ftp itself.
Issue 3 is important. A straight CLI doesn't help if I have to write
down what command I issued to get a file and then retype it every time I
start the computer again; I may as well stick with wget.
Any ideas? How do others handle ftp downloads that may take a week of
phone time?
Thanks,
Doug.
Hm, well, with a little bit of scripting there's ncftpget and ncftpbatch.
ncftp covers point 1 pretty well from my testing. For 2, ncftpbatch almost
does this: you create your queue and it processes it, though only in the
order you submit the jobs. If/when the process gets terminated, a simple
ncftpbatch -d starts the processing back up. You might also be able to
write some scripts to manage the spool directory.
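Something like this, roughly; the URL is a placeholder, and it's worth
checking the ncftp man pages since flags vary a bit between versions:

    # Queue a download instead of fetching it right away;
    # -b submits the job to the spool directory (~/.ncftp/spool)
    ncftpget -b ftp://ftp.example.invalid/pub/big.iso

    # See what's waiting in the queue
    ncftpbatch -l

    # Process the queue in the background; rerun after a reboot or a
    # dropped connection and it picks up where it left off
    ncftpbatch -d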
Jeff
-+-
8 out of 10 Owners who Expressed a Preference said Their Cats Preferred
Techno.