RE: wget

2003-07-11 Thread Chris W. Parker
Tom Hosiawa wrote: For god sakes man! Trim your posts!! ARGH! [snipped 128 lines of text that's not relevant to my post] Chris. -- redhat-list mailing list unsubscribe mailto:[EMAIL PROTECTED] https://www.redhat.com/mailman/listinfo/redhat-list

Re: wget

2003-07-11 Thread Tom Hosiawa
> I've got wget-1.8.2-9. It doesn't always happen, and when it does, it > seems to be those sites where the download happens "automatically" when > you get to the page. You know "here's the link, if the download doesn't > start in the next ten

Re: wget

2003-07-11 Thread Benjamin J. Weiss
I've got wget-1.8.2-9. It doesn't always happen, and when it does, it seems to be those sites where the download happens "automatically" when you get to the page. You know "here's the link, if the download doesn't start in the next ten seconds, click here"

Re: wget

2003-07-11 Thread Jake Johnson
I am using GNU Wget 1.8.1 and it works fine. Regards, Jake Johnson [EMAIL PROTECTED] __ Plutoid - http://www.plutoid.com - Shop Plutoid for the best prices on Rims, Car Audio, and Performance Parts. On Fri, 11 Jul 2003, Tom

Re: wget

2003-07-11 Thread Tom Hosiawa
> I didn't have any problems either. What version of wget did you try? > > > Regards, > Jake Johnson > [EMAIL PROTECTED] > > __ > Plutoid - http://www.plutoid.com - Shop Plutoid for the best

Re: wget

2003-07-10 Thread Jake Johnson
I didn't have any problems either. What version of wget did you try? Regards, Jake Johnson [EMAIL PROTECTED] __ Plutoid - http://www.plutoid.com - Shop Plutoid for the best prices on Rims, Car Audio, and Performance

Re: wget

2003-07-10 Thread Tom Hosiawa
> I've had issues with wget, where the download link is actually one to a > re-directed mirror site. In such cases, I have to use Mozilla or > somesuch to find the actual download link, then use that for the wget. > For some reason the link following doesn't always work f

Re: wget

2003-07-10 Thread Benjamin J. Weiss
I've had issues with wget, where the download link is actually one to a re-directed mirror site. In such cases, I have to use Mozilla or somesuch to find the actual download link, then use that for the wget. For some reason the link following doesn't always work for me. YMMV. Ben

wget

2003-07-10 Thread Tom Hosiawa
I'm trying to download an iso using wget but nothing seems to happen. If I download it through mozilla it starts just fine (so it's not a dead link), but this is the output of wget and I'm not sure what it means: wget http://linuxiso.org/download.php/327/KNOPPIX_V3.2-2003-06-06-EN.iso
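For pages like this, where download.php hands back an HTML landing page or redirect instead of the ISO itself, one workaround (a sketch only; the page layout and the grep pattern are guesses, not something the thread confirms) is to fetch the page to stdout and scrape out the real mirror URL before running wget a second time:

```shell
# Fetch the landing page and pull out the first absolute URL ending
# in .iso, then hand that to a second wget.
# -q silences progress output; -O - writes the page to stdout.
real_url=$(wget -q -O - 'http://linuxiso.org/download.php/327/KNOPPIX_V3.2-2003-06-06-EN.iso' \
  | grep -o 'http://[^"]*\.iso' | head -n 1)
wget "$real_url"
```

Note that `grep -o` is a GNU grep extension; on older systems a sed expression would be needed instead.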

Re: WGET information needed.

2003-06-18 Thread Willem van der Walt<[EMAIL PROTECTED]>
Hi, wget -b http://www.whatever.com/file.xxx will fetch file.xxx from the site and store it in the default directory. The -b makes wget go to the background. It will log the results of the transfer in wget-log in the default directory. Do man wget for the other options. You can do a lot more

RE: WGET information needed.

2003-06-17 Thread Chris W. Parker
[EMAIL PROTECTED] <mailto:[EMAIL PROTECTED]> wrote: > i need help on wget... > pls let me know the basic commands... > also inform me the default directory > where it stores the file.. here is the most basic way to use it. say you have a file on a website located a
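The basic usage being described can be sketched like this (www.example.com and the file names are placeholders):

```shell
# Fetch a single file; by default wget saves it in the current
# working directory under the last component of the URL path
# (here: file.txt).
wget http://www.example.com/file.txt

# Save under a different name or path with -O:
wget -O /tmp/copy.txt http://www.example.com/file.txt
```

As Willem's reply in the thread above notes, adding -b pushes the transfer into the background and logs progress to wget-log in the same directory.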

RE: WGET information needed.

2003-06-17 Thread Michael Kalus
wget --help And you shall know. > -Original Message- > From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED] > Sent: Tuesday, June 17, 2003 1:03 PM > To: [EMAIL PROTECTED] > Subject: WGET information needed. > > > i need help on wget... > pls let me know t

Re: WGET information needed.

2003-06-17 Thread Jeff Kinz
On Tue, Jun 17, 2003 at 11:03:26PM +0600, [EMAIL PROTECTED] wrote: > i need help on wget... > pls let me know the basic commands... > also inform me the default directory > where it stores the file.. Umm - did you read the man page for wget yet? "wget" is the basic. comma

WGET information needed.

2003-06-17 Thread kototuku
i need help on wget... pls let me know the basic commands... also inform me the default directory where it stores the file.. sharif.

Re: Help with wget

2003-03-19 Thread nate
> Would some one please help me with the syntax to do this? I would prefer > to have wget execute the grep command or pipe the pages directly to grep > so that I don't have the website mirrored on my system. see the info page for wget. try wget -qq http://some.site/blahblawhat
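nate's wget-into-grep suggestion, spelled out as a sketch (the URL and pattern are placeholders): because the page goes straight down the pipe, nothing is written to disk and no mirror is left behind.

```shell
# Fetch a page to stdout and filter it; -q silences wget's
# progress output, -O - sends the document to standard output.
wget -q -O - http://some.site/page.html | grep 'pattern'
```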

Help with wget

2003-03-19 Thread revooh_c
I am looking for some help with the wget command. I want to pull some information from a website, but only a small portion. Unfortunately, the part I need is fairly difficult to get to, so I'd like to do this in multiple steps. The first step is to use wget to pull pages from the

create mirror with wget or other tools

2002-05-20 Thread Josep M.
Hello! I tried to make a mirror of some ftp's using wget. I execute this with the options "-nr -d -r -N -l inf --passive-ftp --no-host-directories" but it doesn't delete old files that were deleted in the source directory. I tested with the "--mirror" option, and it doesn't delete files that i

Re: wget?

2002-05-13 Thread Anand Buddhdev
On Mon, May 13, 2002 at 01:53:49PM -0700, daniel wrote: > tried that > but how then do i access the data being piped into perl? > > right now, this is all the perlscript says: > > #!/usr/bin/perl -w > > for (my $i = 0; $i <= scalar(@ARGV) - 1; $i++) { > print "$ARGV[0]\n"; > } Thi

Re: wget?

2002-05-13 Thread daniel
Original Message - > On Mon, May 13, 2002 at 01:19:58PM -0700, daniel wrote: > > > you know how you can use wget to take the contents of a webpage and print > > them to stdout? how could i then send that output into a perlscript > > quietly? i've gotten

Re: wget?

2002-05-13 Thread Anand Buddhdev
On Mon, May 13, 2002 at 01:19:58PM -0700, daniel wrote: > you know how you can use wget to take the contents of a webpage and print > them to stdout? how could i then send that output into a perlscript > quietly? i've gotten this far: > > wget "www..." --quie

wget?

2002-05-13 Thread daniel
you know how you can use wget to take the contents of a webpage and print them to stdout? how could i then send that output into a perlscript quietly? i've gotten this far: wget "www..." --quiet --output-document=- but it's what comes next that's got me how do i kee

wget mirroring

2002-04-26 Thread Devon Harding - GTHLA
Does anyone have any methods of mirroring web/ftp sites using wget? Also, if mirroring ftp sites, how would you mirror the accounts? _ Devon Harding System Administrator Gilat Latin America 954-858-1600 [EMAIL PROTECTED

Re: wget searching my proxy server

2002-04-19 Thread Bill Crawford
On Fri, 19 Apr 2002, JUANG wrote: > thank you Bill, it worked. > i installed wget using rpm, in boxA and boxB. in boxA wget is always try to > find proxy and boxB no, do you know why? I don't, but you should be able to find out. Just grep for "proxy"

Re: wget auth probs

2002-04-19 Thread Bret Hughes
On Fri, 2002-04-19 at 02:39, [EMAIL PROTECTED] wrote: > On 19 Apr 2002, Bret Hughes wrote: > > > I have a script that I am trying to port to a new content provider for > > up but wget is puking with the following: > > Have you tried the wget parameters >

Re: wget searching my proxy server

2002-04-19 Thread JUANG
thank you Bill, it worked. i installed wget using rpm on boxA and boxB. on boxA wget always tries to find the proxy and on boxB it doesn't, do you know why? doing "unset http_proxy" is for when we install from a tarball, isn't it? peace, JUANG - Original Message - From: "Bi

Re: wget searching my proxy server

2002-04-19 Thread Bill Crawford
On Fri, 19 Apr 2002, JUANG wrote: > Hi all, > I try to download used wget, and wget is searching my proxy server and wget > can't find the proxy 'couse my IP proxy server was changed. "unset http_proxy" or, if you have it in a config file, remove it. > How

wget searching my proxy server

2002-04-19 Thread JUANG
Hi all, I tried to download using wget, and wget is searching for my proxy server; wget can't find the proxy 'cause my proxy server's IP was changed. How do I use wget without it searching for the proxy? if i use wget --proxy=no it does not work. but if i use lynx, the lynx browser didn't sea

Re: wget auth probs

2002-04-19 Thread Ed . Greshko
On 19 Apr 2002, Bret Hughes wrote: > I have a script that I am trying to port to a new content provider for > up but wget is puking with the following: Have you tried the wget parameters --http-user=user --http-passwd=password as described in "man

wget auth probs

2002-04-18 Thread Bret Hughes
I have a script that I am trying to port to a new content provider for up but wget is puking with the following: [bhughes@bretsony scripts]$ wget -S -N -v http://username:[EMAIL PROTECTED]/elevcomm/ECLocalNews.asp --01:56:56-- http://username:@www.tulsaworld.com/elevcomm/ECLocalNews.asp
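Per Ed Greshko's reply in this thread, the credentials can be passed as wget options rather than embedded in the URL; a sketch with placeholder user, password, and host:

```shell
# --http-user/--http-passwd supply HTTP basic-auth credentials,
# avoiding the user:pass@host form in the URL itself.
# -S prints server headers, -N enables timestamping.
wget -S -N --http-user=username --http-passwd=password \
  http://www.example.com/elevcomm/ECLocalNews.asp
```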

Re: wget -g on problem

2002-03-13 Thread Pete Peterson
> Message: 9 > From: "Andrew Judge" <[EMAIL PROTECTED]> > To: "Redhat" <[EMAIL PROTECTED]> > Subject: wget -g on problem > > I am having problems with wget -g on. Every time I try this, I get an error > as follows: > > [root@iv

Re: wget -g on problem

2002-03-13 Thread Emmanuel Seyman
On Wed, Mar 13, 2002 at 01:05:14PM -0500, Andrew Judge wrote: > > Anyone got any tips. It was working at one point. Does using the "-v" option give any insight? Emmanuel

wget -g on problem

2002-03-13 Thread Andrew Judge
I am having problems with wget -g on. Every time I try this, I get an error as follows: [root@ivan sunbiz]# wget -g on -nc ftp://w.x.y.z/*.dat --13:01:30-- w.x.y.z/*.dat => `.listing' Connecting to w.x.y.z ... Connection to w.x.y.z refused. unlink: No such file or directory
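Two things worth checking with an FTP glob in the wget of that era (a sketch; host and path are placeholders): quote the URL so the shell cannot expand the * before wget sees it, and note that "Connection refused" usually points at the server or a firewall rather than at globbing itself.

```shell
# Quote the wildcard so wget, not the shell, performs the globbing;
# -g on forces wget's FTP globbing, -nc skips files already fetched.
wget -g on -nc 'ftp://ftp.example.com/sunbiz/*.dat'
```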

Re: retrieving https was:Re: sending data from wget into perl?

2002-02-18 Thread Cameron Simpson
On 20:16 18 Feb 2002, Charles Galpin <[EMAIL PROTECTED]> wrote: | See my other post regarding fetching web pages using perl. I haven't | done it myself, but I'm pretty sure there will be SSL enabled versions | of the libwww-perl modules on cpan.org I have to confess when I hacked up my https stuf

Re: retrieving https was:Re: sending data from wget into perl?

2002-02-18 Thread Charles Galpin
> > > dbrett wrote: > > > > > > >Does anybody know of a program which will get https pages? See my other post regarding fetching web pages using perl. I haven't done it myself, but I'm pretty sure there will be SSL enabled versions of the libwww-perl modules on cpan.org charles

Re: sending data from wget into perl?

2002-02-18 Thread Charles Galpin
On Mon, 2002-02-18 at 14:19, daniel wrote: > i'm not sure if this is off topic or not > but i think that's my problem... > here's what i want to do > > 1. grab a page off the web > 2. process it with a perl script > > that's it > i thought

Re: retrieving https was:Re: sending data from wget into perl?

2002-02-18 Thread dbrett
This is similar to what I was hoping to accomplish: pull information off a web page, put it into a database, and then pull it back out into web pages. On Mon, 18 Feb 2002, Ed Wilts wrote: > On Mon, Feb 18, 2002 at 12:30:21PM -0800, David Talkington wrote: > > > > dbrett wrote

Re: retrieving https was:Re: sending data from wget into perl?

2002-02-18 Thread dbrett
Thanks I will have a look david On Mon, 18 Feb 2002, R P Herrold wrote: > On Mon, 18 Feb 2002, dbrett wrote: > > > Does anybody know of a program which will get https pages? > > htdump > > see: > >http://www.owlriver.com/projects/htdump/ > > wget m

Re: retrieving https was:Re: sending data from wget into perl?

2002-02-18 Thread dbrett
Thanks On Mon, 18 Feb 2002, Ed Wilts wrote: > On Mon, Feb 18, 2002 at 02:30:33PM -0600, dbrett wrote: > > Does anybody know of a program which will get https pages? > > curl > .../Ed > -- > Ed Wilts, Mounds View, MN, USA > mailto:[EMAIL PROTECTED]

Re: retrieving https was:Re: sending data from wget into perl?

2002-02-18 Thread Ed Wilts
On Mon, Feb 18, 2002 at 12:30:21PM -0800, David Talkington wrote: > > dbrett wrote: > > >Does anybody know of a program which will get https pages? > > curl will do it, according to the man page. Hey, Ed, thanks for the > tip; I'd never heard of that one! We use curl a fair bit at work to dr

Re: retrieving https was:Re: sending data from wget into perl?

2002-02-18 Thread R P Herrold
On Mon, 18 Feb 2002, dbrett wrote: > Does anybody know of a program which will get https pages? htdump see: http://www.owlriver.com/projects/htdump/ wget may be coaxed into it as well, but it is trickier. -- Russ Herrold

Re: retrieving https was:Re: sending data from wget into perl?

2002-02-18 Thread Jason Costomiris
On Mon, Feb 18, 2002 at 02:30:33PM -0600, dbrett wrote: : Does anybody know of a program which will get https pages? Yes, wget. At least the recent versions. Go to rawhide if you have to. -- Jason Costomiris <>< | Technologist, geek, human. jcostom {at} jasons {dot} org

Re: retrieving https was:Re: sending data from wget into perl?

2002-02-18 Thread Ed Wilts
On Mon, Feb 18, 2002 at 02:30:33PM -0600, dbrett wrote: > Does anybody know of a program which will get https pages? curl .../Ed -- Ed Wilts, Mounds View, MN, USA mailto:[EMAIL PROTECTED]

Re: retrieving https was:Re: sending data from wget into perl?

2002-02-18 Thread David Talkington
dbrett wrote: >Does anybody know of a program which will get https pages? curl will do it, according to the man page. Hey, Ed, thanks for the tip; I'd never heard of that one! -d -- David Talkington PGP key: http://www.prairienet.org/~dta

retrieving https was:Re: sending data from wget into perl?

2002-02-18 Thread dbrett
> >> > >>that's it > >>i thought something like > >> wget www.site.com/index.html | perlscript.pl > >>would work > >>but no > >>instead it just downloaded the index.html file and exited > > > >Silly goose. That's becaus
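The answers collected in this thread (curl, htdump, SSL-enabled wget builds) reduce to sketches like these; the URL is a placeholder:

```shell
# curl writes the page to stdout by default; -O saves it under
# the remote file name instead.
curl -O https://www.example.com/page.html

# wget versions built with SSL support can fetch https directly.
wget https://www.example.com/page.html
```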

Re: sending data from wget into perl?

2002-02-18 Thread Ed Wilts
On Mon, Feb 18, 2002 at 11:19:26AM -0800, daniel wrote: > here's what i want to do > > 1. grab a page off the web > 2. process it with a perl script curl http://localhost | less Works for me. Replace less with your perl script. .../Ed -- Ed Wilts, Mounds View, MN, USA mailto:[EMAIL PR

Re: sending data from wget into perl?

2002-02-18 Thread David Talkington
David Talkington wrote: >>1. grab a page off the web >>2. process it with a perl script >> >>that's it >>i thought something like >> wget www.site.com/index.html | perlscript.pl >>would work >>bu

Re: sending data from wget into perl?

2002-02-18 Thread David Talkington
daniel wrote: >1. grab a page off the web >2. process it with a perl script > >that's it >i thought something like > wget www.site.com/index.html | perlscript.pl >would work >but no >instead it just downloaded the i

sending data from wget into perl?

2002-02-18 Thread daniel
i'm not sure if this is off topic or not but i think that's my problem... here's what i want to do 1. grab a page off the web 2. process it with a perl script that's it i thought something like wget www.site.com/index.html | perlscript.pl would work but no instead it

Mirroring tool (like wget) with throttling?

2001-02-06 Thread Jonathan Wilson
Hey, I've been using wget to mirror SuSE and Red Hat updates. It works very well, but since we're on ISDN I can only run it when no one else is around. I read its man page and it doesn't look like it has any kind of throttling. I've been running it with the -m swi
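For the record, wget did grow throttling later: the --limit-rate option appeared around wget 1.8, which postdates this message, so upgrading may be the answer. A sketch with a placeholder mirror URL:

```shell
# Mirror at no more than 20 KB/s (20k = 20480 bytes/s) so the
# ISDN line stays usable for everyone else; -m enables mirroring.
wget --limit-rate=20k -m ftp://updates.example.com/pub/
```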

Re: wget question (was Re: how to schedule ftp-transfer?)

2000-08-15 Thread John Aldrich
On Tue, 15 Aug 2000, you wrote: > Maybe the ncftpget command would work better for this? > Yep. Probably would, based on a cursory examination of the man page. :-) Looks like there's a bunch of different ways to do it. :-) John

Re: wget question (was Re: how to schedule ftp-transfer?)

2000-08-15 Thread Ken Kirchner
Maybe the ncftpget command would work better for this? On Tue, 15 Aug 2000, John Aldrich wrote: > On Tue, 15 Aug 2000, you wrote: > > On a related note, is there any way to make wget use passive mode when > > getting with ftp://. wget seems to work fine when used with http:

Re: wget question (was Re: how to schedule ftp-transfer?)

2000-08-15 Thread John Aldrich
On Tue, 15 Aug 2000, you wrote: > On a related note, is there any way to make wget use passive mode when > getting with ftp://. wget seems to work fine when used with http://, > but I'm behind a firewall and it doesn't work when using the ftp > prefix. I suspect it'

wget question (was Re: how to schedule ftp-transfer?)

2000-08-15 Thread Dave Reed
On a related note, is there any way to make wget use passive mode when getting with ftp://. wget seems to work fine when used with http://, but I'm behind a firewall and it doesn't work when using the ftp prefix. I suspect it's because wget uses active mode for the ftp transfers
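wget can be told to use passive FTP either per invocation or permanently; a sketch with placeholder host and path:

```shell
# Per invocation:
wget --passive-ftp 'ftp://ftp.example.com/pub/file.tar.gz'

# Or permanently, by adding this line to ~/.wgetrc:
#   passive_ftp = on
```

Passive mode makes the client open the data connection, which is usually what a firewall between you and the server requires.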

Re: wget

2000-06-16 Thread David Talkington
Missed that ... thank you. -d -- David Talkington Community Networking Initiative [EMAIL PROTECTED] 244-1962 Peter Blomgren wrote: :David, : :> wget is driving me batty. Why does the following command: :> :> wget -r -l 2 \ :> http://www.guug.de/~winni/linux/CD-Writing/html/

Re: wget

2000-06-16 Thread Vidiot
>wget is driving me batty. Why does the following command: > >wget -r -l 2 \ >http://www.guug.de/~winni/linux/CD-Writing/html/ > >cause wget to begin retrieving files at the TOP level of that tree, >instead of in the directory which I have specified? The only option >th

Re: wget

2000-06-16 Thread Peter Blomgren
David, > wget is driving me batty. Why does the following command: > > wget -r -l 2 \ > http://www.guug.de/~winni/linux/CD-Writing/html/ > > cause wget to begin retrieving files at the TOP level of that tree, > instead of in the directory which I have specified? T

wget

2000-06-15 Thread David Talkington
Howdy. wget is driving me batty. Why does the following command: wget -r -l 2 \ http://www.guug.de/~winni/linux/CD-Writing/html/ cause wget to begin retrieving files at the TOP level of that tree, instead of in the directory which I have specified? The only option that sounded like it
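The flag that usually stops wget climbing above the starting directory is -np (--no-parent), whether or not that is exactly what the replies here pointed out; applied to the command above:

```shell
# -np keeps the recursion inside .../CD-Writing/html/ instead of
# letting it wander up to the top of the site.
wget -r -l 2 -np \
  http://www.guug.de/~winni/linux/CD-Writing/html/
```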

Re: Wget

1999-12-20 Thread Tom Gilbert
* SoloCDM ([EMAIL PROTECTED]) wrote: > I've tried to force wget to follow links in a menu (the type you click > and find a whole list of other sources/links/options) to no avail. It > completely acts as if nothing exists in the menu and as if the menu > does not exist. The follo

Wget

1999-12-20 Thread SoloCDM
I've tried to force wget to follow links in a menu (the type you click and find a whole list of other sources/links/options) to no avail. It completely acts as if nothing exists in the menu and as if the menu does not exist. The following depicts what I'm using with wget: wget -