Re: ***SPAM*** Re: [dev] suckless wget/curl

2011-05-31 Thread Ethan Grammatikidis
On 29 May 2011, at 12:56 am, Dave Reisner wrote:
> I'm sure I'll get hated for this, but it's entirely possible to read
> chunked transfer with shell. I wrote a stupid proof of concept AUR agent
> in bash which handles keep-alives and chunked/gzip encoded data -- the
> only other dependencies are dd
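
Decoding a chunked body in shell boils down to reading the hex size line and handing that many bytes to dd. A minimal sketch of the idea, assuming bash and reading from stdin with the response headers already consumed (this is not Dave's AUR agent; the function name and details are made up here):

read_chunked() {
    # Each chunk is: a hex size line, CRLF, <size> bytes of data, CRLF.
    # A zero-size chunk marks the end of the body.
    while IFS= read -r line; do
        line=${line%$'\r'}                 # drop the trailing CR
        size=$((16#${line%%;*}))           # size is hex; ignore chunk extensions
        [ "$size" -eq 0 ] && break
        dd bs=1 count="$size" 2>/dev/null  # copy exactly $size bytes to stdout
        IFS= read -r _                     # eat the CRLF after the chunk data
    done
}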

Re: [dev] suckless wget/curl

2011-05-29 Thread Eckehard Berns
> [...] but it's entirely possible to read chunked transfer with shell.
> I wrote a stupid proof of concept AUR agent in bash which handles
> keep-alives and chunked/gzip encoded data -- the only other
> dependencies are dd and gzip, and optionally yajl (for formatting
> jsons).
Cool, forgot about

Re: ***SPAM*** Re: [dev] suckless wget/curl

2011-05-28 Thread Dave Reisner
On Sat, May 28, 2011 at 01:54:44AM +0200, Eckehard Berns wrote:
> > Except [...] it doesn't work for every site, it seems. I'll probably
> > work out why when I have more time.
>
> Might be chunked transfer encoding. At least that was the first thing
> that came to mind looking at your code.
>
>

Re: [dev] suckless wget/curl

2011-05-27 Thread Connor Lane Smith
If you're a fan of Plan 9, there's hget[1], which handles HTTP and FTP. It's not in 9base, since it requires a couple of other libraries, but you can get it in P9P. Alternatively you can grab a copy of Federico's Abaco[2] web browser, which bundles webfs and lets you mount the web as a 9P file syst
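
If plan9port is installed, hget writes the fetched document to standard output, so (as far as I recall; check the man page) a transfer is just:

  hget http://example.com/index.html > index.html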

Re: [dev] suckless wget/curl

2011-05-27 Thread Anthony J. Bentley
On Fri, May 27, 2011 at 3:49 PM, ilf wrote:
> Since I can't be the first to realize that, is there already a suckless
> alternative for simple HTTP/FTP data transfer?
NetSurf uses curl, but wants to get rid of it. It might be worth: a) looking at their plans for a fetch implementation, or b) if t

***SPAM*** Re: [dev] suckless wget/curl

2011-05-27 Thread Eckehard Berns
> Except [...] it doesn't work for every site, it seems. I'll probably
> work out why when I have more time.
Might be chunked transfer encoding. At least that was the first thing that came to mind looking at your code.
> Still, I thought it was cute.
Yes, I actually liked it. Chunked transfer en
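
For context, a chunked body carries its own framing: each chunk is a hexadecimal length line followed by that many bytes, and a zero-length chunk terminates the body. A hypothetical response whose body decodes to "Wikipedia":

  HTTP/1.1 200 OK
  Transfer-Encoding: chunked

  4
  Wiki
  5
  pedia
  0

This is why a script that only strips the headers ends up printing the length lines into its output.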

Re: [dev] suckless wget/curl

2011-05-27 Thread zilog
> Since I can't be the first to realize that, is there already a
> suckless alternative for simple HTTP/FTP data transfer?
I find snarf very handy for http/ftp/gopher transfers. Maybe it sucks less.
http://www.xach.com/snarf/
jgw

Re: [dev] suckless wget/curl

2011-05-27 Thread Connor Lane Smith
On 28 May 2011 00:34, Connor Lane Smith wrote:
> Here's one I just hacked together for fun. It uses netcat. It
> understands redirects, but that's it.
Except I made a typo on line 10 -- 's/(/|(/' -- and it doesn't work for every site, it seems. I'll probably work out why when I have more time. St

Re: [dev] suckless wget/curl

2011-05-27 Thread Connor Lane Smith
Hey,
Here's one I just hacked together for fun. It uses netcat. It understands redirects, but that's it.
-8<-
#!/bin/sh
if test $# -ne 1; then
    echo "usage: $0 url" >&2
    exit 1
fi
wget (){
    url="$(echo "$1" | sed 's/^http:\/\///')"
    host="$(echo "$url" | sed 's/\/.*//')"
    path="$(ec
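
Since the preview cuts the script off, here is a minimal sketch of the same idea, not Connor's actual code: it requests HTTP/1.0 so the server won't send a chunked body, and it does no redirect handling.

#!/bin/sh
# Fetch one URL over plain HTTP with netcat and print the body to stdout.
if test $# -ne 1; then
    echo "usage: $0 url" >&2
    exit 1
fi
url="$(echo "$1" | sed 's|^http://||')"
host="$(echo "$url" | sed 's|/.*||')"
path="$(echo "$url" | sed 's|^[^/]*||')"
path="${path:-/}"                      # default to / when the URL has no path
printf 'GET %s HTTP/1.0\r\nHost: %s\r\n\r\n' "$path" "$host" |
    nc "$host" 80 |
    tr -d '\r' | sed '1,/^$/d'         # strip headers (also strips CRs, so binary bodies get mangled)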

Re: [dev] suckless wget/curl

2011-05-27 Thread Rob
Here's one I wrote: http://github.com/jeffwar/wgetlite
26K without debugging symbols; unfortunately it doesn't statically link yet (getaddrinfo), but it'll be pretty trivial to sort it.

Re: [dev] suckless wget/curl

2011-05-27 Thread ilf
On 05-28 00:00, u...@netbeisser.de wrote:
> why not use standard tools?
14K /usr/bin/GET
70K /usr/bin/ftp
/usr/bin/GET: symbolic link to `lwp-request'
/usr/bin/lwp-request: a /usr/bin/perl -w script text executable
--
ilf
Over 80 million Germans don't use a console. Don't click away
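
For what it's worth, lwp-request's aliases take the URL directly and print the response body to stdout, so a fetch is simply (if memory serves):

  GET http://example.com/ > index.html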

Re: [dev] suckless wget/curl

2011-05-27 Thread Jakub Lach
> I think we can safely agree that both of these suck:
>
> -rwxr-xr-x 1 root root 123K 2011-01-26 20:11 /usr/bin/curl*
> -rwxr-xr-x 1 root root 346K 2011-02-20 12:05 /usr/bin/wget*
>
> Since I can't be the first to realize that, is there already a suckless
> alternative for simple HTTP/FTP data

Re: [dev] suckless wget/curl

2011-05-27 Thread u
Hi,
On Fri, May 27, 2011 at 11:49:50PM +0200, ilf wrote:
> I think we can safely agree that both of these suck:
>
> -rwxr-xr-x 1 root root 123K 2011-01-26 20:11 /usr/bin/curl*
> -rwxr-xr-x 1 root root 346K 2011-02-20 12:05 /usr/bin/wget*
>
> Since I can't be the first to realize that, is there a

Re: [dev] suckless wget/curl

2011-05-27 Thread Stanislav Paskalev
I tend to use axel - http://axel.alioth.debian.org/
Regards,
Stanislav Paskalev

[dev] suckless wget/curl

2011-05-27 Thread ilf
I think we can safely agree that both of these suck:
-rwxr-xr-x 1 root root 123K 2011-01-26 20:11 /usr/bin/curl*
-rwxr-xr-x 1 root root 346K 2011-02-20 12:05 /usr/bin/wget*
Since I can't be the first to realize that, is there already a suckless alternative for simple HTTP/FTP data transfer?
-