On 29 May 2011, at 12:56 am, Dave Reisner wrote:
> I'm sure I'll get hated for this, but it's entirely possible to read
> chunked transfer with shell. I wrote a stupid proof of concept AUR
> agent in bash which handles keep-alives and chunked/gzip encoded data
> -- the only other dependencies are dd and gzip, and optionally yajl
> (for formatting jsons).
Cool, forgot about [...]
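For anyone wondering what reading chunked transfer encoding in shell actually
involves: the body arrives as a series of chunks, each prefixed by its length
in hex on a CRLF-terminated line, and a zero-length chunk marks the end. A
minimal sketch of a decoder, assuming bash and dd (the function name and the
details here are mine, not lifted from Dave's AUR agent), reading a response
whose headers have already been consumed from stdin:

read_chunked() {
	while read -r size; do
		size=${size%%;*}                  # ignore any chunk extension
		size=${size%$'\r'}                # strip the trailing CR
		size=$((16#$size))                # chunk sizes are hexadecimal
		[ "$size" -eq 0 ] && break        # a zero-length chunk ends the body
		dd bs=1 count="$size" 2>/dev/null # copy exactly $size body bytes
		read -r junk                      # swallow the CRLF after the chunk
	done                                  # (any trailers are simply ignored)
}

Pipe the output through gzip -dc when the server also compressed the body;
bash's /dev/tcp/<host>/80 redirection is enough to open the socket without
even needing netcat.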
On Sat, May 28, 2011 at 01:54:44AM +0200, Eckehard Berns wrote:
> > Except [...] it doesn't work for every site, it seems. I'll probably
> > work out why when I have more time.
>
> Might be chunked transfer encoding. At least that was the first thing
> that came to mind looking at your code.
[...]
If you're a fan of Plan 9, there's hget[1], which handles HTTP and
FTP. It's not in 9base, since it requires a couple of other libraries,
but you can get it in P9P. Alternatively you can grab a copy of
Federico's Abaco[2] web browser, which bundles webfs and lets you
mount the web as a 9P file system.
On Fri, May 27, 2011 at 3:49 PM, ilf wrote:
> Since I can't be the first to realize that, is there already a suckless
> alternative for simple HTTP/FTP data transfer?
NetSurf uses curl, but wants to get rid of it. It might be worth:
a) looking at their plans for a fetch implementation, or
b) if [...]
> Except [...] it doesn't work for every site, it seems. I'll probably
> work out why when I have more time.
Might be chunked transfer encoding. At least that was the first thing
that came to mind looking at your code.
> Still, I thought it was cute.
Yes, I actually liked it. Chunked transfer encoding [...]
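Worth spelling out why that breaks a naive fetcher, and the easy way around
it: a server may only send a chunked body when the client asked for HTTP/1.1,
so a script that cannot decode chunks can simply request HTTP/1.0 instead. A
rough illustration with nc(1), using example.com as a stand-in host:

# an HTTP/1.0 request: the server must not reply with Transfer-Encoding:
# chunked, so once the headers are stripped the body reads straight through
printf 'GET / HTTP/1.0\r\nHost: example.com\r\nConnection: close\r\n\r\n' |
	nc example.com 80 | sed '1,/^[[:space:]]*$/d'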
> Since I can't be the first to realize that, is there already a
> suckless alternative for simple HTTP/FTP data transfer?
I find snarf very handy for http/ftp/gopher transfers.
Maybe it sucks less.
http://www.xach.com/snarf/
jgw
On 28 May 2011 00:34, Connor Lane Smith wrote:
> Here's one I just hacked together for fun. It uses netcat. It
> understands redirects, but that's it.
Except I made a typo on line 10 -- 's/(/|(/' -- and it doesn't work
for every site, it seems. I'll probably work out why when I have more
time. Still, I thought it was cute.
Hey,
Here's one I just hacked together for fun. It uses netcat. It
understands redirects, but that's it.
-8<-
#!/bin/sh
if test $# -ne 1; then
	echo "usage: $0 url" >&2
	exit 1
fi

wget (){
	url="$(echo "$1" | sed 's/^http:\/\///')"
	host="$(echo "$url" | sed 's/\/.*//')"
	path="$(echo "$url" | sed "s/^$host//")"
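	# The archive preview cuts Connor's script off at this point.  What
	# follows is only a guessed-at sketch of how the fetch and the promised
	# redirect handling might continue (assuming nc(1), HTTP/1.0 and
	# absolute http:// Location values) -- it is not his original code.
	tmp="$(mktemp)"
	printf 'GET %s HTTP/1.0\r\nHost: %s\r\nConnection: close\r\n\r\n' \
		"${path:-/}" "$host" | nc "$host" 80 > "$tmp"
	# chase the redirect if the server sent a Location header,
	# otherwise strip the headers and print the body
	loc="$(sed -n '/^[[:space:]]*$/q; s/^[Ll]ocation:[[:space:]]*//p' "$tmp" | tr -d '\r')"
	if test -n "$loc"; then
		rm -f "$tmp"
		wget "$loc"
	else
		sed '1,/^[[:space:]]*$/d' "$tmp"
		rm -f "$tmp"
	fi
}
wget "$1"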
Here's one I wrote: http://github.com/jeffwar/wgetlite
26K without debugging symbols, unfortunately it doesn't statically link yet
(getaddrinfo), but it'll be pretty trivial to sort it.
On 05-28 00:00, u...@netbeisser.de wrote:
> why not use standard tools?
14K /usr/bin/GET
70K /usr/bin/ftp
/usr/bin/GET: symbolic link to `lwp-request'
/usr/bin/lwp-request: a /usr/bin/perl -w script text executable
--
ilf
Over 80 million Germans don't use a console. Don't click away.
I tend to use axel - http://axel.alioth.debian.org/
Regards,
Stanislav Paskalev
I think we can safely agree that both of these suck:
-rwxr-xr-x 1 root root 123K 2011-01-26 20:11 /usr/bin/curl*
-rwxr-xr-x 1 root root 346K 2011-02-20 12:05 /usr/bin/wget*
Since I can't be the first to realize that, is there already a suckless
alternative for simple HTTP/FTP data transfer?