Hi, I have tried to develop an httpfs translator for the Hurd; the code is attached. I am a beginner to the Hurd, so I don't know to what extent it rises to the expectations of the Hurd geeks.
It depends on the libxml2 library, which is used to parse the HTML stream returned by the server: the anchor tags are extracted to obtain the files and directories stored on the web server (a rough sketch of that step is at the end of this mail). By the way, I haven't decided yet how to get the size of the files on the web server; the HTTP RFC enumerates four or five methods for finding the content length, but I am still undecided on how to get it reliably (one of those methods is also sketched at the end).

It works like this:

  hurd~# settrans -a tmp/ /hurd/httpfs http.us.debian.org/
  hurd~# cd tmp/
  hurd~/tmp# ls -l
  hurd~/tmp# cp ...

cp can be used to download the files.

The translator node can be made either a file or a directory; the --mode option of httpfs sets that, and the default is a directory. If it is set to a file, it supports only file system read requests.

By the way, lynx, tar, etc. are not working properly on this file system, and it is the same for ftpfs. I think it is because the reading is done in chunks by multiple calls to netfs_attempt_read(), which lynx, tar, etc. do not cope with. Can that be fixed?

Also, if it is good enough ;) can it be put in the Hurd CVS repository?

Thanks,
Arun.
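P.S. Here is a rough sketch (not code from the attached tarball) of the libxml2 parsing step described above: parse the HTML index page, walk the tree, and collect the href attribute of every anchor tag. The names parse_index, collect_links and add_entry are made up for the example, and it assumes a libxml2 version that provides htmlReadMemory().

#include <stdio.h>
#include <libxml/HTMLparser.h>
#include <libxml/tree.h>

/* Placeholder for whatever the translator does with a directory entry.  */
static void
add_entry (const char *name)
{
  printf ("entry: %s\n", name);
}

/* Walk the parsed tree and report the href of every <a> element.  */
static void
collect_links (xmlNode *node)
{
  xmlNode *cur;

  for (cur = node; cur; cur = cur->next)
    {
      if (cur->type == XML_ELEMENT_NODE
          && !xmlStrcasecmp (cur->name, BAD_CAST "a"))
        {
          xmlChar *href = xmlGetProp (cur, BAD_CAST "href");
          if (href)
            {
              add_entry ((const char *) href);
              xmlFree (href);
            }
        }
      collect_links (cur->children);
    }
}

/* BUF/LEN hold the HTML stream already read from the HTTP connection;
   URL is the base URL handed to libxml2.  */
void
parse_index (const char *buf, int len, const char *url)
{
  htmlDocPtr doc = htmlReadMemory (buf, len, url, NULL,
                                   HTML_PARSE_NOERROR | HTML_PARSE_NOWARNING);
  if (doc == NULL)
    return;
  collect_links (xmlDocGetRootElement (doc));
  xmlFreeDoc (doc);
}

The HTML parser is used rather than the XML one because server-generated index pages are rarely well-formed XML, and the whole tree is walked because the anchors are not always direct children of <body>.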
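P.P.S. And here is one of the content-length methods from the RFC, again only a rough, hypothetical sketch and not code from the tarball: send a HEAD request and read the Content-Length header when the server supplies one (it will not always send one, e.g. for dynamically generated or chunked replies). The function name head_content_length is made up, error handling is minimal, and a real version should match the header name case-insensitively.

#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <unistd.h>
#include <sys/socket.h>
#include <netinet/in.h>
#include <netdb.h>

/* Send "HEAD <path>" to <host> on port 80 and return the value of the
   Content-Length header, or -1 if the server does not send one.  */
long
head_content_length (const char *host, const char *path)
{
  struct hostent *he = gethostbyname (host);
  struct sockaddr_in addr;
  char request[1024], reply[4096];
  char *p;
  int fd, n, len = 0;

  if (he == NULL)
    return -1;

  fd = socket (PF_INET, SOCK_STREAM, 0);
  if (fd < 0)
    return -1;

  memset (&addr, 0, sizeof addr);
  addr.sin_family = AF_INET;
  addr.sin_port = htons (80);
  memcpy (&addr.sin_addr, he->h_addr_list[0], he->h_length);

  if (connect (fd, (struct sockaddr *) &addr, sizeof addr) < 0)
    {
      close (fd);
      return -1;
    }

  snprintf (request, sizeof request,
            "HEAD %s HTTP/1.0\r\nHost: %s\r\n\r\n", path, host);
  write (fd, request, strlen (request));

  /* A HEAD reply carries no body, so just read the headers.  */
  while (len < (int) sizeof reply - 1
         && (n = read (fd, reply + len, sizeof reply - 1 - len)) > 0)
    len += n;
  reply[len] = '\0';
  close (fd);

  p = strstr (reply, "Content-Length:");
  if (p == NULL)
    return -1;
  return atol (p + strlen ("Content-Length:"));
}

Whether HEAD alone is reliable enough is exactly what I have not decided yet.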
Attachment: httpfs.tar.gz