On Fri, Apr 13, 2007 at 01:09:50PM -0300, Henrique de Moraes Holschuh wrote:
> On Fri, 13 Apr 2007, Roberto C. Sánchez wrote:
> > On Fri, Apr 13, 2007 at 09:12:33AM +0200, Josselin Mouette wrote:
> > > On Thursday, 12 April 2007 at 21:15 +0200, Robert Millan wrote:
> > > > I think compression ratio is better than speed in most cases. With
> > > > better compressed packages we save archive space, users save a lot
> > > > of bandwidth, and the first CD/DVD can hold more stuff. That's
> > > > important too.
> > >
> > > You wouldn't say that if you had a Via C3 with 10 Mbit bandwidth.
> > >
> > Which is by far a minority situation. You are much more likely to end
> > up with someone on a 384k or 512k DSL (or even slower ISDN link) with
> > an opteron, xeon, athlon64 or the like. I'm not saying that your
> > situation is not possible, simply that trading size for
> > compression/decompression time would benefit far more people than it
> > would "hurt."
>
> You know, make it intelligent enough, and you can have per-arch settings
> of what compression to use. gzip for arm, lzma for amd64, and source,
> etc.
>
> The dak suite, and dpkg, certainly won't care. It would just work.
>
Cool.
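Just to make the idea concrete, here is a minimal sketch of how a per-arch
choice could be expressed from the package side (purely hypothetical; it
assumes dpkg-deb's -Z option is available and that the chosen compressor is
accepted by the dpkg on the buildds):

    # Hypothetical debian/rules fragment: pick the .deb compressor
    # per architecture.  Assumes dpkg-deb understands -Z<compressor>.
    DEB_HOST_ARCH ?= $(shell dpkg-architecture -qDEB_HOST_ARCH)

    ifeq ($(DEB_HOST_ARCH),arm)
    COMPRESS = gzip
    else
    COMPRESS = lzma
    endif

    binary-arch: build
            dpkg-deb -Z$(COMPRESS) --build debian/tmp ..

Of course, if dak re-compressed packages on the archive side instead, none
of this would need to live in individual packages at all.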
Out of curiosity, how would source packages be handled? Would you allow
multiple source upload formats or mandate the "best"? Also, if the uploader
uploads a "wrong" format for a binary upload, would the archive repackage it
or would it reject the upload?

Regards,

-Roberto

--
Roberto C. Sánchez
http://people.connexer.com/~roberto
http://www.connexer.com