This one time, at band camp, Michael Gilbert said:
> is it possible to ignore greater compression ratios for larger
> archives?  Larger archives are validly more compressible than smaller
> archives because the more bits you have, the more potential there is
> for duplication and hence compression.
At the moment, no.  I am also not sure it would be the right thing to
do - see below.

> is the compression ratio simply computed by dividing the archive size
> by the uncompressed size?

Almost: it is computed on a file-by-file basis within the archive, but
otherwise you are correct.  So one hugely compressed file within an
archive will trigger the test.  It should be noted that this is on
purpose - slipping a hyper-compressed file into an otherwise
normal-looking archive is exactly how many archive bombs work.

Since the test is per-file, I am not sure that the compression ratio
being measured will change significantly with larger archives
(although I am amenable to being shown the error of my thinking).

Take care,
-- 
 -----------------------------------------------------------------
|  ,''`.                                            Stephen Gran |
| : :' :                                        [EMAIL PROTECTED] |
| `. `'                         Debian user, admin, and developer |
|   `-                                      http://www.debian.org |
 -----------------------------------------------------------------
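The per-file check described above can be sketched roughly as follows.  This is a minimal illustration, not the actual tool's code: the `MAX_RATIO` cutoff of 100 and the function name `suspicious_members` are assumptions for the example, and it only handles zip archives.

```python
import io
import zipfile

# Hypothetical threshold: flag any member whose uncompressed size is
# more than 100x its compressed size.  The real tool's cutoff is not
# stated in the thread.
MAX_RATIO = 100

def suspicious_members(data: bytes) -> list[str]:
    """Return names of archive members with an extreme compression ratio.

    The check is per-file, as described above: one hyper-compressed
    member is enough to flag the archive, regardless of how large or
    small the archive as a whole is.
    """
    flagged = []
    with zipfile.ZipFile(io.BytesIO(data)) as zf:
        for info in zf.infolist():
            if info.compress_size > 0:
                ratio = info.file_size / info.compress_size
                if ratio > MAX_RATIO:
                    flagged.append(info.filename)
    return flagged

# Build a toy "bomb-like" archive: one normal member plus one member
# that is highly compressible (10 MB of zero bytes).
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
    zf.writestr("normal.txt", b"hello world")
    zf.writestr("bomb.bin", b"\x00" * 10_000_000)

print(suspicious_members(buf.getvalue()))  # only bomb.bin is flagged
```

Note that the small `normal.txt` member passes untouched, which is the point of the per-file approach: the overall archive ratio can look unremarkable while a single member is wildly out of line.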