> No, the necessity to learn different commands (if someone wrote them) because 
> none of the standard Unix text tools is usable (on supposedly text files) 
> shows that it is a serious issue.[...]
> Beagle is not useful for searching[...]
> I don't understand what's good on complicating of reading the files.[...]
> And please don't try to argue PNG files are compressed internally[...]

This is twisting the issue to your liking and ignoring the arguments I'm
making: the fact is that Unix commands are meant to be combined, and *no*
you can't expect "grep -r" to search every possible format, not even
formats that carry text such as PDF or MS Word; heck, you can't even grep
HTML files for anything other than short ASCII words.
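For the gzip case specifically, the gzip package already ships wrappers so the usual text pipelines keep working; a minimal sketch (the file names are invented for the demo):

```shell
# zgrep is the gzip-aware wrapper around grep that ships with gzip itself;
# it keeps the "search in text files" workflow usable on .gz files.
printf 'searchable compressed text\n' > /tmp/zgrep-demo.txt
gzip -f /tmp/zgrep-demo.txt                   # leaves /tmp/zgrep-demo.txt.gz
zgrep -l 'compressed' /tmp/zgrep-demo.txt.gz  # prints the matching file name
```

zcat, zless and zdiff cover the other common text-tool uses in the same way.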

Stop thinking that the only constraint on the files of a system is to be
able to "grep -r" them.  I explained why the space savings are an
advantage in some cases and that we try to adapt the tools to handle
these.  By your logic, we wouldn't use tar files because they hide
individual files from standard commands such as cp and wget.
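tar illustrates the same point: the archive hides its members from cp and grep, yet its own listing and extract-to-stdout modes restore the pipeline model (paths below are made up for the demo):

```shell
# Build a tiny archive, then search a member without unpacking it to disk.
mkdir -p /tmp/tar-demo
printf 'needle\n' > /tmp/tar-demo/a.txt
tar -C /tmp -cf /tmp/tar-demo.tar tar-demo
tar -tf /tmp/tar-demo.tar                         # list the members
tar -xOf /tmp/tar-demo.tar tar-demo/a.txt | grep needle   # grep via stdout
```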

The reason I mentioned Beagle or PNG is that we are not living in a
plain-text world; even index.sgml files aren't plain text: they are SGML;
try searching for "Loïc" when it's spelled as "Lo&iuml;c" or some other
entity encoding.  We're living in a format-in-format-in-format world
(make that recursive).

>> Why HTML files aren't compressed?
> So, why they aren't?

I see you insist on getting an answer as if I hadn't replied already: some
HTML files are compressed, as "dpkg -S html.gz" will show you; there's no
need to compress them in the http:// case, which is why you don't see them
compressed on the web; I already answered that HTTP compresses the
content in transit.
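What HTTP does on the wire can be mimicked locally: when the client advertises "Accept-Encoding: gzip", the server sends gzipped bytes and the client inflates them transparently. A miniature sketch of that exchange (file names invented for the demo):

```shell
# "Server side": compress the page; "client side": inflate it on receipt.
# The user never sees the compressed form, just as in a browser.
printf '<html><body>hello</body></html>\n' > /tmp/http-demo.html
gzip -c /tmp/http-demo.html > /tmp/http-demo.html.gz   # what goes on the wire
zcat /tmp/http-demo.html.gz                            # what the client shows
```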

> evince /usr/share/doc/tetex-doc/programs/dvips.pdf.gz

I'm sorry it doesn't work for you under Ubuntu; Debian fixed the list of
MIME types listed in the evince.desktop files, and it works fine under
Debian.  I'm sure this will reach Ubuntu soon.

> After all the years of experience with compressed documentation in Debian no 
> one can convince me it makes sense.
> And conversely, Debian people seem to be incapable of admitting any 
> fundamental problems.

This is because you don't want to read the actual arguments people are
making: saving space still matters, for example on live CDs, on embedded
systems (where one still has to ship some files, such as copyrights), or
simply when emailing files; zipping also provides a speed advantage when
the CPU is mostly idle while the disk is the bottleneck.  I already wrote
this, but you simply ignore these advantages.
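The space point is easy to check on any repetitive, documentation-like text (exact sizes will vary; the content below is made up purely for illustration):

```shell
# Generate man-page-like text and compare on-disk sizes before/after gzip.
seq 1 5000 | sed 's/^/documentation line /' > /tmp/size-demo.txt
gzip -c /tmp/size-demo.txt > /tmp/size-demo.txt.gz
orig=$(wc -c < /tmp/size-demo.txt)
comp=$(wc -c < /tmp/size-demo.txt.gz)
echo "original=$orig bytes, compressed=$comp bytes"   # compressed is far smaller
```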

Now, as I already said, I can live with the fact that the inventors of
gtk-doc index.sgml files do not want to require tools to support zipped
files, and we can diverge from the default of zipping everything to
explicitly exclude these files (even if I think that's not the best
solution in the interest of Debian and our users).  But please do not
argue that "no one can convince me it makes sense" or that "Debian people
seem to be incapable of admitting any fundamental problems".  I think I
showed above that there are valid use cases for compression, that it is
omnipresent (in protocols, in file formats, and in file systems, as you
noted), and that I'm willing to try to avoid compressing gtk-doc
index.sgml files if that is the desire of gtk-doc upstream.

(This is especially true since issues with index.sgml.gz files were
never reported to Debian.)

-- 
gtkdoc-fixxref broken by compressed documentation
https://bugs.launchpad.net/bugs/77138
You received this bug notification because you are a member of Ubuntu
Bugs, which is the bug contact for Ubuntu.

-- 
ubuntu-bugs mailing list
ubuntu-bugs@lists.ubuntu.com
https://lists.ubuntu.com/mailman/listinfo/ubuntu-bugs