On Sun, Oct 27, 2002 at 07:43:02PM -0400, Oleg wrote:
> Colin Watson wrote:
> > You get most of the speed increases by recompiling a very small
> > number of things.
>
> This is true for applications, in the sense that "you get most of
> the speed increase by optimizing small parts of the program". For
> something like Debian, however, you can't possibly know in advance
> where the users' bottleneck will happen to be. BTW, Gentoo users say
> their systems "feel" a lot faster overall.

That's probably because they're using gcc 3.2 and ELF prelinking
already (at a guess). This is coming to Debian, but requires more
transitional work first; there's a sketch of what prelinking involves
at the end of this mail.

Anyway, "feel" doesn't wash. Every time this comes up, the answer is
to request a real benchmark; to my knowledge the only time anyone has
ever provided one is in the case of openssl, which nowadays in
unstable has versions optimized for a number of processors. (A quick
way to run that benchmark yourself is sketched below.)

Bottom line: when you're talking about adding 10Gb or so to the
archive size, it's a trade-off between helping users and not pissing
off mirrors. If you forget about the latter, you'll quickly lose the
former too, especially when there are ways to help users without
imposing that 10Gb hit.

> As to compiling from deb sources (someone else mentioned it in this
> thread), the one big inconvenience is that "apt-get upgrade" will
> overwrite your optimized program as soon as its next [sub]version is
> available.

There are plenty of well-documented ways round that, depending on
exactly what behaviour you want; the simplest is to put the package
on hold (see below). There's also apt-src/apt-build to help you
manage it automatically.
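On prelinking: roughly what it looks like on a system where the
prelink package is available. A sketch only; the flags are as
documented in prelink(8), and none of this is Debian policy yet:

    # prelink everything listed in /etc/prelink.conf:
    #   -a  all binaries and libraries named in the config
    #   -m  conserve virtual address space by letting libraries
    #       that never appear together share address ranges
    #   -R  randomize the chosen base addresses
    prelink -amR

    # and to undo the whole thing if anything misbehaves:
    prelink -ua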
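On benchmarks: openssl ships its own micro-benchmark, so this is one
of the few cases where you can put numbers on an "optimized" build
yourself. Run the same command under each build and compare:

    # built-in benchmark; pick whichever algorithms you care about:
    openssl speed rsa1024 md5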
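On protecting a locally rebuilt package (openssl below is only an
example name): a dpkg hold is the simplest route, and apt-build can
take over the fetch-and-rebuild cycle for you:

    # stop apt-get upgrade from replacing your local build:
    echo "openssl hold" | dpkg --set-selections

    # or, with the apt-build package installed, build and install
    # locally optimized packages from source:
    apt-build update
    apt-build install openssl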
--
Colin Watson                                       [[EMAIL PROTECTED]]