Re: GCC 4.1: Buildable on GHz machines only?
Peter Barada <[EMAIL PROTECTED]> writes:

> Until package maintainers take cross-compilation *seriously*, I have no
> choice but to do native compilation of a large hunk of the packages on
> eval boards that can literally take *DAYS* to build.

And package maintainers will never take cross-compilation seriously even if they really want to, because they, for the most part, can't test it. Very few people who are not cross-compiling for specific reasons have any sort of cross-compilation setup available or even know how to start with one, and it's a sad fact of software development that anything that isn't regularly tested breaks. Most free software packages are doing well if they even have a basic test suite for core features, let alone something perceived as obscure like cross-compilation.

To really make cross-compilation work on a widespread basis would require a huge amount of effort in setting up automated test environments where package maintainers could try it out, along with a lot of help in debugging problems and providing patches. It seems unlikely to me that it's going to happen outside the handful of packages that are regularly used in cross-build environments and receive active, regular testing by people who are part of the development team (like gcc).

-- Russ Allbery ([EMAIL PROTECTED]) <http://www.eyrie.org/~eagle/>
Re: GCC 4.1: Buildable on GHz machines only?
Alexandre Oliva <[EMAIL PROTECTED]> writes:

> On May 16, 2005, Russ Allbery <[EMAIL PROTECTED]> wrote:
>> And package maintainers will never take cross-compilation seriously
>> even if they really want to because they, for the most part, can't test
>> it.
>
> configure --build=i686-pc-linux-gnu \
>     --host=i686-somethingelse-linux-gnu
>
> should be enough to exercise most of the cross-compilation issues, if
> you're using a sufficiently recent version of autoconf, but I believe
> you already knew that.

What, you mean my lovingly hacked-upon Autoconf 2.13 doesn't work? But I can't possibly upgrade; I rewrote all of the option handling in a macro!

Seriously, though, I think the above only tests things out to the degree that Autoconf would already be warning about no default specified for cross-compiling, yes? Wouldn't you have to at least cross-compile from a system with one endianness and int size to a system with a different endianness and int size, and then try to run the resulting binaries, to really see if the package would cross-compile? A scary number of packages, even ones that use Autoconf, bypass Autoconf completely when checking certain things or roll their own broken macros to do so.

> The most serious problem regarding cross compilation is that it's
> regarded as hard, so many people would rather not even bother to try to
> figure it out. So it indeed becomes a hard problem, because then you
> have to fix a lot of stuff in order to get it to work.

It's not just that it's perceived as hard. It's that it's perceived as hard *and* obscure. Speaking as the maintainer of a package that I'm pretty sure could be cross-compiled with some work, but that I'm also pretty sure wouldn't work just out of the box, I have never once gotten a single bug report, request, or report of anyone cross-compiling INN.

Given that, it's hard to care except in some abstract cleanliness sense (and I already got rid of all of the Autoconf warnings as best I could figure out, in the abstract-caring department).

-- Russ Allbery ([EMAIL PROTECTED]) <http://www.eyrie.org/~eagle/>
Re: Compiling GCC with g++: a report
Gabriel Dos Reis <[EMAIL PROTECTED]> writes: > Zack Weinberg <[EMAIL PROTECTED]> writes: > | (And I'd be less grumpy about coding to the intersection of C and C++ > | if someone coded up warnings for the C compiler to catch things > | outside the intersection.) > Consider that to be a follow-up that I'm willing to do, if these > preliminary patches are in. For sure, I do want to make sure that we > do not break things too easily. Even apart from this project and this discussion, this would be awesome to have, and I would be deeply grateful if you or someone else would implement this. Various people have requested over the years that some of the packages I maintain compile cleanly with a C++ compiler, and while I can test such compiles with special effort, being able to integrate the warnings about it into my normal make warnings build would be incredibly useful. -- Russ Allbery ([EMAIL PROTECTED]) <http://www.eyrie.org/~eagle/>
Re: Sine and Cosine Accuracy
Scott Robert Ladd <[EMAIL PROTECTED]> writes:

> Gabriel Dos Reis wrote:
>> Scott Robert Ladd <[EMAIL PROTECTED]> writes:
>> | Then, as someone else said, why doesn't the compiler enforce -ansi
>> | and/or -pedantic by default?
>>
>> Care submitting a patch?
>
> Would a strictly ANSI default be accepted on principle? Given the
> existing code base of non-standard code, such a change may be
> unrealistic.

-ansi introduces strict namespace support in C, which then introduces all sorts of portability issues and ends up being impractical for a lot of real-world, cross-platform code that already uses facilities like Autoconf, unless one wants to spend a lot of time tracking down issues that really don't improve the code quality. I used to try to use -ansi in warnings builds and gave up even on that. -pedantic is significantly more useful in practice than -ansi is.

It's really obnoxious to have to define some preprocessor variable just to be able to get an fdopen() prototype out of <stdio.h>, even if I can see how it would be theoretically useful.

-- Russ Allbery ([EMAIL PROTECTED]) <http://www.eyrie.org/~eagle/>
Re: Need GCC 3.3.6 PGP Signing Public Key
Gerald Pfeifer <[EMAIL PROTECTED]> writes: > For example, I could easily create a key for Gabriel Dos Reis > <[EMAIL PROTECTED]> and upload it to the key servers, or some evil hacker > could do something similar. And, in fact, people do; this is not just theoretical. There is an extra (unsigned) key for Russ Allbery <[EMAIL PROTECTED]> on the keyservers that I had nothing to do with. -- Russ Allbery ([EMAIL PROTECTED]) <http://www.eyrie.org/~eagle/>
Re: What is wrong with Bugzilla?
William Beebe <[EMAIL PROTECTED]> writes:

> OK, then let me explain it to you. The problem with the GCC Bugzilla
> reporting system is that it's a system that only other developers can
> tolerate, let alone love.

Setting aside for the moment that GCC is a software package *targeted* at developers, and hence the above is not necessarily a serious problem, I agree that the Bugzilla interface isn't exactly my favorite UI. However, I haven't figured out a better one either, so I don't have a firm platform on which to stand and complain. Bug reporting interfaces appear to be a hard problem.

> The entire GCC website (of which GCC Bugzilla is a part) could be the
> poster child for why developers should never be allowed to design user
> interfaces, especially web user interfaces.

Well, unless you have some user interface designers lined up and volunteering to help, this isn't really the most useful thing to say. GCC is a volunteer project; it uses the labor that it has available.

> You just need to be willing to put in the effort to look a little more
> professional and polished.

The people maintaining the GCC web site put a great deal of effort into it. If there is a problem, lack of effort isn't the cause of it. You seem to be arguing that the people maintaining the web site have the wrong skill set to do a good job at it. Personally, the site looks great to me, but then I'm a developer, so... :)

However, this is all just noise on a mailing list in the absence of someone with different ideas who is willing to do the work, just as with any other part of GCC. If you feel there is a better way to do the web site, propose patches, volunteer to help maintain it, and demonstrate why it's better. Just like with the rest of GCC. If you don't have time to do that, you could try to convince someone else to do it, or you could pay someone to do it. Just like with the rest of GCC.

In the absence of such a contribution, you (and the web site) are at the mercy of the people who *are* willing to put the effort into it. Personally, I think they're doing a great job. But maybe I just have a tin eye for web site design too -- it's certainly possible. I'm not prejudging your argument that the web site could be better, just saying that saying so on the mailing list isn't going to do anything towards changing it.

-- Russ Allbery ([EMAIL PROTECTED]) <http://www.eyrie.org/~eagle/>
Re: What is wrong with Bugzilla?
William Beebe <[EMAIL PROTECTED]> writes:

> Then I would like you to review and contrast GCC Bugzilla
> (http://gcc.gnu.org/bugzilla) with at least two others: Mozilla's
> (https://bugzilla.mozilla.org) and Redhat's
> (https://bugzilla.redhat.com/bugzilla/index.cgi). Mozilla's is a bit
> more organized than GCC's (but not much) and it is organized as a
> two-column page with a reasonably lucid, short and sweet explanation on
> the right. It shares the same ant picture with GCC's, which makes me
> wonder if that image isn't part of some core page that comes with the
> Bugzilla package.
>
> The best of the two is the Redhat page. Instead of lots of controls on
> the page, it has one to start with (search for a bug), with the more
> detailed (and powerful) options located at the top of the page as menu
> items. It's also good in that it has both expository information on the
> page as well as news that someone looking for bugs might want to read.

The Red Hat page is prettier, and I guess the GCC page could use some more orientation information, but they all feel roughly equal to me. (I actually prefer seeing clear links in the text of the page to the menu thing that Red Hat is doing.) But as previously mentioned, I'm not really the person you want reviewing this, most likely.

> And if bugzilla is not working out, or if you want some ideas on how to
> build better interfaces, there seem to be plenty of open bug tracking
> packages on Sourceforge. A quick search for bugzilla produces a nice
> long list, and at random I picked phpBugTracker
> (http://phpbt.sourceforge.net).

Well, the amount of work required to change bug tracking systems or build a new interface on top of Bugzilla is significant; if you're not planning on doing that work or paying someone to do it, it's fairly unlikely there will be any resources to do it. So far as I know, Bugzilla is working out fairly well from the perspective of the people working on GCC, which while not the whole story is at least as important as the bug reporting interface.

> And I understand and appreciate that. But when the UI heavy hitters
> aren't beating your doors down you either have to appeal to them in
> the community or else go and do what I do: look at what's out there
> and (re)use design elements.

Well, I don't *have* to do anything. GCC works great for what I want. But I think I understand what you're saying. GCC is using Bugzilla because someone not only got fed up with GNATS but volunteered to do all the work required to make the switch and keep things running afterwards.

> As I mentioned before, have you thought to ask for help from Redhat? If
> everybody looks to gcc as an important core tool, then perhaps those
> power users could help with the site. I would say to go and talk to
> Apple, that paragon of UI design, but I have no idea how Apple would
> react or if it would be a complete waste of time and energy.

There are Red Hat and Apple folks on this list. Maybe you can convince them to take such an idea to their companies. I have no idea. Whatever is done, it's very important that it be maintainable five years down the road. That's where single efforts often fail.

> You've pointed out the lack of bandwidth to improve it, and I am
> sympathetic (believe me, I really am). However, if someone makes a
> comment on the look and feel of the site then you should make the
> diplomatic equivalent to the comment "do you have a patch" when someone
> makes a comment about some "questionable" issue with the compiler.

I would generally agree, and that's basically what I'm trying to do here. However, it's also useful to point out to someone with a specific complaint how hard fixing that complaint might be. For example, if the report is "I want to link GCC as a library into my new IDE," people aren't going to just say "do you have a patch" without explaining why that's going to be hard to do. :)

> I think we all suffer from Tin Eye Site Design - TESD. But if we don't
> bring this issue up here, then where should it be brought up?

I'm not saying this is the wrong place to bring it up. It's the only place to bring it up, so far as I know. I just think it's one of those things that can't really be discussed well in negatives. I really appreciated your links above to the other sites that you think are better laid-out; that's positive and presents a particular improvement that can then be discussed. In general, though, I think it's going to take someone mocking something up and saying "here, I think this is better, what do other people think?"

-- Russ Allbery ([EMAIL PROTECTED]) <http://www.eyrie.org/~eagle/>
Re: What is wrong with Bugzilla?
R Hill <[EMAIL PROTECTED]> writes:

> I just wanted to speak up and say that the idea of alarm bells going off
> when people see a request for an email address from bugzilla is probably
> one of the sillier things I've read this week. Anyone lucid enough to
> be reporting a bug to an open source project like GCC realizes (i hope)
> in some form how the whole internet-thing works. If you request
> support, obviously people need a way to get in touch with you. If
> you're looking at GCC and thinking "[EMAIL PROTECTED]@#$" then you may
> have more bugs than you thought. ;)

It's not the request for the e-mail address. It's that it's phrased as a login screen and a button to create an account. I know that I definitely pause and consider before I create an account at a web site. There are many on-line newspapers that I refuse to read articles from, for example, because I don't want to create an account. That creates a piece of authorization out there that I have to record a password for and that I'm to some degree responsible for. I think this is mostly just a matter of phrasing and presentation, though, not a fundamental problem.

(Another difficulty is that presenting a login screen and inviting people to create an account also implies that if you weren't already invited to create an account, someone might be upset if you just make one. It has a very "members only" sort of feel to it.)

-- Russ Allbery ([EMAIL PROTECTED]) <http://www.eyrie.org/~eagle/>
Re: No download link from gcc.gnu.org
bhiksha <[EMAIL PROTECTED]> writes:

> I simply cannot find any direct link to a downloadable source/binary
> bundle for gcc4 from gcc.gnu.org.

I went to gcc.gnu.org, clicked on "Mirror sites" under Download, chose an appropriate mirror for my region, clicked on "releases", and found gcc 4.0 and 4.0.1.

> The list of releases on the releases page ends at 3.4.4. Every other
> link I've chased down stops at 3.4.4.

Could you say exactly what pages you looked at? It's hard to fix the problem from the amount of information that you've given.

-- Russ Allbery ([EMAIL PROTECTED]) <http://www.eyrie.org/~eagle/>
Re: 4.2 Project: "@file" support
DJ Delorie <[EMAIL PROTECTED]> writes:

>> gcc -c ./@foop.cpp
>>
>> and of course the same goes for files with names that begin with '-'.
>
> That only works if the argument reflects a file name, and not some other
> syntactical sugar. Granted, gcc has no such arguments, but libiberty
> has a wider scope than just gcc.

    dig -t txt proxy-service.best.stanford.edu @leland-ns0

comes to mind.

-- Russ Allbery ([EMAIL PROTECTED]) <http://www.eyrie.org/~eagle/>
Re: Copies of the GCC repository
Daniel Jacobowitz <[EMAIL PROTECTED]> writes: > On Tue, Nov 22, 2005 at 05:07:12PM +0100, Gabriel Dos Reis wrote: >>(2) Is it normal that "svk push" takes more than 5 minutes to complete? >>If so, that does not match the speed argument I've seen for the >>move to SVN. > SVN is fast. SVK, in many operations, seems to be quite slow (but fast > on others). Part of the problem with svk is that for some reason it appears to devour memory. I can only barely (and with a lot of swapping) svk sync a large repository with 256MB of system RAM. svn itself doesn't need anywhere near as much. -- Russ Allbery ([EMAIL PROTECTED]) <http://www.eyrie.org/~eagle/>
Re: [OT] RE: GCC mailing list archive search omits results after May 2005
Daniel Jacobowitz <[EMAIL PROTECTED]> writes: > On Thu, Dec 15, 2005 at 08:20:39PM -, Dave Korn wrote: >> If gmane is free, please supply me a set of the source code to the gmane >> application, so that I can modify it and use it for my own purposes. > http://gmane.org/dist.php > The bits I checked were under the GPL. Yup, last time I checked all of Gmane was running on free software. The underlying news server is INN, and Lars was making available all the bits he's running on top of it to do all the fun interface stuff. -- Russ Allbery ([EMAIL PROTECTED]) <http://www.eyrie.org/~eagle/>
Re: Compiling programs licensed under the GPL version 2 with GCC 4.4
f...@redhat.com (Frank Ch. Eigler) writes: > Robert Dewar writes: >> Discussion of FSF policy on licensing issues is also off-topic for >> this mailing list. > Perhaps, yet the libgcc exception licensing issues were quite > prominently discussed right here, and not too many months ago. > Florian's concern sounds linearly connected to that. If this is as > trivial a matter as some people seem to hint, perhaps someone can supply > a link to a prior discussion for it. Furthermore, the people Robert is telling him to go ask are not replying to their e-mail. Given that, on-topic or not, I think it's hardly surprising for the issue to come up here. The most effective way to keep it from coming up here would seem to be for them to start answering their e-mail. -- Russ Allbery (r...@stanford.edu) <http://www.eyrie.org/~eagle/>
Re: Why not contribute? (to GCC)
Manuel López-Ibáñez writes: > This seems to be the question running around the blogosphere for several > projects. And I would like to ask all people that read this list but > hardly say or do anything. > What reasons keep you from contributing to GCC? The last time that I attempted to contribute to an FSF project (Autoconf, many years ago), I got the legal paperwork for the employer component and attempted to find someone at Stanford who was willing to sign it, entirely without success. It was quickly turning into a hassle that was going to consume considerably more time than the time I would have spent working on the contribution. I'm sure that I could eventually work through the process, but for the occasional and minor contributions that I would have time to make to FSF projects, it's just not worth the time and energy. There are many other projects to contribute to that don't require this additional overhead. I find the GCC project fascinating (in a largely positive way) as an example of a large successful free software project and have been following the mailing list since egcs, so I'm still following the mailing list and learning a lot about project management and approval processes and the like, and I really appreciate people doing that in public where others can learn from it. -- Russ Allbery (r...@stanford.edu) <http://www.eyrie.org/~eagle/>
Re: Why not contribute? (to GCC)
Dave Korn writes: > I don't quite see that. If the company disclaims ownership of the > stuff that you create working on GCC as part of your job, well, they've > disclaimed ownership of it, regardless of the fact that you created it > while working on GCC as part of your job, no? Berne convention and all > that, it is you who created the creative work, the copyright is yours > *unless and until* you have assigned it to someone, and usually that > someone is your employer because the assignment is part of your contract > of employment. I should probably not really be responding to legal threads on this list, and I'm sure someone will point out that it's more complicated than this and one needs to talk to a real lawyer, but note that the disposition of copyright around work-for-hire is an aspect of the law, not something that exists only in contracts. Even if your contract with your employer says absolutely nothing about copyright, work done for hire for your employer is still owned by that employer. I believe the contract would have to explicitly say that this is *not* the case for you to be able to retain ownership of copyright of work that you did for hire. -- Russ Allbery (r...@stanford.edu) <http://www.eyrie.org/~eagle/>
Re: Why not contribute? (to GCC)
contributors need to work through. Please also note that much of this information is about ten years old, and the situation may have changed somewhat. -- Russ Allbery (r...@stanford.edu) <http://www.eyrie.org/~eagle/>
Re: GFDL/GPL issues
Mark Mitchell writes: > Basile Starynkevitch wrote: >> Does that mean that even if a MELT plugin package appears in Debian, it >> could not contain any documentation? > I thought Debian didn't like the GFDL at all. But, in any case, that's > really a question for the Debian folks; I don't have any involvement in > Debian. This is not the place to discuss this in any further detail, obviously, but just to clarify for those watching this part of the discussion: Debian is not horribly happy with the GFDL, but does consider it to be a free license provided that there are no Front Cover or Back Cover texts and no Invariant Sections. Debian judges all licenses for all material by the same DFSG standards as software licenses and considers the presence of texts covered by those three provisions of the GFDL to be unmodifiable sections, hence non-free, and not permitted in the Debian distribution. But as long as that aspect of the license is not used, the GFDL is a DFSG-free license. Provided that the software does not conflict with the terms of the GPL or GFDL by combining things with conflicting terms in such a way as to make them unredistributable (and dual-licensing would resolve that, obviously), I don't believe Debian would have a problem with the situation that you describe. -- Russ Allbery (r...@stanford.edu) <http://www.eyrie.org/~eagle/>
Re: GFDL/GPL issues
"Alfred M. Szmidt" writes: > It should be noted that Debian considers the GFDL a non-free > /software/ license; which it is, but then the GFDL is not a software > license to begin with. The official Debian position is that the distinction between a software license and a non-software license for the sort of material distributed in Debian is an artificial and meaningless distinction because of, among other reasons, exactly the use case being discussed in this thread. -- Russ Allbery (r...@stanford.edu) <http://www.eyrie.org/~eagle/>
Re: SVN: Checksum mismatch problem
Bruce Korb <[EMAIL PROTECTED]> writes:

> I do that also, but I am also careful to prune repository
> directories (CVS, .svn or SCCS even). I rather doubt it is my RAM,
> BTW. Perhaps a disk sector, but I'll never know now. (Were it RAM,
> the failure would be random and not just the one file.) The original
> data were rm-ed and replaced with a new pull of the Ada code.

Yup, I've seen the capitalization of a single letter change in a file due to bad disk sectors before, even on relatively modern hardware. It's a single-bit error, so it's an explainable failure mode.

-- Russ Allbery ([EMAIL PROTECTED]) <http://www.eyrie.org/~eagle/>
Re: fatal error: gnu/stubs-32.h: No such file
Jonathan Wakely writes: > On 31 July 2013 20:44, Matthias Klose wrote: >> if you mention distribution specific packages, please add the ones needed for >> some distributions. For Debian/Ubuntu this would be g++-multilib if the >> architecture is multilib'ed, g++ otherwise. > That's not the package that provides gnu/stubs-32.h, is it? I thought > it was something like libc6-dev-i386? Please correct > http://gcc.gnu.org/wiki/FAQ#gnu_stubs-32.h if I'm wrong. gcc-multilib and g++-multilib depend on all the various packages that you need to have. They will, among other things, install libc6-dev-i386. For example, on a current wheezy system, you will see the following dependency chain: gcc-multilib -> gcc-4.7-multilib -> libc6-dev-i386 but also various other things like lib32gcc1. -- Russ Allbery (r...@stanford.edu) <http://www.eyrie.org/~eagle/>
Re: Building gcc on Ubuntu 11.10
Ian Lance Taylor writes:

> Nenad Vukicevic writes:
>> Has anybody tried to build 4.7 on Ubuntu 11.10 system. I am getting the
>> following linking problem (no special configure switches):
>>
>> /usr/bin/ld: cannot find crt1.o: No such file or directory
>> /usr/bin/ld: cannot find crti.o: No such file or directory
>> /usr/bin/ld: cannot find -lgcc
>> /usr/bin/ld: cannot find -lgcc_s
>>
>> Normally they are under /usr/lib64, but 11.10 has them under
>> /usr/lib/x86_64-linux-gnu.
>
> Yes. Debian moved everything for some reason.

The reason, for the record, is because Debian wants to be able to support multiarch with more than two architectures. The /lib32 vs. /lib64 distinction doesn't allow one to use the same underlying machinery to easily install, say, armel library and development packages when you're doing development in a cross-compiled environment. The general /lib/<triplet> layout allows you to install packages from as many different architectures as you desire.

-- Russ Allbery (r...@stanford.edu) <http://www.eyrie.org/~eagle/>
Re: Building gcc on Ubuntu 11.10
Ian Lance Taylor writes:

> Russ Allbery writes:
>> The reason, for the record, is because Debian wants to be able to
>> support multiarch with more than two architectures. The /lib32
>> vs. /lib64 distinction doesn't allow one to use the same underlying
>> machinery to easily install, say, armel library and development
>> packages when you're doing development in a cross-compiled
>> environment. The general /lib/<triplet> layout allows you to install
>> packages from as many different architectures as you desire.
>
> The GNU tools have handled cross-compilation for decades, so I don't
> find this answer convincing as stated. Nothing needed to change to make
> cross-compilation work.

It doesn't have anything to do with the GNU tools or with gcc itself. I agree that this change isn't necessary to make gcc work. It's about package management. The goal is to be able to install the same Debian package for various different architectures on the same system at the same time, and one of the scenarios this helps with is cross development.

For example, suppose I'm doing development on an amd64 box targeting armel and I want to use Kerberos libraries in my armel application. I'd like to be able to install the armel Kerberos libraries on my Debian system using regular package management commands, just like any other package. Then I want the compiler, when building for armel, to find the appropriate armel header files and libraries to link my armel binaries against. The Debian multiarch layout allows this to happen. It's an additional feature that isn't available in the lib32/lib64 model, since that layout doesn't generalize to arbitrary additional architectures.

This isn't the only use case, just an example. The embedded Debian folks are very happy with this particular feature, which is why it came to mind.

> The lib64 directory does not exist for cross-compilation. It exists
> because the kernel supports two different native ABIs.

Note that there too the lib32/lib64 solution doesn't generalize. What if there is an additional ABI whose distinction isn't the bit size, such as the armhf architecture in Debian, where the distinction is support for hardware floating point? Using the triplet offers a general solution to this problem and allows any number of native ABIs.

> I strongly support the idea of a compatibility symlink so that older gcc
> releases will continue to work.

I'm completely agnostic on this point. I only replied to provide the explanation for why Debian chose this layout rather than lib32/lib64.

-- Russ Allbery (r...@stanford.edu) <http://www.eyrie.org/~eagle/>
Re: Building gcc on Ubuntu 11.10
Andreas Schwab writes: > Russ Allbery writes: >> For example, suppose I'm doing development on an amd64 box targeting >> armel and I want to use Kerberos libraries in my armel application. >> I'd like to be able to install the armel Kerberos libraries on my >> Debian system using regular package management commands, just like any >> other package. > Just add a --sysroot option to the packager (that also transparently > translates symlinks), case closed. While this addresses the cross development case, it doesn't address the multiple native ABI case. It's elegant to be able to use the same solution to address multiple problems, particularly since there are other limitations with --sysroot. For example, I'd like apt-get upgrade, my pinning, my repository preferences, and so forth to apply to *all* of my installed packages rather than having to duplicate that setup work inside various alternative package roots. Which you can do with --sysroot, of course, by adding more complexity to the packaging system and having it track all the --sysroots that you've used, but with multiarch you get those properties for free. Anyway, I'll stop discussing this here, as it's not really on topic. I just wanted to provide some background, since I realize on the surface it's a somewhat puzzling decision. -- Russ Allbery (r...@stanford.edu) <http://www.eyrie.org/~eagle/>
Re: Building gcc on Ubuntu 11.10
Toon Moene writes: > Thanks for the explanation. Is there a rationale document (and a design > document that explains what we have to expect from this change) > somewhere on the Debian web site ? > I couldn't find it, but perhaps I didn't search it right. The documentation that I'm aware of is at: http://wiki.debian.org/Multiarch The first link there is a more comprehensive statement on the discussion that we just had. > If this is such an obvious solution to the problems you mention, one > would assume that other distributors would clamor for it, too. Well, the approach that Debian (and Ubuntu) chose is a significant amount of very disruptive work (as you're all noticing!). It's considerably more disruptive in some ways than the lib32/lib64 solution, particularly if one takes into account the other things that weren't discussed in this thread, such as overlap in package contents between packages for two different architectures. These sorts of designs are always tradeoffs between the level of disruption and the level of long-term benefit, and I can certainly understand other people making different decisions. Debian and Ubuntu are pursuing a direction that we think is more comprehensive and will provide a lot of long-term benefit, but I think it's fair to say that the jury is still out on whether that was the right tradeoff to take. -- Russ Allbery (r...@stanford.edu) <http://www.eyrie.org/~eagle/>
Re: Request for warnings on implicit bool to int conversions
Gabriel Dos Reis writes:

> I am trying to understand what the real issue is here. Do you want
> -Wimplicit-char-to-int too? -Wimplicit-short-to-int? If not, why?
> Where to stop?

I think it's more about conversion *to* bool than from bool, and it catches places where code has been partly converted to bool (generally because it predated C99) but the error conversion wasn't done properly. For example, while returning true or false is common, returning 0 or -1 as a boolean return value where 0 is success is also common (as with UNIX library interfaces). If someone messes up such a conversion and has a:

    status = -1;

somewhere, they get entirely the wrong result if status becomes a bool. This warning would pick up cases like that.

The warnings about doing things like:

    bool b = true;
    b++;
    b += 3;

are somewhat akin to the warnings about doing arithmetic on void * pointers -- code like that is possibly a sign that there's something flawed with the algorithm and it should be rewritten to treat booleans as actual booleans. (For example, b++ could easily wrap, and unexpectedly fast depending on the size of bool on a platform.)

-- Russ Allbery (r...@stanford.edu) <http://www.eyrie.org/~eagle/>
Re: Request for warnings on implicit bool to int conversions
Gabriel Dos Reis writes:

> I can easily see why an implicit conversion from int to bool might cause
> a problem, even if that is what the language standard mandates -- just
> look at the most common misuses of strcmp.  But that is not what the
> proposer requested, which got me scratching my head.

Yeah.  But I suspect it was a mistaken statement.  The subject line from
the referenced comp.lang.c thread was:

    c99 and the lack of warnings when int operations are applied to a bool

which I think is best caught by the conversion *to* bool when the result is
stored, rather than the conversion *from* bool to perform the operation.

I could see the other direction being marginally helpful in catching people
adding bools together, which may not make a lot of sense, but it doesn't
seem as likely to cause bugs.

--
Russ Allbery (r...@stanford.edu) <http://www.eyrie.org/~eagle/>
Re: Request for warnings on implicit bool to int conversions
Russ Allbery writes:

> Yeah.  But I suspect it was a mistaken statement.  The subject line from
> the referenced comp.lang.c thread was:
>
>     c99 and the lack of warnings when int operations are applied to a bool
>
> which I think is best caught by the conversion *to* bool when the result
> is stored, rather than the conversion *from* bool to perform the
> operation.

I've just briefly reviewed the thread, and it was about both, including
some discussion of warning about any operation that promotes a bool to some
other integer type, on the grounds that a bool would ideally be treated as
a special enumeration that didn't behave like an integer.  (With others
pointing out that that isn't the programming language that we have, as
appealing as it might have been if designing C from scratch.)

But I think most of the *practical* problems would be caught by warning
about the implicit integer conversion to bool, if there's a way to wedge
that warning into the language.  (I suspect it might be hard because the
integer conversion may happen under the hood in lots of places that people
don't expect, but I don't know much about the internals.)

--
Russ Allbery (r...@stanford.edu) <http://www.eyrie.org/~eagle/>
Re: RFC: -Wall by default
Gabriel Dos Reis writes:

> If it is the non-expert that would be caught in code so non-obvious that
> -Wuninitialized would trip into false positives, then it is highly
> likely that the code might in fact contain an error.

I wish this were the case, but alas I continue to see fairly trivial false
positives from -Wuninitialized, usually cases where the initialization and
the use are both protected by equivalent conditionals at different places
in the function.

Personally, as a matter of *style*, I eliminate such cases either by
initializing the variable or by restructuring the function.  But this is
very much a question of style, not of correctness.

--
Russ Allbery (r...@stanford.edu) <http://www.eyrie.org/~eagle/>
Re: RFH: GPLv3
Alexandre Oliva <[EMAIL PROTECTED]> writes:

> How about, after the 4.2.1 release, switch the branch to GPLv3 and then
> release 4.2.3, without any functional changes, under GPLv3?
>
> The skipped minor version number (to a .3, no less) and the quick
> succession of releases would probably hint at the license upgrade, and
> it would probably make the FSF happier with a GCC release under GPLv3 in
> a short time-frame.

Just a GCC user, not a developer, so please weigh my opinion appropriately,
but I for one would strongly prefer that the GCC project not use "cute"
version number changes as a form of semaphore communication to users.
That's what release notes are for.

Version numbers are most useful when they are monotonically increasing,
follow a normal arithmetic progression, and follow a consistent policy
about how they change with each release.  I personally don't care if the
GPLv3 change gets a major version number change or a minor one, but please
make the first 4.3 release 4.3.0, and please maintain the convention that
the next minor release after 4.2.1 is 4.2.2.  Anything else is needlessly
confusing IMO and raises pointless questions.

--
Russ Allbery ([EMAIL PROTECTED]) <http://www.eyrie.org/~eagle/>