Arsen Arsenović <[email protected]> writes:
> Indeed they should be - but warning vs. error holds significance.  A
> beginner is much less likely to be writing clever code that allegedly
> uses these features properly than to be writing new code and to have
> simply made an error that they do not want and will suffer through,
> confused.
Most programs already paste a chunk of Autoconf into their configure.ac
files which turns on more diagnostics if it looks like the program is
being built by a developer, e.g. from Emacs:
AC_ARG_ENABLE([gcc-warnings],
  [AS_HELP_STRING([--enable-gcc-warnings@<:@=TYPE@:>@],
     [control generation of GCC warnings.  The TYPE 'yes'
      means to fail if any warnings are issued; 'warn-only'
      means issue warnings without failing (default for
      developer builds); 'no' means disable warnings
      (default for non-developer builds).])],
  [case $enableval in
     yes|no|warn-only) ;;
     *) AC_MSG_ERROR([bad value $enableval for gcc-warnings option]) ;;
   esac
   gl_gcc_warnings=$enableval],
  [# By default, use 'warn-only' if it looks like the invoker of 'configure'
   # is a developer as opposed to a builder.  This is most likely true
   # if GCC is recent enough and there is a .git directory or file;
   # however, if there is also a .tarball-version file it is probably
   # just a release imported into Git for patch management.
   gl_gcc_warnings=no
   if test -d "$srcdir"/.git && test ! -f "$srcdir"/.tarball-version; then
     # Clang typically identifies itself as GCC 4.2 or something similar
     # even if it is recent enough to accept the warnings we enable.
     AS_IF([test "$emacs_cv_clang" = yes],
       [gl_gcc_warnings=warn-only],
       [gl_GCC_VERSION_IFELSE([5], [3], [gl_gcc_warnings=warn-only])])
   fi])
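So a developer checkout ends up behaving as though the invoker had
passed, e.g.:

  ./configure --enable-gcc-warnings=warn-only

while a build from a release tarball keeps the quiet default.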
So this is really not a problem.
> Indeed. I said facilitates, not treats equally. I think the veterans
> here won't lose much by having to pass -fpermissive, and I think that's
> a worthwhile sacrifice to make, to nurture the new without pressuring
> the old very much.
Until `-fpermissive' goes the way of `-traditional' and
`-fwritable-strings'.
> On that note - let's presume a beginner's role.  I've just started
> using GCC.  I run 'gcc -O2 -Wall main.c fun.c' and I get an a.out.  It
> mentions some 'implicit function generation', dunno what that means -
> if it mattered much, it'd have been an error.  I wrote a function
> called test that prints the int it got in hex, but I called it with
> 12.3, and it printed 1... what the heck?
Have you actually seen anyone make this mistake?
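For reference, a minimal reconstruction of the quoted scenario, as two
files built with the quoted command `gcc -O2 -Wall main.c fun.c' (the
function and file names are the poster's; the register details in the
comment assume the x86-64 psABI):

  /* fun.c */
  #include <stdio.h>

  void
  test (int n)
  {
    printf ("%x\n", n);
  }

  /* main.c */
  int
  main (void)
  {
    /* No declaration of `test' is visible here, so the pre-C99 rules
       implicitly declare it as `int test ()'.  12.3 is then passed as
       a double, which on x86-64 travels in a floating point register,
       while `test' reads an integer register; what gets printed is
       effectively garbage.  GCC diagnoses the call with
       -Wimplicit-function-declaration, which -Wall enables.  */
    test (12.3);
    return 0;
  }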
> Why that happened is obvious to you and me (if you're on the same CPU
> as me), but to a beginner it is utter nonsense.
>
> At this point, I can only assume one goes to revisit that warning.. I'd
> hope so at least.
>
> I doubt the beginner would know to pass
> -Werror=implicit-function-declaration in this case (or even about
> -Werror... I just told them to pass -Wall and to read the warnings,
> which was gleefully ignored)
I'd expect a question from the newbie, directed at a web search engine.
> Hell, I've seen professors do it, and for a simple reason: they knew how
> to write code, not how to use a compiler. That's a big gap.
>
> The beginner here can't adapt - they don't know what -Wall means, they
> just pass it because they were told to do it (if they're lucky!).
If this is really such a bad problem, then how about clarifying the
error message?
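The current diagnostic already names the option, reading along the
lines of:

  warning: implicit declaration of function 'test'
  [-Wimplicit-function-declaration]

though it admittedly says nothing about why the call is hazardous.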
> At the same time, they lose out on what is, IMO, one of the most useful
> pieces of the toolchain: _FORTIFY_SOURCE (assuming your vendor enables
> it by default.. we do). It provides effective bug detection, when the
> code compiles right. It regularly spots bugs that haven't happened yet
> for me.
>
> (and same goes for all the other useful analysis the toolchain can do
> when it has sufficient information to generate correct code, or more;
> some of which can't reasonably be a default)
>
> (on a related note, IMO it's a shame that the toolchain hides so many
> possibilities behind 'cult knowledge', the depths of many manuals, and
> bad defaults)
_FORTIFY_SOURCE is not really important enough to be considered here.
It's not even available everywhere.
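For readers unfamiliar with the feature: where glibc provides it,
_FORTIFY_SOURCE reroutes calls such as strcpy through checked variants
when the destination size is known at compile time.  A minimal sketch
of the sort of bug it catches, assuming glibc:

  /* fortify.c -- build with: gcc -O2 -D_FORTIFY_SOURCE=2 fortify.c */
  #include <string.h>

  int
  main (void)
  {
    char buf[8];
    /* The size of `buf' is known here, so glibc substitutes a checked
       variant (__strcpy_chk); the overflow aborts at run time with
       "*** buffer overflow detected ***" instead of silently
       corrupting the stack.  */
    strcpy (buf, "definitely longer than eight bytes");
    return 0;
  }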
> This sample is subject to selection bias.  My testing mostly targets
> modern codebases that have long fixed these errors (if they have
> active maintainers), and exclusively Free Software, so I expect that
> the likelihood that you'll need to run `export CC='gcc -fpermissive'
> CXX='g++ -fpermissive'` goes up the more you move towards old or more
> corporate codebases, but, for a veteran, this is no cost at all.
>
> Is it that much of a stretch to imagine that a maintainer of a
> codebase that has not seen revisions to get it past K&R-esque
> practices would know that they need to pass -std=c89 (or a variant
> thereof), or even -fpermissive - assuming that they could even bear
> to use GCC 14 as opposed to 2.95?
There's bash, which still tries to work on later 4.3BSDs (AFAIK), while
also building with just `gcc'.
> As an anecdote, just recently I had to fix some code written for i686
> CPUs, presumably for GCC 4.something or less, because the GCC I insist
> on using (which is 13 and has been since 13.0 went into feature-freeze)
> has started using more than the GPRs on that machine (which led to
> hard-to-debug crashes, because said codebase does not enable the
> requisite CPU extensions, or handle the requisite registers properly).
> I think this fits within the definition of 'worked yesterday, broke
> today'.  Should that change be reverted?  Replacing it with
> -mmore-than-gprs would make GCC more compatible with this old code.
>
> I don't think so.
>
> This is a sensitive codebase, and not just because it's written
> poorly, but because it's a touchy thing it's implementing: any change
> in compiler requires reverification.  The manifestation here has *no*
> significance.
And the problematic part of the code in question is implementing
something like swapcontext in assembler, correct? That's a far cry from
breaking C code, which does have clearly defined (albeit not exactly
intuitive) semantics.
> ... speaking of that, if one builds their codebase without -std=..,
> they're risking more than just optimization changes breaking code that
> relies on bad assumptions, they're also risking a change in language
> semantics..
The Standards committee does have a reasonable track record of keeping
backwards compatibility, since they prioritize existing practice and the
existing body of C code. So I'm not too worried about that.
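Such changes do happen at the edges -- C23's reinterpretation of empty
parameter lists is one example -- but they mostly bite declarations
that were already suspect:

  /* Before C23, this declares `f' with an unspecified parameter list,
     so the call below is accepted; under -std=c23, `()' means
     `(void)', and the call becomes a hard error.  */
  int f ();

  int
  main (void)
  {
    return f (1);
  }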
> With all that to consider, is it *really* a significant cost to add
> -fpermissive?
Yes, in case it goes away.
> Would that cost not be massively overshadowed by the cost
> of a compiler change?  It feels like a footnote compared to
> checking whether added optimizations go against invalid assumptions
> (which is, by the way, also rectified by adding more hard,
> easy-to-see errors).
>
> I expect no change in behavior from those that maintain these old
> codebases, they know what they're doing, and they have bigger fish to
> fry - however, I expect that this change will result in:
>
> - A better reputation for GCC and the GCC project (by showing that we do
> care for code correctness),
A compiler should care about the correctness of its own code, not that
of others.
> - More new code being less error prone (by merit of simple errors being
> detected more often),
As I said, making such things errors will simply result in people
inserting `extern' declarations everywhere, which may or may not be
wrong. I've seen this happen before with my own eyes.
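Concretely, the "fix" tends to look like this (reusing the earlier
example; note that the pasted declaration does not match the definition
in fun.c, which takes an int and returns nothing):

  /* main.c, silenced with a pasted-in -- and wrong -- declaration.
     The program misbehaves exactly as before, only now without any
     diagnostic.  (Under -std=c23 the empty parentheses would mean
     `(void)' and the call would be rejected, but earlier dialects
     leave the parameters unspecified.)  */
  extern int test ();

  int
  main (void)
  {
    test (12.3);
    return 0;
  }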
> - Less 'cult knowledge' in the garden path,
I guess search engines, and better written diagnostic messages, would be
enough to accomplish that?
> - More responsible beginners, and
> - Fewer people being able to effectively paint GNU and/or C/++ as the
> backwards crowd using an error-prone technique of yesteryear.
This never worked in GNU C++, right?
> (and yes, optics matter)
>
> Builds break.  Builds breaking cleanly is a treat compared to the
> usual breakage.  At least this change breaks the few builds that do
> break with a positive outcome.
Builds shouldn't break, and the ``usual breakage'' should not happen
either...