Arsen Arsenović <[email protected]> writes:
> Seems that the algorithm and I agree. I don't see any use in the int()
> guess that the compiler does, besides accepting old code (which makes it
> candidate for -fpermissive).
I believe:
extern int foo ();
covers at least 50% of all C functions: they are generally defined to
return int (as an error indication), and their parameter lists,
prototyped or not, contain no narrow types. That's a reasonable enough
guess.
> We can diagnose these without invoking expensive machinery, unlike
> mismatched declarations.
How so? Remember that `extern int foo ()' declares a function with no
parameter specification returning int.
Thus, it could mean either:
foo (a, b)
     char *a;
     long b;
or
foo (x, y)
     double x;
     double y;
or perhaps even
foo (va_alist)
     va_dcl
The function may even be defined with a prototyped parameter list, as
long as said parameter list does not contain any narrow types which are
subject to default argument promotions.
> I don't see why these not being diagnosed by default makes erroring on
> the far simpler implicit case not valuable (even if it leads to people
> just consciously inserting wrong declarations - that means they
> acknowledged it and made a poor decision).
All this will lead to is people making what you deem to be a ``poor''
decision. Especially if the people who have to solve the build problems
are not the original author(s) of the program being built.
> They can be caught in multiple places when obvious, such as when not
> done explicitly. These are two different errors.
An implicit function declaration is NOT an obvious error. It simply may
be an error. What is the error here?
a (b, c)
     long c;
{
  return pokeaddr (c, b * FOO);
}
/* in another translation unit */
b ()
{
  a (1, 0L);
}
GCC is not a judge of other people's code, and shouldn't try to be one.
> I can't say. I don't have memory of the traditional mode outside of
> cpp.
-traditional tried to make GCC a K&R compiler. It wasn't completely
K&R, but it worked well enough for most cases. There,
- `extern' definitions take effect even outside the scope in which
they are declared.
- typeof, inline, signed, const and volatile are not recognized.
- unsigned narrow types promote to unsigned int.
- string constants are always writable. (BTW, -fwritable-strings
was also removed, so you can see why I'm sceptical about
any commitment to `-fpermissive' if it is ever introduced.)
- register allocation is not performed within functions that call
setjmp.
- bit-fields are always unsigned.
- single precision floating point arithmetic is carried out in
double precision.
- __STDC__ is not defined.
> The changes proposed today, however, are a few bits of entropy
> relevant in a few places - the overhead is low.
Yet it's rather pointless, as I explained above.
> It's not, though. That's why this is being conversed about. Even
> highly diligent people miss these (and even not-so-diligent people like
> me do - I've had more of these found by me passing -flto than I did by
> remembering to use an extra four -Werror=... flags, and that, of course,
> misses the int(...) case - which is not a useful one, usually what I
> meant there was (void), but strict-prototypes are a story for another
> day or year).
I can't say I've seen any such errors myself. Perhaps people should be
told to run lint instead, whose whole job is to catch exactly these
errors?