On 11/18/2016 10:25 AM, Jakub Jelinek wrote:
On Fri, Nov 18, 2016 at 10:14:09AM -0700, Martin Sebor wrote:
Because people make mistakes and warnings help us avoid them (isn't
that obvious?)  Just because we get it right most of the time doesn't
mean we get it right all of the time.  The papers and exploits I pointed
to should provide ample evidence that zero allocations are a problem
that can have serious (and costly) consequences.  We (i.e., Red Hat)
recognize this risk and have made it our goal to help stem some of
these problems.

Are you talking about cases where user writes malloc (0) explicitly, or
where user writes malloc (n); and the optimizers figure out n is 0,
either always, or e.g. because the compiler decided to duplicate the code
and in one of the branches it proves it is 0, or you want to add a
runtime warning when malloc is called with 0?
E.g. I don't see how writing malloc (0) in the code should have anything
to do with any overflows.  The cases where jump threading creates
calls to functions with constant arguments that we then warn on are
also problematic; often such code is even dead, except the compiler is
not able to prove it.
I would strongly suggest against using explicit vs. implicit for this kind of thing. Various transformations can take an implicit zero and turn it into an explicit one. Others may take what starts off as a complex expression and, through cse, cprop, etc., collapse it down to a 0 that could be either explicit or implicit.

I believe we should be warning on attempts to allocate 0 bytes of memory via malloc, realloc or alloca, with the exception of a non-builtin alloca whose return value is unused; but I think we've covered that elsewhere, and Martin's code will DTRT more by accident than by design.

Are there going to be false positives? Probably. But I think Martin has already shown the false-positive rate to be far lower than that of many other warnings.

Are there cases where someone might explicitly call malloc (0) and do so in a safe, but potentially non-portable, manner? Absolutely. But that doesn't mean we should cater to that particular case.



It IMHO doesn't belong to either of these.

You've made that quite clear.  I struggle to reconcile your
position in this case with the one in the debate about the
-Wimplicit-fallthrough option, where you insisted on the exact
opposite despite the overwhelming number of false positives
caused by it.  Why is it appropriate for that option to be in
-Wextra and not this one?

It also matters what one can do to tell the compiler the code is fine.
For e.g. -Wimplicit-fallthrough and many other warnings one can add
a comment, or attribute, etc. to get the warning out of the way.
But the patch you've posted for the alloca (0) stuff contained
changes of n to n + !n, so the warning basically asks people to slow
down their code for no reason, just to get the warning out of the way.
I'm not buying that as a valid argument. I suspect the computing power already spent shuttling around the messages on this topic outweighs the computing power for all the + !ns that might get added to code to avoid this warning. Note that folks routinely put initializers in code to avoid false positives from our uninitialized-variable warnings.

IMHO, the biggest argument against the + !n is that it makes the code less clear. I think a simple assertion is a better choice. It clearly documents that zero is not expected, stops the program if zero does indeed occur, and can be optimized away just as effectively as + !n when n is known to be nonzero.

Jeff
