On Mon, Jan 22, 2007 at 11:16:06AM -0800, Ian Lance Taylor wrote:
> .... The new -Wstrict-overflow
> warning will issue warnings for each case where gcc assumes that
> signed overflow is undefined.
>
> To be clear, this -Wstrict-overflow option generates a lot of false
> positives. That is because there are a lot of cases where gcc will
> optimize code in ways that would fail if the numbers involved were
> close to INT_MAX or INT_MIN. For example, when compiling the gcc/*.c
> files, I get 62 warnings. ...
>
> Some comments about those 62 warnings. First, each one of those
> warnings is a case where gcc implemented an optimization which could
> not have been implemented when using -fwrapv. Now, a couple of
> examples.
>
> In bb-reorder.c at line 266 we find this:
>
> find_traces_1_round (REG_BR_PROB_BASE * branch_threshold[i] / 1000,
>                      max_entry_frequency * exec_threshold[i] / 1000,
>                      count_threshold, traces, n_traces, i, &heap,
>                      number_of_rounds);
>
> REG_BR_PROB_BASE is a #define macro with the value 10000. gcc
> converts (10000 * branch_threshold[i] / 1000) into (10 *
> branch_threshold[i]). This minor optimization is only possible
> because signed overflow is undefined. If branch_threshold[i] had the
> value (INT_MAX / 10 - 1), say, the optimization would cause the
> expression to have a different result (assuming standard
> twos-complement wrapping arithmetic). I see 17 warnings of this
> form.
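
To make that concrete, here's a tiny standalone program (mine; the
wrapping result is computed in unsigned arithmetic so the demo itself
is well defined, and the cast back to int is implementation-defined
but behaves as expected on twos-complement targets):

    #include <limits.h>
    #include <stdio.h>

    int
    main (void)
    {
      int b = INT_MAX / 10 - 1;

      /* (10000 * b / 1000) evaluated with wraparound, i.e. what
         -fwrapv would compute; 10000 * b overflows for this b.  */
      int wrapped = (int) (10000u * (unsigned) b) / 1000;

      /* The folded form (10 * b), which does not overflow here and
         which gcc substitutes when overflow is undefined.  */
      int folded = 10 * b;

      /* With 32-bit int this prints -18 vs 2147483630.  */
      printf ("wrapped %d, folded %d\n", wrapped, folded);
      return 0;
    }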
>
> In expmed.c we find, several times, the test (size - 1 <
> BITS_PER_WORD), which gcc converts to (size <= BITS_PER_WORD).
> This optimization is only possible because signed overflow is
> undefined.
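
The two tests differ only when size is INT_MIN, where size - 1 wraps
around to INT_MAX. A matching demonstration (again mine, with the
wrapping subtraction done in unsigned arithmetic and 32 standing in
for BITS_PER_WORD):

    #include <limits.h>
    #include <stdio.h>

    int
    main (void)
    {
      int size = INT_MIN;

      /* (size - 1) under wraparound: INT_MIN - 1 wraps to INT_MAX,
         so the original test is false...  */
      int original = (int) ((unsigned) size - 1u) < 32;

      /* ...while the folded test is true.  */
      int folded = size <= 32;

      printf ("original %d, folded %d\n", original, folded);
      return 0;
    }
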
Here's a thought: assuming we can ignore overflow is equivalent to
assuming that we have more information about the range of a variable
than its type alone provides. What if the other proposal we saw
recently, for assume(), were added? e.g.
assume (size > INT_MIN);
assume (branch_threshold[i] < INT_MAX / 10000);
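
For what it's worth, something close to assume() can be spelled today
with a branch to a noreturn function: VRP then knows the condition
holds on the fall-through path, though unlike a real assume() this
emits a runtime test. A sketch (the macro and the function f are
mine, purely for illustration):

    #include <limits.h>
    #include <stdlib.h>

    /* Crude stand-in for the proposed assume(): abort() does not
       return, so past this point the compiler may take COND to be
       true.  A real assume() would convey the fact to the optimizer
       without emitting the check.  */
    #define assume(cond) ((cond) ? (void) 0 : abort ())

    int
    f (int size)
    {
      assume (size > INT_MIN);
      /* (size - 1) now provably cannot wrap, so folding this test
         to (size <= 32) would be safe even under -fwrapv.  */
      return size - 1 < 32;
    }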