https://gcc.gnu.org/bugzilla/show_bug.cgi?id=114074
--- Comment #6 from Richard Biener <rguenth at gcc dot gnu.org> ---
<bb 3> [local count: 1014686025]:
# a.4_18 = PHI <_4(8), 0(2)>
b = 2147480647;
_1 = ~a.4_18;
_2 = _1 * 2147480647;
a = _2;
foo ();
a.2_3 = a;
if (a.2_3 == 0)
goto <bb 5>; [5.50%]
else
goto <bb 4>; [94.50%]
<bb 4> [local count: 958878295]:
_4 = a.4_18 + -2;
a = _4;
if (_4 >= -2)
goto <bb 8>; [94.50%]
else
goto <bb 5>; [5.50%]
<bb 8> [local count: 906139989]:
goto <bb 3>; [100.00%]
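A C loop of roughly this shape reproduces the GIMPLE above (a reconstruction from the dump with invented names test/i and a stub foo; not claimed to be the PR's actual testcase):

int a, b;

void foo (void) { /* opaque call; might modify a  */ }

void
test (void)
{
  int i = 0;                   /* a.4_18: 0, -2, -4, ...  */
  for (;;)
    {
      b = 2147480647;
      a = ~i * 2147480647;     /* _1 = ~i;  _2 = _1 * 2147480647  */
      foo ();
      if (a == 0)              /* a.2_3: reload of a after the call  */
        break;
      i -= 2;                  /* _4 = a.4_18 + -2  */
      a = i;
      if (i < -2)              /* loop while _4 >= -2  */
        break;
    }
}

int main (void) { test (); return 0; }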
and we get
(set_scalar_evolution
  instantiated_below = 2
  (scalar = _1)
  (scalar_evolution = {-1, +, 2}_1))
)
this is ~{0, +, -2}_1 which I think we handle as -1 - X
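(In two's complement ~x == -1 - x, so ~{0, +, -2}_1 is -1 - {0, +, -2}_1 = {-1, +, 2}_1, i.e. -1, 1, 3, ... for the iteration values 0, -2, -4, ...  A standalone check, purely for illustration:)

#include <assert.h>

int
main (void)
{
  /* a.4_18 follows {0, +, -2}_1; check that ~x == -1 - x, so that
     _1 = ~a.4_18 follows {-1, +, 2}_1, i.e. -1, 1, 3, ...  */
  int x = 0, expect = -1;
  for (int it = 0; it < 4; it++, x -= 2, expect += 2)
    {
      assert (~x == -1 - x);
      assert (~x == expect);
    }
  return 0;
}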
And we get
(set_scalar_evolution
  instantiated_below = 2
  (scalar = _2)
  (scalar_evolution = {-2147480647, +, -6002(OVF)}_1))
)
and that's wrong: in the 2nd iteration _2 should be 1 * 2147480647, but indeed
the difference isn't representable in the signed integer increment of the
CHREC.
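Concretely, the true increment of _2 between consecutive iterations is 2 * 2147480647 = 4294961294, which exceeds INT_MAX; -6002 is just that value wrapped to 32 bits.  A standalone illustration (assuming 32-bit int; not GCC code):

#include <stdint.h>
#include <stdio.h>

int
main (void)
{
  /* _1 follows {-1, +, 2}_1, so _2 = _1 * 2147480647 evolves as below.  */
  int64_t v0 = (int64_t) -1 * 2147480647;   /* _2 in iteration 0: -2147480647  */
  int64_t v1 = (int64_t)  1 * 2147480647;   /* _2 in iteration 1:  2147480647  */
  int64_t step = v1 - v0;                   /* exact increment: 4294961294     */

  printf ("exact step: %lld\n", (long long) step);
  /* Truncating to 32 bits gives the -6002(OVF) seen in the chrec
     (conversion of an out-of-range value is implementation-defined,
     modular on GCC).  */
  printf ("as int32:   %d\n", (int32_t) (uint32_t) step);
  return 0;
}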
It's probably safest to go with chrec_dont_know here; the alternative would
probably be (int){-2147480647u, +, -6002u}_1, which likely doesn't help much
in practice?
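For illustration, evaluating that unsigned alternative with well-defined wraparound does reproduce the values _2 actually takes (a standalone sketch, assuming 32-bit int; converting out-of-range unsigned values back to int32_t is implementation-defined but modular on GCC):

#include <stdint.h>
#include <stdio.h>

int
main (void)
{
  /* Evaluate {-2147480647u, +, -6002u}_1 with unsigned wraparound and
     compare against _2 = ~i * 2147480647 computed in unsigned
     arithmetic, for i = a.4_18 = 0, -2, -4, ...  */
  uint32_t chrec = (uint32_t) -2147480647;
  int32_t i = 0;

  for (int it = 0; it < 5; it++)
    {
      uint32_t actual = ~(uint32_t) i * 2147480647u;
      printf ("iter %d: chrec = %d  actual = %d\n",
              it, (int32_t) chrec, (int32_t) actual);
      chrec += (uint32_t) -6002;   /* wrapping step  */
      i -= 2;
    }
  return 0;
}

So the values come out right, but only under wrapping semantics, which is presumably why it wouldn't help much in practice.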