https://gcc.gnu.org/bugzilla/show_bug.cgi?id=116137
Andrew Macleod <amacleod at redhat dot com> changed:
           What    |Removed                     |Added
----------------------------------------------------------------------------
         Resolution|---                         |INVALID
                 CC|                            |amacleod at redhat dot com
             Status|UNCONFIRMED                 |RESOLVED
--- Comment #1 from Andrew Macleod <amacleod at redhat dot com> ---
I believe the test case is flawed.
I think the problem is that 0x80 is an int constant with value 128, while x and
y are promoted to int for the comparison. A signed char holding the bit pattern
0x80 is sign extended, so it compares as 0xFFFFFF80 (-128) rather than 0x80
(128), and the initial comparison:
  if (x == 0x80 || y == 0x80)
is never true. In fact, if you look at the output, the front end eliminates
that statement entirely; in the optimizer we never even see it:
  <bb 2> :
  _1 = ABSU_EXPR <x_5(D)>;
  _2 = ABSU_EXPR <y_7(D)>;
  _11 = _1 ^ _2;
  _3 = (signed char) _11;
  _4 = _3 < 0;
  _9 = (int) _4;
  return _9;
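To make the promotion concrete, here is a small standalone sketch of my own
(not part of the original testcase), assuming a GCC target with an 8-bit
signed char:

#include <stdio.h>

int
main (void)
{
  signed char x = -128;                     /* bit pattern 0x80 */
  /* x is promoted to int for the comparison, so it compares as
     0xFFFFFF80 (-128), while 0x80 is the int constant 128.  */
  printf ("%d\n", x == 0x80);               /* prints 0: never true */
  printf ("%d\n", x == (signed char)0x80);  /* prints 1: the cast yields -128 here */
  return 0;
}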
You will also note in your example the range [0, 128] associated with the
ABSU_EXPR. That range includes the value 128 (0x80), so after the cast back to
signed char the sign bit can still be set, which is why the XOR is VARYING.
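As a side note, here is a sketch of my own showing why abs cannot clear that
sign bit for the 0x80 input (converting 128 back to signed char is
implementation-defined; GCC wraps it):

#include <stdio.h>

int
main (void)
{
  signed char x = -128;            /* 0x80 */
  x = __builtin_abs (x);           /* abs of the promoted int is 128, */
                                   /* which GCC wraps back to -128    */
  printf ("%d %d\n", x, x < 0);    /* prints "-128 1" */
  return 0;
}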
If I change the testcase to
int
f1 (signed char x, signed char y)
{
  if (x == (signed char)0x80 || y == (signed char)0x80)
    return 0;
  x = __builtin_abs (x);
  y = __builtin_abs (y);
  return (x < 0) ^ (y < 0);
}
we generate:
  <bb 2> [local count: 1073741824]:
  return 0;
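For completeness, a small driver of my own (assuming the fixed f1 above) that
exercises the 0x80 corner case:

#include <stdio.h>

static int
f1 (signed char x, signed char y)
{
  if (x == (signed char)0x80 || y == (signed char)0x80)
    return 0;
  x = __builtin_abs (x);
  y = __builtin_abs (y);
  return (x < 0) ^ (y < 0);
}

int
main (void)
{
  /* With -128 filtered out, the absolute values fit in [0, 127], so
     neither sign test can be true and the XOR is always 0.  */
  printf ("%d %d\n", f1 (-128, 5), f1 (-127, 127));   /* prints "0 0" */
  return 0;
}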