https://gcc.gnu.org/bugzilla/show_bug.cgi?id=96921
Andrew Pinski <pinskia at gcc dot gnu.org> changed:
What                |Removed                       |Added
----------------------------------------------------------------------------
Status              |NEW                           |ASSIGNED
Assignee            |unassigned at gcc dot gnu.org |pinskia at gcc dot gnu.org
--- Comment #3 from Andrew Pinski <pinskia at gcc dot gnu.org> ---
I was thinking: if we see
  _2 = 1 - _1;
and _1 has a range of [0,1] (i.e. it is effectively boolean),
turn it into _1 ^ 1.
There is already a pattern which turns ((int)a)^1 into (int)(~a).
You can see the other patterns in action if you write the code as:

int
foo1 (_Bool a, _Bool b)
{
  int c = a;
  c ^= 1;
  int d = b;
  d = d ^ 1;
  int e = c & d;
  return e ^ 1;
}
So something like:

(simplify
 (minus integer_onep@0 SSA_NAME@1)
 (if (ssa_name_has_boolean_range (@1))
  (bit_xor @1 @0)))

(Matching SSA_NAME@1 already guarantees the second operand is an SSA name,
so no separate TREE_CODE check is needed; the range test belongs on @1,
not on the constant @0.)
That is, for boolean types/ranges we should always canonicalize on a ^ 1 (or ~a) rather than 1 - a.
Take:

int
fooneg (_Bool a)
{
  return 1 - a;
}

int
fooxor (_Bool a)
{
  return a ^ 1;
}