https://gcc.gnu.org/bugzilla/show_bug.cgi?id=121192

Richard Biener <rguenth at gcc dot gnu.org> changed:

           What    |Removed                     |Added
----------------------------------------------------------------------------
     Ever confirmed|0                           |1
             Status|UNCONFIRMED                 |NEW
                 CC|                            |rguenth at gcc dot gnu.org
   Last reconfirmed|                            |2025-07-21

--- Comment #2 from Richard Biener <rguenth at gcc dot gnu.org> ---
Hmm, so we turn this into

if (-(unsigned)({-1, -1}[0] == 0) != 0)
  abort ()

so the change of ~ to - isn't correct (this is pr110817-1.c).

On GENERIC we have

   V v =  VEC_COND_EXPR < <<< Unknown tree: compound_literal_expr
        V D.2272 = { 0, 0 }; >>> == { 0, 0 } , { 0, 0 } , { 4294967295, 4294967295 } > ;

so the bitwise not got elided already via

Applying pattern match.pd:5981, generic-match-3.cc:5023
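
For reference, the source-level construct behind this is roughly the
following (a sketch reconstructed from the dumps, not necessarily the
verbatim pr110817-1.c):

  typedef unsigned V __attribute__((vector_size (8)));

  int
  main ()
  {
    /* All-true comparison, then bitwise not, so v should be { 0, 0 }.  */
    V v = (V) ~((V) { 0, 0 } == (V) { 0, 0 });
    if (v[0] != 0)
      __builtin_abort ();
    return 0;
  }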

before vector lowering it's still correct (the all-ones mask _1 makes the
VEC_COND_EXPR select { 0, 0 }, so _3 is zero and abort is not reached):

  <bb 2> :
  _1 = { -1, -1 };
  _2 = VEC_COND_EXPR <_1, { 0, 0 }, { 4294967295, 4294967295 }>;
  v = _2;
  _3 = BIT_FIELD_REF <v, 32, 0>;
  if (_3 != 0)
    goto <bb 3>; [INV]
  else
    goto <bb 4>; [INV]

  <bb 3> :
  __builtin_abort ();

but vector lowering makes a mess out of it.  In particular we apply

Applying pattern match.pd:7027, gimple-match-6.cc:6659
Applying pattern match.pd:1685, gimple-match-1.cc:10846
Applying pattern match.pd:6272, gimple-match-1.cc:24830

(simplify
 (cond @0 INTEGER_CST@1 INTEGER_CST@2)
 (switch
...
  (if (integer_zerop (@1))
   (switch
    /* a ? 0 : 1 -> !a.  */
    (if (integer_onep (@2))
     (convert (bit_xor (convert:boolean_type_node @0) { boolean_true_node; })))
    /* a ? 0 : -1 -> -(!a).  */
    (if (INTEGRAL_TYPE_P (type) && integer_all_onesp (@2))
     (if (TYPE_PRECISION (type) == 1)
      /* For signed 1-bit precision just cast bool to the type.  */
      (convert (bit_xor (convert:boolean_type_node @0) { boolean_true_node; }))
      (if (TREE_CODE (type) == BOOLEAN_TYPE)
       (with {
          tree intt = build_nonstandard_integer_type (TYPE_PRECISION (type),
                                                      TYPE_UNSIGNED (type));
        }
        (convert (negate (convert:intt (bit_xor (convert:boolean_type_node @0)
                                                { boolean_true_node; })))))
       (negate (convert:type (bit_xor (convert:boolean_type_node @0)
                                      { boolean_true_node; }))))))
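
On plain scalars the a ? 0 : -1 -> -(!a) identity itself holds; a throwaway
sanity check (a sketch, with int standing in for the signed-boolean:32 type
the vector case ends up with):

  #include <assert.h>

  int
  main ()
  {
    /* a ? 0 : -1  is equivalent to  -(int) !a  for any scalar a.  */
    for (int a = -2; a <= 2; a++)
      assert ((a ? 0 : -1) == -(int) !a);
    return 0;
  }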

but this seems to confuse us.  We try to generate

 -(signed-boolean:32)((bool)({-1,-1}[0] != 0) ^ (bool)1)

and things go downhill from there with the X ^ ~0 -> ~X and
~(x != 0) -> x == 0 patterns.
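
Spelled out (roughly), with x = {-1,-1}[0]:

  -(signed-boolean:32)((bool)(x != 0) ^ (bool)1)
    -> -(signed-boolean:32) ~(bool)(x != 0)       [X ^ ~0 -> ~X]
    -> -(signed-boolean:32)(x == 0)               [~(x != 0) -> x == 0]

which, modulo the conversion to unsigned, is the -(unsigned)({-1,-1}[0] == 0)
test quoted at the top.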

One needs to see what vector lowering produces initially to decide whether
the issue is there or in the patterns above.
