https://gcc.gnu.org/bugzilla/show_bug.cgi?id=117360

Jakub Jelinek <jakub at gcc dot gnu.org> changed:

           What    |Removed                     |Added
----------------------------------------------------------------------------
           Priority|P3                          |P1

--- Comment #5 from Jakub Jelinek <jakub at gcc dot gnu.org> ---
That commit looks just weird.
carry_backpropagate starts with
  enum machine_mode mode = GET_MODE_INNER (GET_MODE (x));
  unsigned HOST_WIDE_INT mmask = GET_MODE_MASK (mode);
...
  scalar_int_mode smode;
  if (!is_a <scalar_int_mode> (mode, &smode)
      || GET_MODE_BITSIZE (smode) > HOST_BITS_PER_WIDE_INT)
    return mmask;
So, first of all, I don't see why the code uses
  known_lt (UINTVAL (XEXP (x, 1)), GET_MODE_BITSIZE (mode))
rather than
  UINTVAL (XEXP (x, 1)) < GET_MODE_BITSIZE (smode)
and
  GET_MODE_BITSIZE (mode).to_constant ()
rather than
  GET_MODE_BITSIZE (smode)
and so on.
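To spell out the suggestion (a sketch of mine, not the exact committed
code, assuming a shift case guarded the way the quoted known_lt call
implies): once the is_a <scalar_int_mode> test above has succeeded,
smode has static type scalar_int_mode, so GET_MODE_BITSIZE (smode) is a
plain integer and none of the poly-int helpers are needed:

      /* Committed form: mode has static type machine_mode, so
         GET_MODE_BITSIZE (mode) is a poly_uint16 and needs known_lt.  */
      if (CONST_INT_P (XEXP (x, 1))
          && known_lt (UINTVAL (XEXP (x, 1)), GET_MODE_BITSIZE (mode)))
        ...

      /* Suggested form: a plain integer comparison on the scalar mode,
         with no poly-int machinery.  */
      if (CONST_INT_P (XEXP (x, 1))
          && UINTVAL (XEXP (x, 1)) < GET_MODE_BITSIZE (smode))
        ...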
But then come the SIGN_EXTEND/ZERO_EXTEND cases, and those have the
      if (!GET_MODE_BITSIZE (GET_MODE (x)).is_constant ()
          || !GET_MODE_BITSIZE (GET_MODE (XEXP (x, 0))).is_constant ())
        return -1;
which I don't understand at all: if we handle vectors by just looking at the
element modes, why do we care whether it is a variable-length vector or not?
(For a variable-length mode such as VNx4SImode the total bitsize is not a
compile-time constant, but the element mode SImode still is.)
In any case, that
      mode = GET_MODE (XEXP (x, 0));
      if (mask & ~GET_MODE_MASK (GET_MODE_INNER (mode)))
        mask |= 1ULL << (GET_MODE_BITSIZE (mode).to_constant () - 1);
IMHO should just be
      mode = GET_MODE_INNER (GET_MODE (XEXP (x, 0)));
      if (mask & ~GET_MODE_MASK (mode))
        mask |= 1ULL << (GET_MODE_BITSIZE (mode).to_constant () - 1);
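To make the difference concrete (my own worked example, assuming x is a
SIGN_EXTEND from V4SImode to V4DImode, so GET_MODE (XEXP (x, 0)) is the
128-bit V4SImode and its element mode is the 32-bit SImode):

      /* Current code: mode is the whole vector mode, so this shifts by
         GET_MODE_BITSIZE (V4SImode).to_constant () - 1 == 127, which is
         undefined behaviour on a 64-bit value and in any case not the
         element's sign bit.  */
      mask |= 1ULL << (128 - 1);

      /* Suggested code: mode is the element mode, so this sets bit 31,
         the sign bit of each SImode element.  */
      mask |= 1ULL << (32 - 1);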
