https://gcc.gnu.org/bugzilla/show_bug.cgi?id=97521

--- Comment #3 from Jakub Jelinek <jakub at gcc dot gnu.org> ---
And we do that because:
    case VECTOR_CST:
      {
        tree tmp = NULL_TREE;
        if (VECTOR_MODE_P (mode))
          return const_vector_from_tree (exp);
        scalar_int_mode int_mode;
        if (is_int_mode (mode, &int_mode))
          {
            if (VECTOR_BOOLEAN_TYPE_P (TREE_TYPE (exp)))
              return const_scalar_mask_from_tree (int_mode, exp);
where the VECTOR_CST type is now a vector boolean with DImode element type and
TImode as the (poor man's) vector mode.
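
To make the two shapes concrete (illustrative numbers only, assuming an x86_64
target):

/* bitmask style (e.g. an 8-lane AVX-512 mask):
     1-bit boolean elements whose mode is QImode, TYPE_MODE QImode,
     i.e. 8 bits total, one bit per lane
     -> const_scalar_mask_from_tree is the right expansion.
   poor man's style (this PR):
     2 lanes of 64-bit boolean elements with DImode, TYPE_MODE TImode,
     i.e. 128 bits = 2 * 64, each element stored at its full width
     -> here we want the VIEW_CONVERT_EXPR path instead.  */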

So, the question is how to differentiate between vector booleans that want to
be a bitmask in an integral mode vs. poor man's vector booleans for which we'd
want to fall through into the VIEW_CONVERT_EXPR code below this.
And which other spots need the same distinction.
Perhaps check if the bitsize (or precision?) of the vector type's mode is equal
to the bitsize (or precision?) of the element mode times the number of elements
in the vector?
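
I.e. something like this completely untested sketch (poor_mans_vector_bool_p is
a made-up name, not an existing helper):

static bool
poor_mans_vector_bool_p (scalar_int_mode int_mode, tree type)
{
  /* Bitsize of one element's mode: 64 for the DImode lanes in this PR,
     but 8 (QImode) for the 1-bit booleans of real bitmask types.  */
  unsigned int elt_bits
    = GET_MODE_BITSIZE (SCALAR_INT_TYPE_MODE (TREE_TYPE (type)));
  /* If full-width elements exactly fill the mode, the elements are just
     laid out back to back, so falling through to the VIEW_CONVERT_EXPR
     code is the right expansion.  */
  return known_eq (TYPE_VECTOR_SUBPARTS (type) * elt_bits,
                   GET_MODE_BITSIZE (int_mode));
}

and in the VECTOR_CST case above only call const_scalar_mask_from_tree when
!poor_mans_vector_bool_p (int_mode, TREE_TYPE (exp)).  For the non-partial
integer modes involved here GET_MODE_BITSIZE and GET_MODE_PRECISION agree,
so either should do.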
