On Fri, Nov 05, 2021 at 04:11:36PM +0000, Qing Zhao wrote:
> 3076           if (TREE_CODE (TREE_TYPE (lhs)) != BOOLEAN_TYPE
> 3077               && tree_fits_uhwi_p (var_size)
> 3078               && (init_type == AUTO_INIT_PATTERN
> 3079                   || !is_gimple_reg_type (var_type))
> 3080               && int_mode_for_size (tree_to_uhwi (var_size) * BITS_PER_UNIT,
> 3081                                     0).exists ())
> 3082             {
> 3083               unsigned HOST_WIDE_INT total_bytes = tree_to_uhwi (var_size);
> 3084               unsigned char *buf = (unsigned char *) xmalloc (total_bytes);
> 3085               memset (buf, (init_type == AUTO_INIT_PATTERN
> 3086                             ? INIT_PATTERN_VALUE : 0), total_bytes);
> 3087               tree itype = build_nonstandard_integer_type
> 3088                              (total_bytes * BITS_PER_UNIT, 1);
>
> The exact failing point is at function
> “set_min_and_max_values_for_integral_type”:
>
> 2851   gcc_assert (precision <= WIDE_INT_MAX_PRECISION);
>
> For _Complex long double, “precision” is 256.
> In GCC 11, “WIDE_INT_MAX_PRECISION” is 192; in GCC 12, it’s 512.
> As a result, the above assertion fails on GCC 11.
>
> I am wondering what’s the best fix for this issue in GCC 11?
Even for GCC 12 the above is wrong: you can't blindly assume that
build_nonstandard_integer_type will work for arbitrary precisions, or
that, even when it succeeds, the resulting type will actually be usable.
The fact that such a mode exists is one thing, but
targetm.scalar_mode_supported_p should also be tested to check whether
the mode is actually supported by the target.

	Jakub
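
[For reference, a minimal sketch of the kind of guard being suggested,
based on the code quoted above from the .DEFERRED_INIT expansion.  The
exact placement and variable name are assumptions about how the
suggestion could be applied, not the fix that was actually committed.]

  /* Sketch: in addition to checking that an integer mode of the
     requested size exists, ask the target whether that mode is
     actually supported before building a non-standard integer type
     for it.  */
  scalar_int_mode var_mode;
  if (TREE_CODE (TREE_TYPE (lhs)) != BOOLEAN_TYPE
      && tree_fits_uhwi_p (var_size)
      && (init_type == AUTO_INIT_PATTERN
          || !is_gimple_reg_type (var_type))
      && int_mode_for_size (tree_to_uhwi (var_size) * BITS_PER_UNIT,
                            0).exists (&var_mode)
      && targetm.scalar_mode_supported_p (var_mode))
    {
      /* ... build and use the integer type as before ... */
    }

With a guard along these lines, a target that has no usable integer
mode of the required width (e.g. 256 bits for _Complex long double)
would fall back to the non-integer initialization path instead of
tripping the assertion.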