https://gcc.gnu.org/bugzilla/show_bug.cgi?id=111800
Jakub Jelinek <jakub at gcc dot gnu.org> changed:

           What    |Removed                     |Added
----------------------------------------------------------------------------
                 CC|                            |rsandifo at gcc dot gnu.org

--- Comment #6 from Jakub Jelinek <jakub at gcc dot gnu.org> ---
Ah, I see what's going on.
tree-ssa-ccp.cc uses widest_int extensively (I believe that is a mistake and it should be using wide_int instead), and in this case it has a widest_int with value -30 (get_len () == 1; get_precision () was 576 bits before my changes, 32640 bits with them).
Now, print_hex (and likewise print_decu/print_decs/print_dec, since they always fall back to print_hex for larger precisions and when get_len () > 1) prints numbers according to the precision, so my checks to ensure the buffer is big enough:

  char buf[WIDE_INT_PRINT_BUFFER_SIZE], *p = buf;
  unsigned len = wi.get_len ();
  if (UNLIKELY (len > WIDE_INT_MAX_INL_ELTS))
    p = XALLOCAVEC (char, len * HOST_BITS_PER_WIDE_INT / 4 + 4);

are not correct; for wi::neg_p (wi) values they would need to check the precision instead.
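
To illustrate the undersizing (this is a standalone Python sketch of two's-complement hex printing, not GCC code): a negative value sign-extends to the full precision, so its hex form needs precision / 4 digits no matter how few elements get_len () reports.

```python
def twos_complement_hex(value, precision):
    """Hex digits of `value` sign-extended to `precision` bits,
    mimicking a wide int printed at its full precision."""
    mask = (1 << precision) - 1
    return format(value & mask, 'x')

# -30 is stored compactly as a single 64-bit element (get_len () == 1),
# so a buffer sized from len * 64 / 4 covers only 16 hex digits...
print(len(twos_complement_hex(-30, 64)))    # 16

# ...but printed at 576 bits of precision, the sign-extended value
# 2**576 - 30 needs 576 / 4 = 144 hex digits.
print(len(twos_complement_hex(-30, 576)))   # 144
```

For a nonnegative value the leading zeros are not printed, so sizing by get_len () is fine; it is only negative (sign-extended) values where the buffer must scale with the precision.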