https://gcc.gnu.org/bugzilla/show_bug.cgi?id=99824

--- Comment #4 from Richard Biener <rguenth at gcc dot gnu.org> ---
OK, so I guess things go wrong in wi::min_value, where we never check
that the precision we're asking for (384) fits in a wide_int;
WIDE_INT_MAX_PRECISION should be 160 (MAX_BITSIZE_MODE_ANY_INT) rounded up
to a multiple of 64 bits, thus 192.  The oversized write then clobbers the
stack somewhere and things go wrong.
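
To make the failure mode concrete, here is a standalone sketch (simplified,
made-up names; this is not GCC's actual wide-int code) of how a fixed-size
on-stack buffer sized for WIDE_INT_MAX_PRECISION bits gets overrun when a
larger precision is requested:

/* Standalone illustration only -- names and layout are simplified and do
   not match GCC's real wide_int implementation.  */
#include <cstdint>
#include <cstdio>

constexpr int HOST_BITS = 64;
constexpr int MAX_INT_PRECISION = 160;  /* stands in for MAX_BITSIZE_MODE_ANY_INT */
constexpr int MAX_ELTS = (MAX_INT_PRECISION + HOST_BITS - 1) / HOST_BITS;  /* 3 */
constexpr int MAX_PRECISION = MAX_ELTS * HOST_BITS;                        /* 192 */

struct toy_wide_int
{
  uint64_t val[MAX_ELTS];   /* fixed storage: enough for MAX_PRECISION bits only */
};

/* Build the two's-complement minimum value of the given precision.
   Nothing checks that PRECISION fits, so precision == 384 needs 6
   elements and writes past the end of the 3-element on-stack array,
   clobbering the stack -- the same kind of overrun suspected here.  */
static toy_wide_int
toy_min_value (int precision)
{
  toy_wide_int w;
  int elts = (precision + HOST_BITS - 1) / HOST_BITS;
  for (int i = 0; i < elts; i++)   /* overruns w.val once elts > MAX_ELTS */
    w.val[i] = (i == elts - 1
                ? (uint64_t) 1 << ((precision - 1) % HOST_BITS)
                : 0);
  return w;
}

int
main ()
{
  printf ("MAX_PRECISION = %d\n", MAX_PRECISION);  /* prints 192 */
  toy_min_value (384);  /* undefined behavior: out-of-bounds stack write */
  return 0;
}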

This was likely uncovered by the PR98834 "fix"
g592388d4f6e8a6adb470428fef6195694f4a3dce (though that fix by itself
shouldn't expose such large precisions).

I'd still like to reproduce it to see what chain of events leads to such a
large requested precision.  On trunk, something like

diff --git a/gcc/stor-layout.c b/gcc/stor-layout.c
index 784f131ebb8..94b8b21c7a8 100644
--- a/gcc/stor-layout.c
+++ b/gcc/stor-layout.c
@@ -2838,6 +2838,8 @@ set_min_and_max_values_for_integral_type (tree type,
   if (precision < 1)
     return;

+  gcc_assert (precision <= WIDE_INT_MAX_PRECISION);
+
   TYPE_MIN_VALUE (type)
     = wide_int_to_tree (type, wi::min_value (precision, sgn));
   TYPE_MAX_VALUE (type)

should uncover any similar issue and eventually allow producing smaller
testcases.
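
If the bare assert turns out to be too terse for tracking down where the
oversized precision originates, a variant that reports the value before
aborting could be used instead -- again just a sketch, not a tested patch:

  /* Same check as in the diff above, but print the offending precision
     first so the backtrace shows which value was requested.  Sketch only.  */
  if (precision > (int) WIDE_INT_MAX_PRECISION)
    {
      fprintf (stderr,
               "set_min_and_max_values_for_integral_type: precision %d "
               "exceeds WIDE_INT_MAX_PRECISION (%d)\n",
               precision, (int) WIDE_INT_MAX_PRECISION);
      gcc_unreachable ();
    }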
