https://gcc.gnu.org/bugzilla/show_bug.cgi?id=113459
--- Comment #4 from rguenther at suse dot de <rguenther at suse dot de> ---
On Thu, 18 Jan 2024, jakub at gcc dot gnu.org wrote:

> https://gcc.gnu.org/bugzilla/show_bug.cgi?id=113459
>
> --- Comment #3 from Jakub Jelinek <jakub at gcc dot gnu.org> ---
> (In reply to Richard Biener from comment #2)
> >   unsigned buflen = TREE_INT_CST_LOW (TYPE_SIZE_UNIT (vr->type)) + 1;
> >   if (INTEGRAL_TYPE_P (vr->type))
> >     buflen = GET_MODE_SIZE (SCALAR_INT_TYPE_MODE (vr->type)) + 1;
> >
> > there's other spots using the pattern
> >
> >   if (INTEGRAL_TYPE_P (type))
> >     sz = GET_MODE_SIZE (SCALAR_INT_TYPE_MODE (type));
> >
> > I wonder when GET_MODE_SIZE differs from TYPE_SIZE_UNIT?  PSImode?
>
> I'm afraid I don't know either.
>
> > Packed bitfields?  r10-6885-g5f9cd512c42786 added the INTEGRAL_TYPE_P
> > special-casing we now run into, with the TYPE_SIZE_UNIT code being there
> > before.
> >
> > Jakub, do you remember?
>
> I bet the above comes from what native_{encode,interpret}_int has been
> doing (and which has been tweaked for BITINT_TYPE).
> If we don't want to throw it away, we could just change it to
> if (INTEGRAL_TYPE_P (...) && TYPE_MODE (...) != BLKmode)

Yeah, that might work.  There are a few occurrences in tree-ssa-sccvn.cc
that need such adjustment.
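For reference, an illustrative sketch (not an actual patch) of what the
guarded variant could look like at the buflen spot quoted above -- the
variable names vr and buflen come from the excerpt in comment #2, the
guard is Jakub's suggestion from comment #3, and the BLKmode case is
presumably a large BITINT_TYPE:

  /* Sketch only: keep the mode-based size for integral types, but avoid
     SCALAR_INT_TYPE_MODE when TYPE_MODE is BLKmode (e.g. a large
     BITINT_TYPE), falling back to TYPE_SIZE_UNIT in that case.  */
  unsigned buflen = TREE_INT_CST_LOW (TYPE_SIZE_UNIT (vr->type)) + 1;
  if (INTEGRAL_TYPE_P (vr->type)
      && TYPE_MODE (vr->type) != BLKmode)
    buflen = GET_MODE_SIZE (SCALAR_INT_TYPE_MODE (vr->type)) + 1;

The same guard would apply to the other spots in tree-ssa-sccvn.cc that
use the INTEGRAL_TYPE_P / SCALAR_INT_TYPE_MODE pattern shown above.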