https://gcc.gnu.org/bugzilla/show_bug.cgi?id=99830
--- Comment #14 from Segher Boessenkool <segher at gcc dot gnu.org> ---
(In reply to Jakub Jelinek from comment #13)
> Seems the exact spot where the clobber is optimized away is e.g. when
> simplify_and_const_int_1 (SImode, (ashift:SI (subreg:SI (and:TI (clobber:TI
> (const_int 0 [0])) (const_int 255 [0xff])) 0) (const_int 16 [0x10])), 255);
> is called.
> It calls nonzero_bits, nonzero_bits sees VARYING << 16 and so returns
> 0xffff0000,
>   /* Turn off all bits in the constant that are known to already be zero.
>      Thus, if the AND isn't needed at all, we will have CONSTOP == NONZERO_BITS
>      which is tested below.  */
>
>   constop &= nonzero;
>
>   /* If we don't have any bits left, return zero.  */
>   if (constop == 0)
>     return const0_rtx;
>
> So, are you suggesting that in all such spots we need to test side_effects_p
> and punt?

Yes, you do need to check side_effects_p *everywhere* you can potentially
remove a side effect.  This is not specific to combine, even.

> Note, simplify_and_const_int_1 already starts with:
>   if (GET_CODE (varop) == CLOBBER)
>     return NULL_RTX;
> so it would need to use
>   if (side_effects_p (varop))
>     return NULL_RTX;
> instead.

Yeah.  This no longer disallows a VOIDmode clobber, but we should not see
those here anyway.

You'll need the same change a few lines later, btw:

  varop = force_to_mode (varop, mode, constop, 0);

  /* If VAROP is a CLOBBER, we will fail so return it.  */
  if (GET_CODE (varop) == CLOBBER)
    return varop;

(you only need that second one, even: force_to_mode immediately returns its
arg if it is a clobber.)
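
For concreteness, here is a rough, untested sketch of how those two spots in
simplify_and_const_int_1 could then read, based only on the lines quoted
above (everything else in the function is elided):

  /* Near the top of simplify_and_const_int_1: punt on anything with side
     effects, not just a toplevel CLOBBER.  side_effects_p also catches a
     non-VOIDmode CLOBBER buried inside VAROP, as in the ashift expression
     from comment #13.  */
  if (side_effects_p (varop))
    return NULL_RTX;

  /* ... */

  varop = force_to_mode (varop, mode, constop, 0);

  /* If VAROP has side effects (e.g. is or contains a CLOBBER), we will
     fail, so return it.  */
  if (side_effects_p (varop))
    return varop;

With either check in place, the expression from comment #13 should bail out
before the "constop &= nonzero" path can fold it to const0_rtx and silently
drop the clobber.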