https://gcc.gnu.org/bugzilla/show_bug.cgi?id=71815
--- Comment #5 from Bill Schmidt <wschmidt at gcc dot gnu.org> ---
I'll note that in the case where the stride is known (slsr-35.c), SLSR is
making an at least somewhat rational cost-based decision not to
strength-reduce the PHI candidate.  In this case the stride is a power of 2,
so the benefit of replacing a multiply is pretty low: SLSR would have to
replace the multiply with an add, and introduce two adds for the PHI
arguments.

There is a somewhat simplistic calculation of the dead code savings from
removing the PHI itself, in which only PHI arguments that have a single use
are considered removable.  In this case that is overly conservative, because
the only other use of _17 is in the definition of _18, which is judged to go
dead.  If we aggressively assumed we would also remove the definition of
_17, SLSR would make the transformation, since the extra savings would bring
the net cost to zero, and we currently do optimize when the result looks
neutral.  So there is an opportunity here to be more intelligent about the
dead code savings estimation (sketched at the end of this comment).

Processing dependency tree rooted at 1.
  Conditional candidate 7:
    add_costs = 12      # Replace mult and phi args
    mult_savings = 4    # It's a shift, so low cost
    dead_savings = 4    # Really should be 8
    cost = 4            # Really should be 0
  Not replaced.         # Could replace, but it's on the bubble.

However, for the case of an unknown stride as in slsr-36.c, this seems
unlikely to be a close decision, so I suspect something worse is wrong
there.  Will analyze that next.
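
To make the arithmetic in the dump concrete, here is a minimal,
self-contained sketch of the decision, assuming we replace whenever the net
cost is non-positive ("we optimize when the result looks neutral").  The
struct and function names are invented for illustration; they are not the
actual interfaces in gimple-ssa-strength-reduction.c.

#include <cstdio>

/* Invented cost record; mirrors the fields shown in the dump.  */
struct phi_cand_costs
{
  int add_costs;     /* adds introduced: mult -> add, plus the PHI args  */
  int mult_savings;  /* multiply removed (a shift here, so low cost)     */
  int dead_savings;  /* statements estimated to go dead                  */
};

/* Net cost of performing the replacement.  */
static int
net_cost (const phi_cand_costs &c)
{
  return c.add_costs - c.mult_savings - c.dead_savings;
}

int
main ()
{
  /* Current conservative estimate: only single-use PHI arguments are
     credited, so only the definition of _18 counts.  */
  phi_cand_costs current = { 12, 4, 4 };

  /* Aggressive estimate: also credit _17, whose only other use is in
     the definition of _18, itself judged to go dead.  */
  phi_cand_costs aggressive = { 12, 4, 8 };

  printf ("current:    cost = %d -> %s\n", net_cost (current),
          net_cost (current) <= 0 ? "replaced" : "not replaced");
  printf ("aggressive: cost = %d -> %s\n", net_cost (aggressive),
          net_cost (aggressive) <= 0 ? "replaced" : "not replaced");
  return 0;
}

This prints cost = 4 / not replaced for the current estimate and cost = 0 /
replaced for the aggressive one, matching the "really should be" annotations
in the dump above.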
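
And a sketch of what a smarter dead code savings estimate could look like:
credit a definition as removable when every one of its uses is either the
statement being replaced or another definition already judged removable, so
that _17 is picked up once _18 is.  This is a toy fixed-point walk over an
invented use-list representation, not a patch against the SLSR candidate
machinery.

#include <cstdio>
#include <map>
#include <set>
#include <string>
#include <vector>

/* Toy SSA view, invented for illustration: each definition's cost and
   the names of the definitions that use it.  */
struct toy_def
{
  int cost;
  std::vector<std::string> used_by;
};

/* Savings if REPLACED goes away: a def is dead when all of its users
   are REPLACED or already dead; iterate to a fixed point.  */
static int
transitive_dead_savings (const std::map<std::string, toy_def> &defs,
                         const std::string &replaced)
{
  std::set<std::string> dead = { replaced };
  bool changed = true;
  while (changed)
    {
      changed = false;
      for (const auto &d : defs)
        {
          if (dead.count (d.first))
            continue;
          bool all_dead = !d.second.used_by.empty ();
          for (const auto &u : d.second.used_by)
            if (!dead.count (u))
              all_dead = false;
          if (all_dead)
            {
              dead.insert (d.first);
              changed = true;
            }
        }
    }
  int savings = 0;
  for (const auto &name : dead)
    if (name != replaced)
      savings += defs.at (name).cost;
  return savings;
}

int
main ()
{
  /* Mirror slsr-35.c: _18 is used only by the replaced statement, and
     _17 is used only by _18's definition; a cost of 4 for each gives
     the "really should be 8" from the dump.  */
  std::map<std::string, toy_def> defs
    = { { "_17", { 4, { "_18" } } },
        { "_18", { 4, { "replaced" } } } };

  printf ("transitive dead savings = %d\n",
          transitive_dead_savings (defs, "replaced"));
  return 0;
}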