On Jun 1, 2022, Richard Biener <richard.guent...@gmail.com> wrote:

> On Tue, May 31, 2022 at 3:27 PM Alexandre Oliva <ol...@adacore.com> wrote:

>   int i;
>   if (flag)
>     i = init;
>   i++;
> still would (the propagation stops at i++).

Uh, are you sure?  That doesn't sound right.  I meant for the propagation to affect the incremented i as well, and any users thereof: the incremented i is maybe-undefined, since its computation involves another maybe-undefined value.

> do we understand the failure mode good enough to say that
> a must-def (even if from an SSA name without a definition) is good
> enough to avoid the issues we are seeing?

A must-def computed from a maybe-undef doesn't protect us from the failure.  I assumed the failure mode was understood well enough to make directly-visible undefined SSA_NAMEs problematic and, given the finding that indirectly-visible ones were also problematic, even ones multiple steps removed, I figured other optimizations such as jump threading could turn indirectly-visible undef names into directly-visible ones, or that other passes could take advantage of indirectly-visible undefinedness leading to potential undefined behavior, to the point that we ought to avoid that.

> One would argue that my example above invokes undefined behavior
> if (!flag), but IIRC the cases in the PRs we talk about are not but
> IVOPTs with its IV choice exposes undefined behavior - orignially
> by relying on undef - undef being zero.

*nod*.  Perhaps we could refrain from propagating through assignments, like the i++ increment above, but not through PHIs, like the one after the conditional increment in my modified testcase, on the grounds that, if such non-PHI assignments are exercised, we can assume all of their operands are defined, because otherwise undefined behavior would already have been invoked.  I.e., at least for IVOPTs' purposes, we could propagate maybe-undefined down PHI nodes only.

> That said, the contains-undef thing tries to avoid rewriting expressions
> with terms that possibly contain undefs which means if we want to
> strenthen it then we look for must-defined (currently it's must-undefined)?
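To make the PHI-only proposal concrete, here is a toy model (not GCC code; the names and data structures are made up for illustration) of propagating the maybe-undefined property through PHI nodes only, with plain assignments acting as barriers, since executing a non-PHI statement that uses an undefined operand would already invoke undefined behavior:

```python
def maybe_undef_names(defs, undef):
    """Toy fixpoint: defs maps an SSA-like name to ('phi', [args]) or
    ('assign', [operands]); undef is the set of names with no
    definition at all.  Returns the set of maybe-undefined names."""
    maybe = set(undef)
    changed = True
    while changed:
        changed = False
        for name, (kind, ops) in defs.items():
            if name in maybe or kind != 'phi':
                continue  # plain assignments are barriers: skip them
            if any(op in maybe for op in ops):
                maybe.add(name)  # PHI with a maybe-undef arg
                changed = True
    return maybe

# Mirrors the i/init example: i_1 is undefined on the !flag path,
# i_2 = PHI <i_1, init_4> is maybe-undefined, but i_3 = i_2 + 1 is a
# plain assignment, so under this policy propagation stops there.
defs = {
    'i_2': ('phi', ['i_1', 'init_4']),
    'i_3': ('assign', ['i_2']),
}
print(sorted(maybe_undef_names(defs, {'i_1'})))  # prints ['i_1', 'i_2']
```

Under the earlier, more aggressive policy discussed above, i_3 would be maybe-undefined as well; the barrier at non-PHI statements is what the PHI-only variant changes.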
I went for must-defined in the patch, but by computing its negated form, maybe-undefined.  Now I'm thinking we can go for an even stricter predicate for disabling the optimization: if a non-PHI use of a maybe-undefined name dominates the loop, then we can still perform the optimization:

>> >> +  int g;
>> >> +  if (f)
>> >> +    g++, b = 40;

   y = g + 2;
   [loop]

The mere use of g before the loop requires it to be defined (even though that's impossible above: either path carries the uninitialized value, incremented or not), so we can proceed with the optimization under the assumption that, after undefined behavior, anything goes.

This would be even more useful for the case you showed, in which a conditional initialization is followed by an unconditional use.  The requirement of no undefined behavior implies that the conditional guarding the initialization must hold, so the optimization can be performed.

WDYT?

-- 
Alexandre Oliva, happy hacker                https://FSFLA.org/blogs/lxo/
   Free Software Activist                       GNU Toolchain Engineer
Disinformation flourishes because many people care deeply about injustice
but very few check the facts.  Ask me about <https://stallmansupport.org>