https://gcc.gnu.org/bugzilla/show_bug.cgi?id=121894
Richard Biener <rguenth at gcc dot gnu.org> changed:
           What            |Removed                     |Added
----------------------------------------------------------------------------
           Keywords|                            |missed-optimization
--- Comment #3 from Richard Biener <rguenth at gcc dot gnu.org> ---
So, given

  s = .DEFERRED_INIT (16, 1, &"s"[0]);
  _1 = s.b;

is it OK to "CSE" this to

  _1 = .DEFERRED_INIT (4, 1, <the-val>);

? But the second form is likely more costly, so only SRA should do this?
Is it only -ftrivial-auto-var-init=pattern that is an issue?