https://gcc.gnu.org/bugzilla/show_bug.cgi?id=109093
--- Comment #18 from Jakub Jelinek <jakub at gcc dot gnu.org> ---
I'd like to understand what the difference is between those .DEFERRED_INITs on
unused vars and normal vars.
If I try
typedef int V __attribute__((vector_size (16)));
__attribute__((noipa)) void
foo ()
{
  V v;
  V *p = &v;
  asm ("" : "+r" (p));		/* hide from the compiler that p points to v */
  *p = (V) {};			/* 16-byte vector store, assumes *p is 16-byte aligned */
  asm volatile ("" : : : "r12", "r13", "r14");	/* clobber 3 call-saved GPRs */
}
__attribute__((noipa)) void
bar ()
{
  V v;
  V *p = &v;
  asm ("" : "+r" (p));
  *p = (V) {};
  asm volatile ("" : : : "r12", "r13");	/* clobber only 2 call-saved GPRs, so the frame layout differs from foo */
}
int
main ()
{
  foo ();
  bar ();
}
with either -O2 -mavx or -O2 -mavx -fno-omit-frame-pointer, then everything is
properly aligned (v gets a 16-byte aligned stack slot in both functions).
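
For comparison, something along these lines is what I have in mind as the
.DEFERRED_INIT counterpart (just a sketch; baz and the unused u are made-up
names), compiled with e.g. -O2 -mavx -ftrivial-auto-var-init=zero so that both
u and v get a .DEFERRED_INIT, u's being the unused-var case:

typedef int V __attribute__((vector_size (16)));

__attribute__((noipa)) void
baz ()
{
  V u;		/* never used otherwise, only its .DEFERRED_INIT touches it */
  V v;		/* normal var, stored to through p below */
  V *p = &v;
  asm ("" : "+r" (p));
  *p = (V) {};
  asm volatile ("" : : : "r12", "r13", "r14");
}

int
main ()
{
  baz ();
}

That should make it possible to compare how the slot for the unused u is
handled against the normal v.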