https://gcc.gnu.org/bugzilla/show_bug.cgi?id=105260
--- Comment #8 from Martin Jambor ---
(In reply to Jakub Jelinek from comment #7)
> Or find out why SRA doesn't optimize this (remove the useless union, replace
> all the un.value occurrences with a var with Foo type).
IIUC, it just isn't profitable.
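(A rough sketch of the transform being suggested in comment #7; the Foo
definition and the dummyFunc signature are assumptions, only the names come
from the thread. The union pattern itself is in the testcase under comment #5
below.)

struct Foo { unsigned v; };   // assumed definition
void dummyFunc(Foo);          // assumed signature

void transformed(unsigned raw)
{
    // no union left: a plain Foo-typed variable replaces every un.value occurrence
    Foo value;
    __builtin_memcpy(&value, &raw, sizeof value);
    dummyFunc(value);
}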
Jakub Jelinek changed:
  What | Removed | Added
  CC   |         | jakub at gcc dot gnu.org,
--- Comment #6 from m.cencora at gmail dot com ---
Furthermore, in all the scenarios the same function is called with the same
arguments, so the calling convention/ABI is the same.
--- Comment #5 from m.cencora at gmail dot com ---
I've slightly refactored the code to remove the auto variables. The issue
remains:
#include <cstring> // header name stripped in this copy; <cstring> assumed
inline unsigned deserializeUInt(const unsigned char* &in)
{
    unsigned out;
    // call truncated in this copy; completed as the usual byte-wise load
    __builtin_memcpy(&out, in, sizeof(out));
    in += sizeof(out);
    return out;
}
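(The rest of the testcase is cut off in this copy. A minimal sketch of the
presumable remainder, assuming the Foo definition and the dummyFunc signature;
only the names come from the other comments.)

struct Foo { unsigned v; };   // assumed definition
void dummyFunc(Foo);          // assumed signature, defined in another TU

void test(const unsigned char* in)
{
    union { unsigned raw; Foo value; } un;  // the union discussed in comments #7/#8
    un.raw = deserializeUInt(in);
    dummyFunc(un.value);
}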
--- Comment #4 from m.cencora at gmail dot com ---
I don't think the ABI is an issue here. The Foo variable is spilled onto the
stack and then reloaded back into the RDI register before invoking dummyFunc.
Also, clang generates optimal code, as can be seen here
--- Comment #3 from Richard Biener ---
That said, 'auto& arg' might just hide the interesting bit, but my C++ fu is
too weak to see how un.value would differ.
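(A minimal sketch of why auto& should not differ here, assuming un.value has
type Foo as comment #7 says; the Foo definition is an assumption:)

#include <type_traits>

struct Foo { unsigned v; };   // assumed definition

void sketch()
{
    union { unsigned raw; Foo value; } un{};
    auto& arg = un.value;     // deduces Foo&, exactly as an explicit Foo& would
    static_assert(std::is_same_v<decltype(arg), Foo&>);
    (void)arg;
}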
Richard Biener changed:
  What     | Removed | Added
  Keywords |         | ABI, wrong-code
--- Comment #2 from Richard Biener ---
--- Comment #1 from Richard Biener ---
Well, it simply looks like the calling conventions for dummyFunc differ with
the ABI in effect, depending on the PODness(?) of the argument type, and the
code is as optimized as it can be.
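(For reference, a minimal sketch of the ABI effect being alluded to; these two
types are illustrations, not from the testcase. Under the Itanium C++ ABI,
used on x86-64 Linux, a type with a non-trivial destructor or copy constructor
is passed by invisible reference, while a trivially copyable type of the same
size travels in a register:)

struct Pod        { unsigned v; };                   // trivially copyable
struct NonTrivial { unsigned v; ~NonTrivial() {} };  // user-provided dtor makes it non-trivial

void takePod(Pod);                // x86-64 SysV: the value arrives in %edi
void takeNonTrivial(NonTrivial);  // the caller materializes a temporary and passes its address in %rdi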