https://gcc.gnu.org/bugzilla/show_bug.cgi?id=68378

Jakub Jelinek <jakub at gcc dot gnu.org> changed:

           What    |Removed                     |Added
----------------------------------------------------------------------------
                 CC|                            |hubicka at gcc dot gnu.org,
                   |                            |jakub at gcc dot gnu.org

--- Comment #3 from Jakub Jelinek <jakub at gcc dot gnu.org> ---
Seems the reason why tree NRV does not optimize this is alignment.
DECL_ALIGN of the RESULT_DECL is 32 bits, while x has DECL_ALIGN bumped to 128
bits by:
#0  align_local_variable (decl=<var_decl 0x7ffff7ff9cf0 x>) at
../../gcc/cfgexpand.c:366
#1  0x0000000000854089 in add_stack_var (decl=<var_decl 0x7ffff7ff9cf0 x>) at
../../gcc/cfgexpand.c:439
#2  0x0000000000857cfc in expand_one_var (var=<var_decl 0x7ffff7ff9cf0 x>,
toplevel=true, really_expand=false) at ../../gcc/cfgexpand.c:1625
#3  0x0000000000858805 in estimated_stack_frame_size (node=<cgraph_node*
0x7ffff183a000 "g">) at ../../gcc/cfgexpand.c:1903
#4  0x0000000000af27ee in compute_inline_parameters (node=<cgraph_node*
0x7ffff183a000 "g">, early=true) at ../../gcc/ipa-inline-analysis.c:2915
#5  0x0000000000af2bc7 in compute_inline_parameters_for_current () at
../../gcc/ipa-inline-analysis.c:2982

If I undo that in the debugger during compute_inline_parameters, then NRV
happily optimizes this.
Now, once we bump DECL_ALIGN of x somewhere, we can't be sure whether some
optimization has already used that value, so it might be too late to undo it at
the NRV point.
Dunno what to do here though: undo all DECL_ALIGN changes performed during
compute_inline_parameters (but then for vars that are not going to be NRV
optimized this would be a pessimization), or undo DECL_ALIGN changes only for
vars that could be NRV optimized (those that appear in a return stmt, perhaps
something more specific)?
