https://gcc.gnu.org/bugzilla/show_bug.cgi?id=97976
            Bug ID: 97976
           Summary: Optimization regression in 10.1 for lambda passed as
                    template argument
           Product: gcc
           Version: 10.0
            Status: UNCONFIRMED
          Severity: normal
          Priority: P3
         Component: c++
          Assignee: unassigned at gcc dot gnu.org
          Reporter: peter at int19h dot net
  Target Milestone: ---
The following code (valid under C++11 and later):
////////////////////
extern const int* data;

template<typename T> bool func(T callback)
{
    for (const int* pi = data; pi; ++pi)
    {
        if (callback(*pi))
        {
            return false;
        }
    }
    return true;
}

bool f0(int i)
{
    return func([i](const int j){ return i == j; });
}
////////////////////
compiled with GCC 10.1 and the "-std=c++11 -O2" flags generates the following:
////////////////////
f0(int):
        cmp     QWORD PTR data[rip], 0
        sete    al
        ret
////////////////////
while GCC 9.3, with the same command-line flags, generates the following:
////////////////////
f0(int):
        mov     rax, QWORD PTR data[rip]
        test    rax, rax
        jne     .L3
        jmp     .L4
.L7:
        add     rax, 4
.L3:
        cmp     edi, DWORD PTR [rax]
        jne     .L7
        xor     eax, eax
        ret
.L4:
        mov     eax, 1
        ret
////////////////////
This regression appears to have started with GCC 10, and it shows up at the -O2
optimization level and above for C++11 and later. I have tested this with Clang
and MSVC, and both generate code similar to what GCC 9.3 produces.
This behavior can also be seen in the Compiler Explorer here:
https://godbolt.org/z/r4zMnc
Thank you!
--peter