Hi, there is a regression on RISC-V caused by this patch:
FAIL: gcc.dg/vect/pr111754.c -flto -ffat-lto-objects scan-tree-dump optimized "return { 0.0, 9.0e\\+0, 0.0, 0.0 }"
FAIL: gcc.dg/vect/pr111754.c scan-tree-dump optimized "return { 0.0, 9.0e\\+0, 0.0, 0.0 }"
I have checked, and the optimized dump is:
F foo (F a, F b)
{
  <bb 2> [local count: 1073741824]:
  <retval> = { 0.0, 9.0e+0, 0.0, 0.0 };
  return <retval>;
}
The IR in the dump seems reasonable to me; the test only fails because the scan pattern expects the constant directly in the return statement, whereas on RISC-V it is assigned to <retval> and then returned.
I wonder whether we should work around this in the RISC-V backend so that it generates the same IR as ARM SVE, or whether we should adjust the test instead (a possible adjustment is sketched below)?
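For example, one way to relax the check (just a sketch; I have not checked the exact dg-final directive or escaping currently used in the test) would be to drop the leading "return" anchor so the pattern also matches the <retval> assignment form:

/* { dg-final { scan-tree-dump "{ 0.0, 9.0e\\+0, 0.0, 0.0 }" "optimized" } } */

That would still verify the constructor is folded to the expected constant vector on both targets.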
Thanks.
[email protected]