https://gcc.gnu.org/bugzilla/show_bug.cgi?id=111694

            Bug ID: 111694
           Summary: Wrong behavior for signbit of negative zero when
                    optimizing
           Product: gcc
           Version: 13.2.0
            Status: UNCONFIRMED
          Severity: normal
          Priority: P3
         Component: web
          Assignee: unassigned at gcc dot gnu.org
          Reporter: alonzakai at gmail dot com
  Target Milestone: ---

The testcase below behaves incorrectly with GCC 13.2.0 at -O1 and above, but is
correct with GCC 12, 11, and 10. It is also correct with 13.2.0 without optimization.

Testcase:


#include <math.h>
#include <stdio.h>

void test(double l, double r) {
  if (l == r && l == 0 && (signbit(l) || signbit(r))) {
    puts("one is negative");
  }
}

int main() {
  test(0.0, -0.0);
  test(-0.0, 0.0);
}
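
For reference, under IEEE 754 the two zeros compare equal and only signbit()
tells them apart, which is why the condition is expected to hold for both
calls. A minimal standalone check of those two facts:

#include <assert.h>
#include <math.h>

int main() {
  assert(0.0 == -0.0);    /* +0.0 and -0.0 compare equal */
  assert(!signbit(0.0));  /* sign bit is clear on +0.0 */
  assert(signbit(-0.0));  /* sign bit is set on -0.0 */
}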


This should print "one is negative" twice, but only does so once in 13.2.0 with
-O1:

$ gcc-13 a.c -O1 ; ./a.out
one is negative
$ gcc-13 a.c -O0 ; ./a.out
one is negative
one is negative
$
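
A possible workaround in user code (just a guess at how the folding goes wrong,
not verified against the generated assembly) is to evaluate the sign bits
before the equality comparisons, so the signbit() results cannot be rewritten
using the l == r equivalence:

#include <math.h>
#include <stdio.h>

void test(double l, double r) {
  /* Evaluate the sign bits first; l == r has not been established at this
     point, so neither argument should be substituted for the other here. */
  int neg = signbit(l) || signbit(r);
  if (l == r && l == 0 && neg) {
    puts("one is negative");
  }
}

int main() {
  test(0.0, -0.0);
  test(-0.0, 0.0);
}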
