https://gcc.gnu.org/bugzilla/show_bug.cgi?id=91555

            Bug ID: 91555
           Summary: [9.2 Regression] Wrong code at -O2 in timezone-handling
                    code
           Product: gcc
           Version: 9.2.0
            Status: UNCONFIRMED
          Severity: normal
          Priority: P3
         Component: tree-optimization
          Assignee: unassigned at gcc dot gnu.org
          Reporter: skunk at iskunk dot org
  Target Milestone: ---

Created attachment 46761
  --> https://gcc.gnu.org/bugzilla/attachment.cgi?id=46761&action=edit
Quasi-minimal test program

I have encountered what appears to be a bug in the optimizer of GCC 9.2.0. (I
am not sure whether the tree optimizer or the RTL optimizer is at fault;
please recategorize as appropriate.)

The problem occurs inside some timezone-handling code that is part of a larger
project. I have extracted the code into a program that may not be a minimal
test case, but nevertheless exhibits the problem.

This incorrect optimization has been observed in x86-64 builds on both Linux
and Solaris.

Correct result:

    $ gcc -O1 gcc9-opt-bug.c -o bug
    $ ./bug
    result: t = 1009489458 (correct)

Wrong result:

    $ gcc -O2 gcc9-opt-bug.c -o bug
    $ ./bug
    WRONG 13
    result: t = 18446744073709551615
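
(18446744073709551615 is 2^64 - 1, i.e. the value -1 printed as an unsigned
64-bit integer; presumably the conversion routine fails and returns (time_t)-1
in the -O2 build.)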

Workaround using the "volatile" keyword:

    $ gcc -O2 -DUSE_VOLATILE gcc9-opt-bug.c -o bug
    $ ./bug
    result: t = 1009489458 (correct)
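
The attached program is not quoted here, but presumably the USE_VOLATILE
switch just adds a volatile qualifier to a local used in the time search. A
minimal sketch of that kind of guard, with purely illustrative identifiers
(none of them are taken from the attachment):

    #include <stdio.h>
    #include <time.h>

    /* Illustrative workaround pattern: with -DUSE_VOLATILE, the working
       variable is declared volatile, forcing it to be re-read from memory
       and discouraging the transformation that goes wrong at -O2. */
    #ifdef USE_VOLATILE
    # define MAYBE_VOLATILE volatile
    #else
    # define MAYBE_VOLATILE
    #endif

    /* Toy stand-in for the timezone search: bisect over time_t for the
       smallest value in [lo, hi] satisfying pred(). */
    static time_t
    bisect(int (*pred)(time_t), time_t lo, time_t hi)
    {
        MAYBE_VOLATILE time_t t;
        while (lo < hi) {
            t = lo + (hi - lo) / 2;
            if (pred(t))
                hi = t;
            else
                lo = t + 1;
        }
        return lo;
    }

    static int
    at_least_target(time_t t)
    {
        return t >= (time_t)1009489458;
    }

    int
    main(void)
    {
        printf("result: t = %lld\n",
               (long long)bisect(at_least_target, (time_t)0,
                                 (time_t)2000000000));
        return 0;
    }

This is only a sketch of the workaround mechanism; the attached
gcc9-opt-bug.c is the authoritative reproducer.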

This code was correctly optimized by GCC 4.8.1, so this appears to be a
regression.
