This C++ test case passes a bitfield value to a function which expects a const reference. When optimizing, the function gets the wrong value: it sees 0x1000000, which is the unmasked result of incrementing the 24-bit bitfield; the correct value, after the increment wraps within the 24-bit field, is 0. This failure happens on i686-pc-linux-gnu with -O2. The test passes without optimization.
extern "C" void abort() __attribute__ ((noreturn));

template<typename t1, typename t2>
void fn(const t1&, const t2&) __attribute__ ((noinline));

template<typename t1, typename t2>
void fn(const t1& v1, const t2& v2)
{
  if (v1 != v2)
    abort();
}

struct s
{
  unsigned long long f1 : 40;
  unsigned int f2 : 24;
};

s sv;

int main()
{
  sv.f1 = 0;
  sv.f2 = (1 << 24) - 1;
  fn(sv.f1, 0);
  fn(sv.f2, (1 << 24) - 1);
  ++sv.f2;
  fn(sv.f2, 0);
  return 0;
}

--
Summary: Reference to bitfield gets wrong value when optimizing
Product: gcc
Version: 4.2.0
Status: UNCONFIRMED
Severity: normal
Priority: P3
Component: c++
AssignedTo: unassigned at gcc dot gnu dot org
ReportedBy: ian at airs dot com

http://gcc.gnu.org/bugzilla/show_bug.cgi?id=33887