I'm not sure whether this is the right place to report this bug. I reported it to Apple on March 2nd, 2006, but as of today (May 17th) it is still open in their database.
Here's the description:

I've noticed that my application behaves differently depending on whether optimization is turned on or off in Xcode 2.2 using gcc-4.0.1. I managed to reduce the code to a minimal test case that still reproduces the problem. With the following code, in a standard C++ tool project in Xcode, I get a result of 1 without optimization (the expected result), but a result of 0 with -O1 (the wrong result). I'm running on a dual G5, compiling for PPC.

    #include <iostream>

    static bool func2(bool bool1, bool bool2)
    {
        if (bool1 == bool2)
            return true;
        else
            return false;
    }

    bool func1(unsigned long arg1, unsigned long arg2)
    {
        return (func2(!(arg1 & 1), !(arg2 & 1)) && !(arg1 & 1));
    }

    int main(int argc, char * const argv[])
    {
        bool result = func1(0, 0);
        std::cout << "result = " << result << "\n";
        return 0;
    }

--
          Summary: Different results depending on optimization level with gcc-4.0.1
          Product: gcc
          Version: 4.0.1
           Status: UNCONFIRMED
         Severity: normal
         Priority: P3
        Component: c++
       AssignedTo: unassigned at gcc dot gnu dot org
       ReportedBy: scharest at druide dot com

http://gcc.gnu.org/bugzilla/show_bug.cgi?id=27647