http://gcc.gnu.org/bugzilla/show_bug.cgi?id=55145
--- Comment #8 from Vincent Lefèvre <vincent-gcc at vinc17 dot net> 2012-11-04 23:43:44 UTC ---
(In reply to comment #7)
> Here are different internal values from the same input:
>
> 32-bit long: 1.57079632679489661925640447970309310221637133509
> Input:       1.5707963267948966192021943710788178805159986950457096099853515625
> 64-bit long: 1.57079632679489661914798426245454265881562605500221252441
>
> Input value is extremely close to a half-way value between 32-bit
> and 64-bit longs.

1.5707963267948966192021943710788178805159986950457096099853515625 is
*exactly* the 65-bit binary number

  1.1001001000011111101101010100010001000010110100011000010001101001

thus exactly a halfway value between two consecutive long double numbers
(for 64-bit precision):

  1.100100100001111110110101010001000100001011010001100001000110100
and
  1.100100100001111110110101010001000100001011010001100001000110101

I suppose that the difference is due to the fact that the algorithm used
in GCC has not been written to round correctly, and if this algorithm uses
variables of type "long" internally, it is not surprising to get different
results on different architectures (32-bit long and 64-bit long).
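As a check (a minimal sketch, assuming MPFR is available; the file name and
output labels are arbitrary), one can parse the decimal string at 65-bit
precision to confirm it is exact there, then round it to the 64-bit precision
of x87 long double; round-to-nearest-even must select the neighbor whose last
significand bit is 0, i.e. the one ending in ...000110100:

#include <stdio.h>
#include <mpfr.h>

int main(void)
{
    /* The decimal input quoted above. */
    const char *input =
        "1.5707963267948966192021943710788178805159986950457096099853515625";
    mpfr_t x65, x64;
    int inexact;

    /* 65 bits of precision: the ternary value returned by
       mpfr_strtofr is 0 iff the string is exactly representable. */
    mpfr_init2(x65, 65);
    inexact = mpfr_strtofr(x65, input, NULL, 10, MPFR_RNDN);
    printf("exact in 65 bits: %s\n", inexact == 0 ? "yes" : "no");

    /* Round to 64 bits (x87 long double precision); nearest-even
       picks the neighbor whose last significand bit is 0. */
    mpfr_init2(x64, 64);
    mpfr_set(x64, x65, MPFR_RNDN);
    mpfr_printf("correctly rounded: %Rb\n", x64);

    mpfr_clear(x65);
    mpfr_clear(x64);
    return 0;
}

(Build with e.g. "gcc check.c -lmpfr -lgmp".) The correctly rounded result is
the same on any host, whatever the width of "long"; this is what GCC's
internal decimal-to-binary conversion should reproduce.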