Vincent Lefevre wrote:
> BTW, unpredictability, such as in bug 323, is not a bug (according to the C standard). This may be seen as a bad behavior and changing this behavior would be a great improvement, but I don't complain about it here when saying "bug".
Everyone would agree that unnecessary non-determinism is, per se, a bad thing. However, most people would also agree that poor performance is a bad thing, and all this latitude for extra precision exists precisely to allow efficient code. What is lacking in this discussion is a good quantitative measurement, over a reasonable set of benchmarks, of what eliminating the excess precision in all cases would cost. Note that just setting the precision to 64 bits is not enough if you agree with Vincent that 32-bit float variables have to be normalized on every assignment.

Data would help. Almost everyone will agree to eliminating the extra precision if it costs only 3%; almost everyone will disagree if it doubles execution time (I suspect the answer is somewhere in that range :-)