"H. J. Lu" <[EMAIL PROTECTED]> writes:

> It looks like a gcc bug to me. Gcc 4.2 miscompiles:
>
>   more_than_enough_bits_for_digits
>     = (number_of_digits_to_use * 3321928 / 1000000 + 1);
>
> in atof_generic. When number_of_digits_to_use == 1, gcc 4.2 -O2 gets
> more_than_enough_bits_for_digits as 4. The correct value is 37.
Why do you think 3321928 / 1000000 == 36?

Andreas.

-- 
Andreas Schwab, SuSE Labs, [EMAIL PROTECTED]
SuSE Linux Products GmbH, Maxfeldstraße 5, 90409 Nürnberg, Germany
PGP key fingerprint = 58CA 54C7 6D53 942B 1756 01D3 44D5 214B 8276 4ED5
"And now for something completely different."