> | > If the above is the only without Autoconf change, I would highly
> | > recommend Autoconf change if GCC optimizers highly value benchmarks
> | > over running real world code.
> |
> | Which one, mine or Paul's?
>
> If what you propose is the only way out, and there is no way to make
> GCC optimizers reasonable, then I believe Paul's proposal is the next
> option.
But that still does not address the issue that this is not just about GCC
any more, since autoconf can be used with many different compilers, and is
right now.  So if you change autoconf to default to -fwrapv and someone
comes along and tries to use it with, say, ACC (a made-up compiler) that
does not honor -fwrapv but treats signed type overflow as undefined just
like GCC does, the loop still goes into an infinite loop and autoconf is
still an issue.

If you want to make the code more readable and maintainable, you can use
macros like:

MAX_USING_TYPE(type, othertype, max) \
{ \
  ... \
  max = (othertype) _real_max; \
}

MAX_TYPE(type, max) \
{ \
  if (sizeof (type) == sizeof (unsigned int)) \
    MAX_USING_TYPE (unsigned int, type, max); \
  else if (..... \
  else \
    { \
      printf ("Need another integral type sized, %d\n", sizeof (type)); \
      abort (); \
    } \
}

Yes, people think macros can be less readable, but this is one case where
they actually make the code more readable; a fuller sketch follows below.

Thanks,
Andrew Pinski
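A minimal, compilable sketch of what such macros could look like.  The names
MAX_USING_TYPE and MAX_TYPE follow the pseudocode above; the ((utype) -1) >> 1
trick, the do/while wrappers, and the main driver are illustrative additions,
not part of the pseudocode, and the whole thing assumes two's complement, no
padding bits, and that each unsigned type has the same width as its signed
counterpart:

#include <stdio.h>
#include <stdlib.h>

/* Compute the maximum value of the signed type OTHERTYPE by doing the
   arithmetic in the unsigned type UTYPE, where wraparound is well defined.
   All-ones in UTYPE shifted right by one is the signed maximum, assuming
   two's complement and no padding bits.  */
#define MAX_USING_TYPE(utype, othertype, max)             \
  do {                                                    \
    utype _real_max = ((utype) -1) >> 1;                  \
    (max) = (othertype) _real_max;                        \
  } while (0)

/* Pick an unsigned type with the same size as TYPE and use it to
   compute TYPE's maximum value, without ever overflowing TYPE.  */
#define MAX_TYPE(type, max)                               \
  do {                                                    \
    if (sizeof (type) == sizeof (unsigned int))           \
      MAX_USING_TYPE (unsigned int, type, max);           \
    else if (sizeof (type) == sizeof (unsigned long))     \
      MAX_USING_TYPE (unsigned long, type, max);          \
    else if (sizeof (type) == sizeof (unsigned long long))\
      MAX_USING_TYPE (unsigned long long, type, max);     \
    else                                                  \
      {                                                   \
        printf ("Need another integral type sized %lu\n", \
                (unsigned long) sizeof (type));           \
        abort ();                                         \
      }                                                   \
  } while (0)

int
main (void)
{
  long max;
  MAX_TYPE (long, max);
  printf ("max of long = %ld\n", max);
  return 0;
}

Since the doubling happens in an unsigned type, no signed overflow ever
occurs, so the result does not depend on how the compiler treats it.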