------- Comment #5 from rguenth at gcc dot gnu dot org 2010-04-08 08:39 -------

The middle-end trusts the C++ frontend's TYPE_MIN/MAX_VALUE, which is, per the
C++ standard, the range of an (unsigned) integer type with a precision just
wide enough to represent all values of the enum.
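
To illustrate, here is a minimal, hypothetical sketch (the enum, the function
names and the values are mine, not taken from this PR):

  /* The enumerators fit in two bits, so the C++ FE records a
     TYPE_MIN/MAX_VALUE of [0, 3] for E, even though the underlying
     type is as wide as int.  */
  enum E { A = 0, B = 1, C = 3 };

  bool in_range (E e)
  {
    /* Trusting [0, 3], VRP can fold this test to true ...  */
    return e <= 3;
  }

  bool call (int i)
  {
    /* ... although this conversion emits no code to truncate the value,
       so e.g. call (7) hands an out-of-range value to in_range.  */
    return in_range (static_cast<E> (i));
  }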
I did want to remove that optimization at some point, but people complained
that it was highly useful.  (The C++ FE can still set TYPE_PRECISION
accordingly if it wants to, but TYPE_MIN/MAX_VALUE cannot really be trusted
in the middle-end, as conversions between integer types of the same precision
but different TYPE_MIN/MAX_VALUE are 1) useless and 2) do not produce code to
truncate the values.)  It's VRP, btw, that does this optimization.

If I have your support I'll rip this optimization out of VRP ... (I didn't
yet manage to create a wrong-code bug due to the useless-conversion issue,
but I believe that is possible).  I also believe that the TYPE_MIN/MAX_VALUE
on enums might be needed for proper debug information.

--
rguenth at gcc dot gnu dot org changed:

           What    |Removed                     |Added
----------------------------------------------------------------------------
                 CC|                            |rguenth at gcc dot gnu dot
                   |                            |org

http://gcc.gnu.org/bugzilla/show_bug.cgi?id=43680