https://gcc.gnu.org/bugzilla/show_bug.cgi?id=66110

--- Comment #14 from Kevin OConnor <kevin at koconnor dot net> ---
(In reply to jos...@codesourcery.com from comment #13)
> I concur that it would be valid to define those typedefs to be extended 
> integer types without the special aliasing properties.  The first 
> suggestion of that on the GCC lists that I know of is 
> <https://gcc.gnu.org/ml/gcc/2000-05/msg01106.html>.  However, it was noted 
> in <https://gcc.gnu.org/ml/gcc/2000-07/msg00155.html> that there would be 
> C++ issues.  I see that C++ does now have extended integer types, which it 
> didn't at that time, but there would still be questions of mangling 
> (changing typedefs from standard headers breaks the library ABI for 
> anything using those types in C++ interfaces, because of changed 
> mangling).  And of course the question of what expectations might be 
> broken in C by making these typedefs into extended integer types.

Many thanks.  Those links and the background information are quite helpful.  I
agree that breaking the C++ ABI would not make sense on the major platforms.
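
To illustrate the mangling concern, here is a minimal sketch (my own example,
not taken from the earlier threads):

#include <cstdint>

// int8_t is currently a typedef for signed char, so under the Itanium C++
// ABI this declaration mangles as _Z1fa.  If int8_t were redefined as an
// extended integer type, it would presumably mangle as a vendor-extended
// type instead, changing the symbol name and breaking the ABI of any
// library that exposes int8_t in its C++ interfaces.
void f(std::int8_t x);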

> Cf <https://gcc.gnu.org/ml/gcc-patches/2002-01/msg01941.html> implementing 
> a char_no_alias_everything attribute.

I do think having something like "char_no_alias_everything" would be useful. 
My interest in this discussion was due to the code generation on the embedded
AVR platform (where 8-bit integers are very common).  If non-aliasing 8-bit
integer types existed, I suspect dealing with ABI breakage would be more
tenable on AVR.  (And if nothing else, utilizing them within an individual AVR
project would not be difficult.)
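
For reference, a minimal sketch of the code-generation issue I have in mind
(my own example, with made-up names):

#include <cstdint>

// Because std::uint8_t is a typedef for unsigned char, the compiler must
// assume the store through dst may modify *len, so *len is reloaded from
// memory on every iteration.  A non-aliasing 8-bit integer type would let
// *len be kept in a register for the whole loop.
void fill(std::uint8_t *dst, const std::uint8_t *len)
{
    for (std::uint8_t i = 0; i < *len; ++i)
        dst[i] = 0;
}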
