Upon careful reading of the standard's excerpts quoted by Gabriel Dos Reis per <http://gcc.gnu.org/ml/gcc/2006-12/msg00763.html>, it's not clear that GCC's current presumption of LIA-1 overflow semantics, in the absence of actual support for them, is what the standard advocates.
As I read it, the wording "If an implementation adds support for the LIA-1 exception values ... then those types are LIA-1 conformant types" implies that LIA-1 semantics may legitimately be presumed only *if* a given target implementation actually supports them (just as null-pointer optimizations should not be considered legitimate unless correspondingly supported by a given target). That makes sense: if a target factually supports LIA-1 overflow trapping, a compiler may safely presume that behavior and leverage it for optimization, knowing the target's runtime semantics are preserved; likewise, a compiler may safely presume wrapping (or other) overflow semantics for targets which factually support them. All of these are legitimate, because the standard leaves signed integer overflow undefined and thereby gives implementations the liberty to augment the language by defining what the standard does not. However, GCC's current predisposition to presume semantics known to differ from a target's factual behavior, in the name of optimization, likely goes beyond what the standard intended to productively enable (although it is arguably, if perversely, legitimate); it should be reconsidered, as an optimization which risks altering a program's expressed behavior is rarely desirable (whereas the diagnosis of behaviors which can't be strictly portably relied upon almost always is).
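
As a concrete illustration of the sort of divergence I mean (my own sketch, not anything quoted in the thread), consider a comparison whose result depends on which overflow semantics the compiler presumes:

    /* Sketch: with signed overflow treated as undefined, a compiler may
     * fold the comparison below to a constant 1, even on a target whose
     * hardware factually wraps (where x == INT_MAX makes x + 1 equal to
     * INT_MIN, and the comparison is then false).
     */
    #include <limits.h>
    #include <stdio.h>

    int exceeds_when_incremented(int x)
    {
        /* Presuming overflow cannot occur, this simplifies to "return 1";
         * honouring the target's wrapping behavior, it is 0 for INT_MAX. */
        return x + 1 > x;
    }

    int main(void)
    {
        printf("%d\n", exceeds_when_incremented(INT_MAX));
        return 0;
    }

Building such code with -fwrapv (which defines signed overflow as wrapping) or -ftrapv (which traps on it) yields the target-consistent results; my point is only that whichever semantics are presumed for optimization should be ones the target factually supports.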