http://gcc.gnu.org/bugzilla/show_bug.cgi?id=52119
ajf <alan.j.flavell at gmail dot com> changed:

           What    |Removed                     |Added
----------------------------------------------------------------------------
                 CC|                            |alan.j.flavell at gmail dot com

--- Comment #10 from ajf <alan.j.flavell at gmail dot com> ---
(In reply to Jeffrey Yasskin from comment #0)
> numeric_limits<T>::min() is defined as (__glibcxx_signed (T) ? (T)1 <<
> __glibcxx_digits (T) : (T)0). Unfortunately, shifting into the sign bit is
> undefined behavior (C++11 [expr.shift]p2), and undefined behavior makes an
> expression non-constant.

My apologies if I am missing something obvious, but what exactly makes shifting a 1 into the sign bit undefined behavior? It seems well defined to me, assuming the value is actually being used as an integral numeric data type (in this case a 32-bit int or long), which would simply result in a negative signed value. In other words, the sign bit, like any other bit, can only be 0 or 1, so I honestly do not understand how undefined behavior enters the equation, or what impact it may have.
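To make the question concrete, here is a small sketch of the two formulations as I understand them, specialized to a 32-bit int; the names and constants below are my own illustration, not the actual libstdc++ internals:

#include <climits>
#include <limits>

// Pattern from comment #0, specialized to a 32-bit int: shift a 1 into the
// sign bit.  The report's claim is that under C++11 [expr.shift]p2 the value
// 1 * 2^31 is not representable in int, so the shift is undefined behavior,
// and undefined behavior makes the expression non-constant -- a conforming
// compiler may therefore reject this constexpr declaration:
//
//   constexpr int bad_min = 1 << (sizeof(int) * CHAR_BIT - 1);
//
// A formulation that never shifts into the sign bit: derive the minimum from
// the maximum.  On two's-complement targets no overflow occurs, the result
// is INT_MIN, and the expression is a valid constant expression.
constexpr int good_min = -INT_MAX - 1;

static_assert(good_min == std::numeric_limits<int>::min(),
              "matches numeric_limits<int>::min()");

int main() { return 0; }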