https://gcc.gnu.org/bugzilla/show_bug.cgi?id=116942

--- Comment #3 from Jonathan Wakely <redi at gcc dot gnu.org> ---
The modified ECMAScript grammar in the C++ standard says:

"If the CV of a UnicodeEscapeSequence is greater than the largest value that
can be held in an object of type charT the translator shall throw an exception
object of type regex_error."

But that only means that when CHAR_MAX is 0x7f we can't use "\u0080"; it
doesn't help for "\x80", which ECMAScript says should mean the same thing. And
as I said above, when char is unsigned this means "\u0080" is valid, but it
still doesn't actually work correctly.
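
A minimal sketch of how one might observe the difference between the two
escapes under that wording (this is an illustration, not part of the original
report; whether the exception actually fires for "\u0080" depends on the
signedness of char and on the library's conformance):

  #include <climits>
  #include <iostream>
  #include <regex>

  int main()
  {
      // Pattern using the ECMAScript \u escape for U+0080. Per the quoted
      // wording, constructing this regex should throw std::regex_error when
      // CHAR_MAX is 0x7f (char is signed), because 0x80 cannot be held in
      // an object of type char.
      try {
          std::regex re("\\u0080");
          std::cout << "\\u0080 accepted (CHAR_MAX=" << CHAR_MAX << ")\n";
      } catch (const std::regex_error& e) {
          std::cout << "\\u0080 rejected: " << e.what() << '\n';
      }

      // The same code point written with the \x hex escape is not covered
      // by that wording, even though ECMAScript treats it identically, so
      // no exception is required here regardless of CHAR_MAX.
      try {
          std::regex re("\\x80");
          std::cout << "\\x80 accepted\n";
      } catch (const std::regex_error& e) {
          std::cout << "\\x80 rejected: " << e.what() << '\n';
      }
  }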

(You should just use Boost.Regex)
