[Bug preprocessor/53690] [C++11] \u0000 and \U00000000 are wrongly encoded as U+0001.

2016-01-24 Thread wjl at icecavern dot net
https://gcc.gnu.org/bugzilla/show_bug.cgi?id=53690 --- Comment #13 from Wesley J. Landaker --- However, it is fixed in the 6.0 preview version, which is good!
$ g++ --version
g++ (Debian 6-20160117-1) 6.0.0 20160117 (experimental) [trunk revision 232481]
Copyright (C) 2016 Free Software Foundation, Inc. [...]
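
A minimal verification sketch (not the reporter's exact test case, and a hypothetical file name), assuming a C++11 compiler: on a fixed compiler such as the GCC 6 snapshot above it compiles cleanly, while an affected compiler rejects it because the literals encode as U+0001.

    // ucn_fix_check.c++ -- compile with: g++ -std=c++11 ucn_fix_check.c++
    static_assert(U'\u0000' == 0, "char32_t NUL UCN must encode as U+0000");
    static_assert(u'\u0000' == 0, "char16_t NUL UCN must encode as U+0000");
    int main() { return 0; }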

[Bug preprocessor/53690] [C++11] \u0000 and \U00000000 are wrongly encoded as U+0001.

2016-01-24 Thread wjl at icecavern dot net
https://gcc.gnu.org/bugzilla/show_bug.cgi?id=53690 --- Comment #12 from Wesley J. Landaker --- This bug is marked fixed, but it is still present in g++ 5.3.1.
$ g++ --version
g++ (Debian 5.3.1-4) 5.3.1 20151219
Copyright (C) 2015 Free Software Foundation, Inc.
This is free software; see the source for copying conditions. [...]

[Bug preprocessor/53690] [C++11] \u0000 and \U00000000 are wrongly encoded as U+0001.

2015-06-21 Thread wjl at icecavern dot net
https://gcc.gnu.org/bugzilla/show_bug.cgi?id=53690 --- Comment #8 from Wesley J. Landaker --- This major bug -- with security implications -- is still present in GCC 5.1.1.
$ g++ --version
g++ (Debian 5.1.1-20) 5.1.1 20150616
Copyright (C) 2015 Free Software Foundation, Inc.
This is free software; see the source for copying conditions. [...]
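
One way the security implications can surface, sketched under the assumption that code filters on the NUL character: on an affected GCC the literal compares as 1, so a real NUL slips past the check.

    #include <cstdio>

    int main() {
        char32_t c = 0;            // an actual NUL code point, e.g. from input
        // On an affected compiler U'\u0000' encodes as U+0001, so this
        // comparison never matches a real NUL.
        if (c == U'\u0000')
            std::puts("NUL detected");
        else
            std::puts("NUL missed -- the literal was miscompiled");
        return 0;
    }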

[Bug c++/60397] The value of char16_t u'\uffff' is 0xdfff instead of 0xffff

2015-06-21 Thread wjl at icecavern dot net
https://gcc.gnu.org/bugzilla/show_bug.cgi?id=60397 Wesley J. Landaker changed:

  What           |Removed      |Added
  ---------------+-------------+---------
  Status         |UNCONFIRMED  |RESOLVED
  Known to work  |             |[...]

[Bug c++/59873] The value of char32_t U'\u0000' and char16_t u'\u0000' is 1, instead of 0.

2014-03-02 Thread wjl at icecavern dot net
http://gcc.gnu.org/bugzilla/show_bug.cgi?id=59873 --- Comment #10 from Wesley J. Landaker --- Created attachment 32248 --> http://gcc.gnu.org/bugzilla/attachment.cgi?id=32248&action=edit u.c++ -- program that shows this problem, and bug #60397's problem
I opened new bug #60397 that is a si[...]
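
The attachment itself is not reproduced in the digest; a sketch approximating what u.c++ checks (a hypothetical reconstruction, not the attached file), covering both this bug and bug #60397:

    // Bug 59873: the NUL UCN encodes as 1 on affected compilers.
    static_assert(U'\u0000' == 0x0000, "bug 59873: NUL UCN should be 0");
    // Bug 60397: u'\uffff' encodes as 0xDFFF on affected compilers.
    static_assert(u'\uffff' == 0xffff, "bug 60397: should be U+FFFF");
    int main() { return 0; }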

[Bug c++/60397] New: The value of char16_t u'\uffff' is 0xdfff instead of 0xffff

2014-03-02 Thread wjl at icecavern dot net
Severity: major
Priority: P3
Component: c++
Assignee: unassigned at gcc dot gnu.org
Reporter: wjl at icecavern dot net
CC: daniel.kruegler at googlemail dot com
Created attachment 32247 --> http://gcc.gnu.org/bugzilla/attachment.cgi?id=32247&action=edit [...]

[Bug c++/59873] The value of char32_t U'\u0000' and char16_t u'\u0000' is 1, instead of 0.

2014-01-19 Thread wjl at icecavern dot net
http://gcc.gnu.org/bugzilla/show_bug.cgi?id=59873 --- Comment #9 from Wesley J. Landaker --- This also happens in strings, e.g.:
static_assert(U"\u0000"[0] == 1, "this passes");
static_assert(U"\u0000"[0] == 0, "this fails");

[Bug c++/59873] The value of char32_t U'\u0000' and char16_t u'\u0000' is 1, instead of 0.

2014-01-19 Thread wjl at icecavern dot net
http://gcc.gnu.org/bugzilla/show_bug.cgi?id=59873 --- Comment #8 from Wesley J. Landaker --- Just as an additional point, L'\u0000' also yields a wchar_t with the value of 1. (If that is an illegal construct, it is not warned about when using -Wall -Wextra -Werror).
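
A sketch of the wchar_t case: L'\u0000' is a valid C++11 literal (the [lex.charset] restriction on control-character UCNs applies outside character and string literals), so the silence under -Wall -Wextra -Werror is consistent with it being accepted -- the bug is purely in the encoded value.

    // On an affected compiler this assert fails; no warning is emitted
    // even with -Wall -Wextra -Werror because the literal itself is valid.
    static_assert(L'\u0000' == 0, "wchar_t NUL UCN must encode as 0");
    int main() { return 0; }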

[Bug c++/59873] The value of char32_t U'\u0000' and char16_t u'\u0000' is 1, instead of 0.

2014-01-18 Thread wjl at icecavern dot net
http://gcc.gnu.org/bugzilla/show_bug.cgi?id=59873 Wesley J. Landaker changed:

  What           |Removed |Added
  ---------------+--------+----------------------------
  See Also       |        |http://llvm.org/bugs/show_b[...]

[Bug c++/59873] The value of char32_t U'\u0000' and char16_t u'\u0000' is 1, instead of 0.

2014-01-18 Thread wjl at icecavern dot net
http://gcc.gnu.org/bugzilla/show_bug.cgi?id=59873 --- Comment #5 from Wesley J. Landaker --- (In reply to Marc Glisse from comment #4)
> Seems to be on purpose, see the comment before _cpp_valid_ucn in
> libcpp/charset.c, and the last instruction in that function.
>
> [lex.charset] is a bit hard[...]
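
For context on the [lex.charset] distinction under discussion, a sketch of the two cases as one reading of C++11 understands them (the commented-out line is the ill-formed one; this is an interpretation, not quoted from the standard):

    // A control-character UCN is ill-formed *outside* a literal:
    // int \u0000id = 0;        // ill-formed under C++11 [lex.charset]
    // ...but inside a character or string literal it is valid:
    char32_t c = U'\u0000';     // should have the value 0
    int main() { return 0; }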

[Bug c++/59873] The value of char32_t U'\u0000' and char16_t u'\u0000' is 1, instead of 0.

2014-01-18 Thread wjl at icecavern dot net
http://gcc.gnu.org/bugzilla/show_bug.cgi?id=59873 --- Comment #3 from Wesley J. Landaker --- Created attachment 31887 --> http://gcc.gnu.org/bugzilla/attachment.cgi?id=31887&action=edit A truncated version of char32_literal_test.c++
I also made another program that tests ALL possible char32_t[...]
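
The exhaustive test program itself is only attached, not quoted; a minimal generator sketch (not the attached char32_literal_test.c++) of how such a test can be produced -- it emits one static_assert per code point over a small range, and compiling the generated output checks the encodings:

    #include <cstdio>

    int main() {
        // Emit static_asserts for the control range where the bug bites.
        for (unsigned cp = 0; cp <= 0x1F; ++cp)
            std::printf("static_assert(U'\\U%08X' == 0x%02Xu, \"U+%04X\");\n",
                        cp, cp, cp);
        return 0;
    }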

[Bug c++/59873] The value of char32_t U'\u0000' and char16_t u'\u0000' is 1, instead of 0.

2014-01-18 Thread wjl at icecavern dot net
http://gcc.gnu.org/bugzilla/show_bug.cgi?id=59873 --- Comment #2 from Wesley J. Landaker --- Created attachment 31886 --> http://gcc.gnu.org/bugzilla/attachment.cgi?id=31886&action=edit The test.c++ program shown in the bug
For convenience, here is the test.c++ program as an attachment (same e[...]

[Bug c++/59873] The value of char32_t U'\u0000' and char16_t u'\u0000' is 1, instead of 0.

2014-01-18 Thread wjl at icecavern dot net
http://gcc.gnu.org/bugzilla/show_bug.cgi?id=59873 Wesley J. Landaker changed:

  What           |Removed |Added
  ---------------+--------+------
  Version        |4.8.3   |4.9.0

--- Comment #1 from Wesley J.[...]

[Bug c++/59873] New: The value of char32_t U'\u0000' and char16_t u'\u0000' is 1, instead of 0.

2014-01-18 Thread wjl at icecavern dot net
Severity: major
Priority: P3
Component: c++
Assignee: unassigned at gcc dot gnu.org
Reporter: wjl at icecavern dot net
I found a major bug with char32_t and char16_t literals when trying to encode a U+0000 (Null). The following expressions have the numerical value 1, instead of 0 [...]
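
A minimal reproducer sketch based on the description (the report's own expression list is truncated above, so this covers the forms named in the bug titles): on an affected compiler every assert fires because each literal encodes as U+0001.

    // repro.c++ -- hypothetical file name; compile with: g++ -std=c++11 repro.c++
    static_assert(U'\u0000' == 0, "U'\\u0000' should be 0");
    static_assert(U'\U00000000' == 0, "U'\\U00000000' should be 0");
    static_assert(u'\u0000' == 0, "u'\\u0000' should be 0");
    static_assert(u'\U00000000' == 0, "u'\\U00000000' should be 0");
    int main() { return 0; }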