[Bug libstdc++/26697] New: time_put::put('x') shows only 2 year digits, in en_GB locale.
As seen in the C++ test case here: http://bugzilla.gnome.org/show_bug.cgi?id=334648
std::time_put<>::put(), with format 'x', shows only the last 2 digits of the
year, and there is no short-date format that shows all 4 digits. I notice that
the en_US and de_DE locales do not have this problem.

This is Ubuntu Linux's Dapper release (currently unstable):

[EMAIL PROTECTED]:~$ g++ --version
g++ (GCC) 4.0.3 (Ubuntu 4.0.3-1ubuntu1)
Copyright (C) 2006 Free Software Foundation, Inc.
This is free software; see the source for copying conditions. There is NO
warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.

--
           Summary: time_put::put('x') shows only 2 year digits, in en_GB
                    locale.
           Product: gcc
           Version: 4.0.3
            Status: UNCONFIRMED
          Severity: normal
          Priority: P3
         Component: libstdc++
        AssignedTo: unassigned at gcc dot gnu dot org
        ReportedBy: murrayc at murrayc dot com

http://gcc.gnu.org/bugzilla/show_bug.cgi?id=26697
[Bug libstdc++/26697] time_put::put('x') shows only 2 year digits, in en_GB locale.
--- Comment #2 from murrayc at murrayc dot com 2006-03-15 17:10 ---
Thanks. So, can't we just reassign this to glibc then?

--
http://gcc.gnu.org/bugzilla/show_bug.cgi?id=26697
[Bug libstdc++/26697] time_put::put('x') shows only 2 year digits, in en_GB locale.
--- Comment #3 from murrayc at murrayc dot com 2006-03-15 17:11 ---
Oh, and before I submit a time_get function, do we use a glibc function for
time_get too?

--
http://gcc.gnu.org/bugzilla/show_bug.cgi?id=26697
[Bug libstdc++/26697] time_put::put('x') shows only 2 year digits, in en_GB locale.
--- Comment #5 from murrayc at murrayc dot com 2006-03-15 17:20 ---
> I mean, what evidence do you have that the en_GB locale data should be
> different?

I see no reason why en_GB should want to show 2 year digits, while en_US and
de_DE should be happy with 4. 2 digits lead to confusion because they are a
loss of data.

--
http://gcc.gnu.org/bugzilla/show_bug.cgi?id=26697
[Bug libstdc++/26697] time_put::put('x') shows only 2 year digits, in en_GB locale.
--- Comment #7 from murrayc at murrayc dot com 2006-03-15 17:33 ---
Maybe we can just call it an enhancement or improvement then?

(Though I strongly feel that any date meant for humans to read must have 4
year digits, and any use of 2 year digits for a human to read should be a bug,
even if that human is using the C locale.)

--
http://gcc.gnu.org/bugzilla/show_bug.cgi?id=26697
[Bug libstdc++/26697] time_put::put('x') shows only 2 year digits, in en_GB locale.
--- Comment #9 from murrayc at murrayc dot com 2006-03-15 17:43 ---
> By the way, can't you just prepare a small function consistently using 'Y',
> just a few lines, after all...

I need to support all locales. I may have to special-case en_GB.

--
http://gcc.gnu.org/bugzilla/show_bug.cgi?id=26697
[Bug libstdc++/26701] New: std::time_get parses only 2 digits of year, in en_GB locale.
The attached test case shows that std::time_get<> parses only the first 2
digits of years, and assumes that those are years in the 1900s. This is a loss
of data. It gives this output, in an en_GB locale:

[EMAIL PROTECTED]:~$ ./a.out
The date as text: 01/02/2003
std::time_get result: day=1, month=2, year=1920

Maybe time_get tries to use the strftime date format only, which uses 2 digits
only for display in en_GB.

[EMAIL PROTECTED]:~$ g++ --version
g++ (GCC) 4.0.3 (Ubuntu 4.0.3-1ubuntu1)
Copyright (C) 2006 Free Software Foundation, Inc.
This is free software; see the source for copying conditions. There is NO
warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.

--
           Summary: std::time_get parses only 2 digits of year, in en_GB
                    locale.
           Product: gcc
           Version: 4.0.3
            Status: UNCONFIRMED
          Severity: normal
          Priority: P3
         Component: libstdc++
        AssignedTo: unassigned at gcc dot gnu dot org
        ReportedBy: murrayc at murrayc dot com

http://gcc.gnu.org/bugzilla/show_bug.cgi?id=26701
[Bug libstdc++/26701] std::time_get parses only 2 digits of year, in en_GB locale.
--- Comment #1 from murrayc at murrayc dot com 2006-03-15 17:44 ---
Created an attachment (id=11054)
 --> (http://gcc.gnu.org/bugzilla/attachment.cgi?id=11054&action=view)
testtimeget.cc

--
http://gcc.gnu.org/bugzilla/show_bug.cgi?id=26701
[Bug libstdc++/26701] std::time_get parses only 2 digits of year, in en_GB locale.
--- Comment #3 from murrayc at murrayc dot com 2006-03-15 17:51 ---
That's maybe excusable for display, but all the documentation that I can find
for time_get says that it should parse up to 4 year digits. You mention in the
other bug that the locale information is used for parsing, but it's obviously
an implementation error to use _only_ that information for parsing.

--
murrayc at murrayc dot com changed:

           What    |Removed     |Added
----------------------------------------------------------------------------
            Status |RESOLVED    |UNCONFIRMED
        Resolution |INVALID     |

http://gcc.gnu.org/bugzilla/show_bug.cgi?id=26701
[Bug libstdc++/26701] std::time_get parses only 2 digits of year, in en_GB locale.
--- Comment #6 from murrayc at murrayc dot com 2006-03-15 18:02 ---
Admittedly the libstdc++ time_get::get_date() documentation does say that it
interprets the input according to format "X", and now I understand what that
meant. I was looking at the Dinkumware and Rogue Wave documentation.

I have just looked at the C++ standard, and you are right:

"Each get member parses a format as produced by a corresponding format
specifier to time_put<>::put."

"In other words, user confirmation is required for reliable parsing of
user-entered dates and times, but machine-generated formats can be parsed
reliably. This allows parsers to be aggressive about interpreting user
variations on standard formats."

I guess those other Standard C++ Library implementations are expanding on the
standard, and it's not clear whether they should do that.

Thanks. My disappointment is now with the C++ Standard instead. If only it had
a bugzilla.

--
murrayc at murrayc dot com changed:

           What    |Removed     |Added
----------------------------------------------------------------------------
          Severity |enhancement |normal
            Status |NEW         |SUSPENDED

http://gcc.gnu.org/bugzilla/show_bug.cgi?id=26701
[Bug c++/21279] New: static Derived-to-Base cast fails when ~Derived has run.
In this test case, I cast a Derived* to a Base* after the destructor for the
Derived part has run. It's a strange thing to do, but I think the cast should
succeed, because it does not need instance information. The failure only seems
to happen when using virtual inheritance.

We noticed this in libsigc++.

--
           Summary: static Derived-to-Base cast fails when ~Derived has run.
           Product: gcc
           Version: 4.0.0
            Status: UNCONFIRMED
          Severity: normal
          Priority: P2
         Component: c++
        AssignedTo: unassigned at gcc dot gnu dot org
        ReportedBy: murrayc at murrayc dot com
                CC: gcc-bugs at gcc dot gnu dot org

http://gcc.gnu.org/bugzilla/show_bug.cgi?id=21279
[Bug c++/21279] static Derived-to-Base cast fails when ~Derived has run.
--- Additional Comments From murrayc at murrayc dot com 2005-04-29 08:20 ---
Created an attachment (id=8761)
 --> (http://gcc.gnu.org/bugzilla/attachment.cgi?id=8761&action=view)
test_castdeleted.cc

This fails (gives odd values) with g++ 3.3, 3.4 and 4.0.

--
http://gcc.gnu.org/bugzilla/show_bug.cgi?id=21279