On 07-11-19 11:47, Edward Welbourne wrote:
André Somers (6 November 2019 17:20) wrote
I came to the conclusion that the sane behavior for interpreting dates
depends on the semantics of what the date means. For instance, a birth
date will always be a date in the past,
... except when it's the best-estimate date of birth of an expected
child, or part of a discussion of how (say) an education system will
handle the cohorts of children born in various date ranges.
... neither of which is an actual birth date. The first is an expected birth date, the second something else entirely.
I'll agree,
though, that birth dates are *usually* in the past ;^>

Even when it is in the past, the range of past dates it may land in is
more than a century wide.  Some folk live for more than a century; and
records of dates of birth of folk can be relevant even after the folk in
question are dead.

(All of which argues against using two-digit years in dates of birth,
common though that practice is.)
True. But that does not preclude people from entering such dates. I guess it also depends on what use case you envision for this. For reading data stored in a 2-digit format, you are completely right. But I was thinking more of making date entry work better.

I have written controls backed by date parsing code using logic like this. Yes, you can enter a full date, but the control would also do the expected thing for shorthands like a 2-digit year, and what it would do would depend on the purpose of the date field. The examples above were not random: this was medical device software, so it was dealing with birth dates, appointments, etc. So for that one-in-200 patient over 100 years old, you'd use the full 4-digit year when entering the data; for the rest, the 2-digit version would be enough.
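To give an idea of the kind of logic I mean for a birth-date field (just a sketch; interpretTwoDigitBirthYear is a name I made up, not an existing Qt API): pick the most recent year ending in the two entered digits that is not in the future.

    #include <QDate>

    // Sketch only: map a two-digit year onto the most recent matching
    // year that is not in the future -- the natural reading for a
    // birth-date field.
    int interpretTwoDigitBirthYear(int twoDigitYear) // 0..99
    {
        const int currentYear = QDate::currentDate().year();
        int year = (currentYear / 100) * 100 + twoDigitYear;
        if (year > currentYear)   // "87" entered in 2019 -> 1987, not 2087
            year -= 100;
        return year;
    }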

while a date for an appointment would normally be a date in the
future.
and usually not very far in the future, at that, which makes this one
of the cases where two-digit years aren't automatically a bad idea.
True. It improves the experience with the software if entering common things works quickly and smoothly. Making dates easier to enter saves the user time, and that can be _very_ valuable, especially for data that has to be entered often.

That alters the interpretation of the date. May I suggest adding an
enum argument to any function doing the conversion from a string to a
date, allowing you to indicate the kind of date that is expected?
That would imply inventing (and documenting) how we're going to
implement each member of the enum; and, fundamentally, that's going to
boil down to specifying (per enum member) a range of (up to) 100 years
that each two-digit year value gets mapped into.  Then along comes some
user whose use-case we didn't think of, we need to extend the enum, and
the enum grows endlessly.  I think it is easier to let the caller
just specify that year range (principally by its start date).  The
caller can then invent any flavour of year range they like.
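If I understand the range-based idea correctly, it would boil down to something like this (the function name is made up, of course): the caller passes the first year of the 100-year window, and the two-digit year is mapped into [start, start + 99].

    // Sketch of the caller-specified range; parseTwoDigitYear is a
    // made-up name, not an existing Qt function.
    int parseTwoDigitYear(int twoDigitYear, int rangeStart)
    {
        // Choose the year in [rangeStart, rangeStart + 99] whose last
        // two digits match the input.
        int year = (rangeStart / 100) * 100 + twoDigitYear;
        if (year < rangeStart)
            year += 100;
        return year;
    }

    // e.g. parseTwoDigitYear(87, QDate::currentDate().year() - 99) == 1987 in 2019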

Do you really think it would get out of hand? I can't see this growing to more than a handful of values, and Qt::ExpectPastDate would be much easier to use and read as an argument to that function than something like QDate::currentDate().year() - 99.
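To make the comparison concrete, this is the difference at the call site I have in mind (both overloads are hypothetical; neither exists in Qt today):

    #include <QDate>
    #include <QString>

    // Hypothetical call sites, only to compare readability; neither
    // overload exists in Qt today.
    const QString input = QStringLiteral("07.11.87");
    QDate byEnum  = QDate::fromString(input, QStringLiteral("dd.MM.yy"),
                                      Qt::ExpectPastDate);
    QDate byRange = QDate::fromString(input, QStringLiteral("dd.MM.yy"),
                                      QDate::currentDate().year() - 99);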


André


