https://gcc.gnu.org/bugzilla/show_bug.cgi?id=58931

Aaron Graham <aaron at aarongraham dot com> changed:

           What    |Removed                     |Added
----------------------------------------------------------------------------
                 CC|                            |aaron at aarongraham dot com

--- Comment #3 from Aaron Graham <aaron at aarongraham dot com> ---
This is still a problem in current gcc trunk.

The bug is in the clock conversion done by condition_variable::wait_until: the
arithmetic below is not checked for overflow. Since the steady_clock and
system_clock epochs can be very different, the addition is likely to overflow
for time points well below max().

    template<typename _Clock, typename _Duration>
      cv_status
      wait_until(unique_lock<mutex>& __lock,
                 const chrono::time_point<_Clock, _Duration>& __atime)
      {
        // DR 887 - Sync unknown clock to known clock.
        const typename _Clock::time_point __c_entry = _Clock::now();
        const __clock_t::time_point __s_entry = __clock_t::now();
        const auto __delta = __atime - __c_entry;
        const auto __s_atime = __s_entry + __delta;

        return __wait_until_impl(__lock, __s_atime);
      }

I modified my version of gcc to use steady_clock as condition_variable's "known
clock" (__clock_t). This is more correct according to the C++ standard and,
most importantly, it makes condition_variable resilient to clock changes when
used in conjunction with steady_clock. Because of this change, my version works
fine with steady_clock::time_point::max() but fails with
system_clock::time_point::max().

Because I made that change, and since I don't do timed waits on system_clock
(which is unsafe anyway), the overflow hasn't been a problem for me and I
haven't fixed it.
