On 05/09/2023 10:28, Takashi Yano wrote:
> The previous wait time of 100 msec is too long if the application
> specifies a smaller buffer. With this patch, the wait time is reduced
> to 1 msec.
I don't really have the context to understand this change, but it seems to me the obvious questions to ask are:
Are there negative consequences of making this wait much smaller (e.g. lots more CPU spent busy-waiting)?

Your comment seems to imply that the wait time should be proportional to the buffer size and sample rate, rather than a fixed constant?
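If so, a scaled wait could look something like the rough sketch below. This is only an illustration: fragment_wait_ms and its parameter names are made up here, not actual fhandler_dev_dsp members; the only thing taken from the patch is that cygwait takes a timeout in milliseconds.

/* Sketch only: wait roughly one fragment's worth of audio instead of a
   fixed constant.  bytes_per_frag, rate, channels and bits are
   illustrative names, not the real member names in dsp.cc.  */
static DWORD
fragment_wait_ms (size_t bytes_per_frag, int rate, int channels, int bits)
{
  size_t bytes_per_sec = (size_t) rate * channels * (bits / 8);
  if (!bytes_per_sec)
    return 1;          /* degenerate format: fall back to 1 ms */
  DWORD ms = (DWORD) (bytes_per_frag * 1000 / bytes_per_sec);
  return ms ? ms : 1;  /* never pass 0 and busy-spin */
}

The call site would then be something like

  switch (cygwait (fragment_wait_ms (bytes_per_frag, rate, channels, bits)))

which keeps the wakeup latency bounded by one fragment's duration (e.g. ~23 ms for a 4096-byte fragment at 44100 Hz, 16-bit stereo) without polling every millisecond when the buffer is large.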
> ---
>  winsup/cygwin/fhandler/dsp.cc | 4 ++--
>  1 file changed, 2 insertions(+), 2 deletions(-)
>
> diff --git a/winsup/cygwin/fhandler/dsp.cc b/winsup/cygwin/fhandler/dsp.cc
> index e872aa08c..00f2bab69 100644
> --- a/winsup/cygwin/fhandler/dsp.cc
> +++ b/winsup/cygwin/fhandler/dsp.cc
> @@ -931,8 +931,8 @@ fhandler_dev_dsp::Audio_in::waitfordata ()
>            set_errno (EAGAIN);
>            return false;
>          }
> -      debug_printf ("100ms");
> -      switch (cygwait (100))
> +      debug_printf ("1ms");
> +      switch (cygwait (1))
>          {
>          case WAIT_SIGNALED:
>            if (!_my_tls.call_signal_handler ())