Stepping back from the specifics of the application, the general
question is: how do we render a smooth animation representing the
value of a continuously changing variable (the position in the audio
track) that can move in either direction at any time, at any velocity
and acceleration? Perhaps we need a PID controller?
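One way to frame it: if the renderer knows (a) a recent snapshot of the play position and its velocity and (b) the time the rendered frame will actually appear on screen, the displayed position is a simple linear extrapolation, and no controller is needed as long as the swap time can be predicted. A minimal sketch in plain C++ (all names are illustrative, not Mixxx or Qt APIs):

```cpp
// Sketch: extrapolate a continuously changing play position to the
// moment the next frame becomes visible. Illustrative names only.
struct PlayState {
    double position;  // seconds into the track at `timestamp`
    double velocity;  // playback rate (1.0 = normal speed; negative
                      // when scratching backwards)
    double timestamp; // wall-clock time of this snapshot, in seconds
};

// Predict the play position at `displayTime`, the estimated time the
// rendered frame is swapped onto the screen.
double positionAtDisplayTime(const PlayState& s, double displayTime) {
    return s.position + s.velocity * (displayTime - s.timestamp);
}
```

For example, if the snapshot is 20 ms old by the time the frame hits the screen at normal playback speed, the waveform is drawn 20 ms further along the track. The hard part, as discussed below, is knowing `displayTime` accurately.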

On 12/16/20 12:10 PM, Be wrote:

Hi, I am a maintainer of Mixxx DJ software and I'm trying to come up
with a roadmap for the future of our GUI. The current GUI uses a
home-baked XML layout system for QWidgets that predates Qt's UIC. One
of the main features of the application is waveform visualizations of
the audio tracks being played. For smooth scrolling of the waveforms
synchronized with the audio going to the audio interface, the waveform
renderers need to determine at the time of rendering when the next
buffer swap onto the screen will occur. This is currently implemented
with QGLWidget by disabling auto buffer swapping and calling
QGLWidget::swap from a thread which synchronizes with VSync. If you
want to see the current implementation, look at the VisualPlayPosition
and VSyncThread classes:
https://github.com/mixxxdj/mixxx/blob/main/src/waveform/visualplayposition.h
https://github.com/mixxxdj/mixxx/blob/main/src/waveform/vsyncthread.cpp
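For context, the core arithmetic behind such a VSync-synchronized thread is predicting the next retrace from a past vsync timestamp and the display's refresh period. A rough sketch in plain C++ (illustrative only; the real VSyncThread is considerably more involved):

```cpp
#include <cmath>

// Sketch: given the timestamp of a past vsync and the refresh period,
// compute the time of the first vsync strictly after `now`.
// All times are in seconds; names are illustrative.
double nextVsyncAfter(double lastVsync, double period, double now) {
    double elapsed = now - lastVsync;
    // Number of whole periods that have passed, plus one for the
    // upcoming retrace.
    double intervals = std::floor(elapsed / period) + 1.0;
    return lastVsync + intervals * period;
}
```

A renderer can then target that timestamp when extrapolating the play position, and the swap thread can sleep until just before it.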

Unfortunately, this is not a viable solution going forward. QGLWidget
has been removed in Qt 6. Even before that, the current implementation
causes severe performance problems that make the application unusable
when built with a macOS SDK >= 10.14. From what I understand, this is
because of layer-backed rendering, which can only be disabled by
building with the macOS 10.13 SDK. We currently do that via a hack
that downloads the SDK separately so it can be used with up-to-date
versions of Xcode, which no longer officially support the 10.13 SDK.

We attempted another solution (
https://github.com/mixxxdj/mixxx/pull/1974 ) with QOpenGLWidget using
the QOpenGLWidget::frameSwapped signal to estimate when the rendered
frame would be shown on screen. This performed much better on macOS
with SDKs > 10.13. However, we could not accurately predict the timing
of the frame swap onto the screen because:

> Unlike QGLWidget, triggering a buffer swap just for the
> QOpenGLWidget is not possible since there is no real, onscreen native
> surface for it. Instead, it is up to the widget stack to manage
> composition and buffer swaps on the gui thread. When a thread is done
> updating the framebuffer, call update() on the GUI/main thread to
> schedule composition.

https://doc.qt.io/qt-5/qopenglwidget.html#threading

The result was that the waveforms did not scroll smoothly. Instead,
they jerked back and forth.

This might be overcome with the approach recommended in the
QOpenGLWidget documentation:

> This involves drawing using an alternative render target, that is
> fully controlled by the thread, e.g. an additional framebuffer object,
> and blitting to the QOpenGLWidget's framebuffer at a suitable time.

However, we would like to move away from our hacky old QWidget XML GUI
layout system and use QML. Just the other day I decoupled the backend
application logic from QWidgets and got a simple Qt Quick
proof-of-concept GUI working.

We could adapt our legacy OpenGL waveform rendering code to work in a
Qt Quick GUI, however I'm not sure we'd really solve our performance
problems on macOS because the Qt Quick scene graph rendering in a
separate thread is not supported with the OpenGL backend on macOS.

So it seems the most maintainable and performant way forward would be
adapting our existing OpenGL GLSL waveform renderer to
Vulkan-compatible GLSL to use the new Qt Shader Baker. This still
leaves open the question of how to determine when the frame being
rendered will be swapped to the screen. If I understand correctly,
either the application needs to be able to control the swap time or we
need a guarantee from Qt that swaps will occur at a steady,
predictable time. QQuickRenderControl allows the application to
control the render time, but I have not found anything in the Qt 6
documentation about how to control or determine the swap time. From my
quick glance at the Qt 6 source code, it seems this is controlled by
RHI in private APIs. Is this possible? Are there any other solutions
you can suggest?

Potential solutions I have thought about but I don't know if they'd work:

1. If we control the rendering time of the entire window with
QQuickRenderControl, could we measure the difference between the start
of rendering and the QQuickWindow::frameSwapped signal to estimate the
next swap time? I don't know if this would work though because I think
the rendering time would have to be roughly constant. With all the
optimizations the Qt Quick scene graph does, I'm guessing the
rendering time can vary dramatically depending on what needs updating.

2. Another approach might be using QQuickRenderControl and
QQuickRenderTarget to render a QQuickWindow to an offscreen buffer so
the application can determine when to swap that buffer on screen. I'm
not sure how feasible this is. We still need an on-screen QWindow to
receive input, resize, and hide/show events, which would need to be
redirected to the offscreen QQuickWindow. As far as I can tell, the
only way we might be able to do that with the existing Qt 6 APIs is
with a QOpenGLWindow which shares a QOpenGLContext with the
QQuickWindow, so we'd be bound to OpenGL and couldn't take advantage
of the other RHI backends.
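For approach 1 above, if the interval between the start of rendering and frameSwapped were stable enough, a sliding-window average could serve as the predictor for the next frame's swap latency. A hypothetical sketch in plain C++ (not a Qt or Mixxx API), with the caveat already noted that scene-graph rendering time may vary too much for the mean to be reliable:

```cpp
#include <cstddef>
#include <deque>
#include <numeric>

// Sketch: sliding-window estimate of render-start-to-swap latency.
// Illustrative only; whether the mean is a usable predictor depends
// on how much the Qt Quick scene graph's rendering time varies.
class SwapLatencyEstimator {
public:
    explicit SwapLatencyEstimator(std::size_t window)
        : m_window(window) {}

    // Record one measured render-start-to-frameSwapped interval.
    void addSample(double seconds) {
        m_samples.push_back(seconds);
        if (m_samples.size() > m_window) {
            m_samples.pop_front();
        }
    }

    // Mean of the samples currently in the window.
    double estimate() const {
        if (m_samples.empty()) {
            return 0.0;
        }
        return std::accumulate(m_samples.begin(), m_samples.end(), 0.0)
               / static_cast<double>(m_samples.size());
    }

private:
    std::size_t m_window;
    std::deque<double> m_samples;
};
```

Tracking the variance alongside the mean would also give a concrete way to test the concern: if the spread is a large fraction of a frame period, the estimate is not usable for smooth scrolling.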

By the way I never got the registration confirmation email when I
tried to sign up for this mailing list with my mixxx.org email address.
_______________________________________________
Interest mailing list
Interest@qt-project.org
https://lists.qt-project.org/listinfo/interest
