Small update: it turns out GL_viv_direct_texture can only be used to upload
textures to the GPU, not the other way around. I did try this out to be
sure, with no success.
The recommendation I got from NXP is to use their "virtual framebuffer"
kernel module, which creates as many virtual framebuffers I'
Guys,
There are so many regressions in QCamera in Qt 5.9 and 5.10... I don't
remember in which version everything was OK, but right now QCamera on
Linux is horrible to work with. What changed?
On 23.03.2018 12:47, Igor Mironchik wrote:
Hello.
Let's say I initialized QCamera with right settings (on Linux this is
/dev/video0)
QCamera * cam = new QCamera( ... );
cam->start();
When the camera is started, QCameraInfo::availableCameras() returns a list
of cameras without the active camera. This is a regression from pre-5.9 Qt.
Is thi
Using vsync as a frame rate limiter is not a good idea, because of
differing refresh rates between monitors. On a 144 Hz monitor, you'd be
running at 144 FPS even if that's not actually wanted.
Or, you might want to run at half the refresh rate (like 30 FPS on a 60 Hz
monitor).
Vsync and frame l
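A refresh-rate-independent cap can be sketched with a plain timer loop instead. This is a minimal sketch in standard C++ (not a Qt API), where the target FPS is just an example value:

```cpp
#include <chrono>
#include <thread>

// Caps a render loop at a fixed FPS by sleeping out the remainder of each
// frame. Unlike vsync, the rate does not depend on the monitor's refresh rate.
class FrameLimiter {
public:
    explicit FrameLimiter(double fps)
        : frameTime(std::chrono::duration<double>(1.0 / fps)),
          next(std::chrono::steady_clock::now()) {}

    // Call once per frame, after rendering.
    void wait() {
        next += std::chrono::duration_cast<
            std::chrono::steady_clock::duration>(frameTime);
        std::this_thread::sleep_until(next);
    }

private:
    std::chrono::duration<double> frameTime;
    std::chrono::steady_clock::time_point next;
};
```

Advancing `next` by a fixed step (rather than measuring from "now" each frame) keeps the long-run average rate stable even if individual frames overshoot slightly.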
Qt3D relies on your GPU vsync settings. To reach the maximum number of
FPS, just disable vsync in your driver settings. If on the other hand
you want to reduce FPS, call setSwapInterval on the
QSurfaceFormat::defaultFormat() and set it before starting your Qt3D
application with QSurfaceFormat::setDefaultFormat().
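Concretely, halving the rate on a 60 Hz display might look like the sketch below. Note that a swap interval above 1 is a request to the driver; not all drivers honor it:

```cpp
#include <QSurfaceFormat>

// Run this before creating the QGuiApplication / Qt3D window, so the
// default format applies to every surface the application creates.
QSurfaceFormat fmt = QSurfaceFormat::defaultFormat();
fmt.setSwapInterval(2); // swap every 2nd vblank: ~30 FPS on a 60 Hz monitor
QSurfaceFormat::setDefaultFormat(fmt);
```

Setting the interval to 0 instead disables vsync (where the driver allows it), which is the "maximum FPS" case mentioned above.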
Hi Gunnar,
Thanks a lot, this gives me a good starting point. I didn't know I could
use GL_viv_direct_texture "in reverse" like this; I'll try it out. I'm
already using it to stream decoded H.264 video to a texture for rendering
to the screen.
I don't know which one will work best, but you want to av