Hello,

I am trying to render a 10-bit image from an OpenGL fragment shader attached to 
a QQuickItem in a simple QML scene.  My graphics driver and display support 
10-bit output, and I know this already works when I create a standalone 
QGLWidget with a 10-bit format.  To get the same result in QML, I have tried 
setting my QQuickView's format to 10-bit as follows:

QSurfaceFormat fmt;
fmt.setRedBufferSize(10);    // request 10 bits per color channel
fmt.setGreenBufferSize(10);
fmt.setBlueBufferSize(10);
fmt.setRenderableType(QSurfaceFormat::OpenGL);

QQuickView view;
view.setFormat(fmt);         // applied before the window is created by show()
view.setSource(...);
view.show();
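
For comparison, the standalone path that already gives me 10-bit output looks 
roughly like this (a trimmed-down sketch; my real code uses a QGLWidget 
subclass that draws with a fragment shader):

#include <QApplication>
#include <QGLWidget>
#include <QGLFormat>

int main(int argc, char *argv[])
{
    QApplication app(argc, argv);

    // 10 bits per color channel on the widget's GL format.
    QGLFormat glFmt;
    glFmt.setRedBufferSize(10);
    glFmt.setGreenBufferSize(10);
    glFmt.setBlueBufferSize(10);

    QGLWidget w(glFmt);   // stand-in for my shader-drawing subclass
    w.show();

    return app.exec();
}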

I also register a QQuickItem and render some patterns from a fragment shader in 
there.  When I ask this item's QQuickWindow for its current format via 
window()->format(), it reports the same 10-bit format I requested earlier.  
However, if I ask for the format of the window's OpenGL context via 
window()->openglContext()->format(), it reports an 8-bit format.
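
Concretely, the check looks roughly like this (dumpFormats is just an 
illustrative name; I call it after the scene graph has been initialized, so 
window() and its OpenGL context are both valid):

#include <QDebug>
#include <QSurfaceFormat>

void MyItem::dumpFormats()
{
    const QSurfaceFormat winFmt = window()->format();
    const QSurfaceFormat ctxFmt = window()->openglContext()->format();

    // Prints 10/10/10, i.e. the format I asked for.
    qDebug() << "window :" << winFmt.redBufferSize()
             << winFmt.greenBufferSize() << winFmt.blueBufferSize();

    // Prints 8/8/8.
    qDebug() << "context:" << ctxFmt.redBufferSize()
             << ctxFmt.greenBufferSize() << ctxFmt.blueBufferSize();
}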

Note that I am not using ANGLE - Process Explorer shows that my app is in fact 
loading opengl32.dll and not the Direct3D (ANGLE) libraries.

So I have two questions:

* Why does my OpenGL context format not match the window format?

* Am I taking the correct approach, or what steps might I be missing to 
achieve a 10-bit backbuffer for my QQuickItem to draw into?

It has been suggested that I might achieve 10-bit rendering by displaying a 
10-bit texture in the scene graph via QSGSimpleTextureNode and a custom 
QSGTexture class.  I will try this approach as well.
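
In case it clarifies what I mean, here is a rough sketch of that idea.  The 
class names (TenBitTexture, TenBitItem) are placeholders, and I have not yet 
verified that GL_RGB10_A2 / GL_UNSIGNED_INT_2_10_10_10_REV are exposed by Qt's 
GL headers in every configuration:

#include <QQuickItem>
#include <QSGSimpleTextureNode>
#include <QSGTexture>
#include <QOpenGLContext>
#include <QOpenGLFunctions>

// A QSGTexture backed by a 10-bit-per-channel GL texture.
class TenBitTexture : public QSGTexture
{
public:
    explicit TenBitTexture(const QSize &size) : m_size(size)
    {
        // Created on the render thread (from updatePaintNode), so the
        // scene graph's GL context is current here.
        QOpenGLFunctions *f = QOpenGLContext::currentContext()->functions();
        f->glGenTextures(1, &m_id);
        f->glBindTexture(GL_TEXTURE_2D, m_id);
        f->glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
        f->glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
        // Allocate 10 bits per color channel; the actual pixel upload is
        // omitted from this sketch.
        f->glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB10_A2, size.width(), size.height(),
                        0, GL_RGBA, GL_UNSIGNED_INT_2_10_10_10_REV, nullptr);
    }
    ~TenBitTexture() override
    {
        // The scene graph deletes textures with the GL context current.
        if (QOpenGLContext *ctx = QOpenGLContext::currentContext())
            ctx->functions()->glDeleteTextures(1, &m_id);
    }
    int textureId() const override { return int(m_id); }
    QSize textureSize() const override { return m_size; }
    bool hasAlphaChannel() const override { return false; }
    bool hasMipmaps() const override { return false; }
    void bind() override
    {
        QOpenGLContext::currentContext()->functions()->glBindTexture(GL_TEXTURE_2D, m_id);
    }
private:
    GLuint m_id = 0;
    QSize m_size;
};

// The QQuickItem that puts the texture into the scene graph.
class TenBitItem : public QQuickItem
{
    Q_OBJECT
public:
    TenBitItem() { setFlag(ItemHasContents, true); }
protected:
    QSGNode *updatePaintNode(QSGNode *oldNode, UpdatePaintNodeData *) override
    {
        QSGSimpleTextureNode *node = static_cast<QSGSimpleTextureNode *>(oldNode);
        if (!node) {
            node = new QSGSimpleTextureNode();
            node->setTexture(new TenBitTexture(QSize(int(width()), int(height()))));
            node->setOwnsTexture(true);   // needs Qt 5.4 or later
        }
        node->setRect(boundingRect());
        return node;
    }
};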

Thanks, Matt

Matt McLin
Software Architect: Healthcare Division
Barco, Inc.
