Hi! I would like to reimplement the MediaPlayerPrivateQt class in QtWebKit in Qt 5.0.2 to use hardware acceleration for rendering video on the Raspberry Pi (I got some info from here: https://bugs.webkit.org/show_bug.cgi?id=86410). I assume that by returning true from virtual bool MediaPlayerPrivateQt::supportsAcceleratedRendering(), I should start getting calls to virtual void MediaPlayerPrivateQt::paintToTextureMapper(...). Is this assumption correct? If so, who is supposed to call it?
I suppose this would be done in the web process, and that I should somehow place the decoded frames in GPU memory in a way that is accessible from the UI process. Assuming I can do that, can somebody point me to the place in the sources where the "drawing" in the UI process takes place? The idea would be to "pack" the data somehow in the web process and "unpack" it in the UI process to draw the OpenGL texture. Can someone point me to these two places in the code to get me started (I understand that my ideas are still quite vague)?

Thanks!
Luca

_______________________________________________
Interest mailing list
Interest@qt-project.org
http://lists.qt-project.org/mailman/listinfo/interest