Gwenole, Pawel,

If you can find that ancient code, I would be interested in seeing it. I need to encode the output of an OpenGL application to H.264 for network streaming on a headless Ivy Bridge platform.
Regards,
Chris

On Wed, Jul 31, 2013 at 8:37 PM, Gwenole Beauchesne <[email protected]> wrote:
> Hi Pawel,
>
> 2013/8/1 Pawel Osciak <[email protected]>:
>
> > Is there any way to do zero-copy encode/video processing from a texture
> > source? This would, I guess, involve turning a texture/X drawable into a
> > VASurface? I think Gwenole mentioned some extensions he was working on
> > back in April 2012, but I haven't found any of this in the code or
> > examples...
>
> The Intel driver supports GEM buffer and DMA buffer imports. So, if
> you can expose a GEM buffer, or a dma_buf, from a texture, then you can
> create a VA surface from it. Then you can use VA/VPP to convert to
> NV12 tiled and kick off encoding from that. This is what we do for Weston,
> for example, to encode from the EGLSurface.
>
> I could dig up some ancient code at the office if you want, with no
> guarantee it still works. :)
>
> Note: the recommended way to encode is through some middleware
> (libgstvaapi, OMX, Media SDK), unless you really want to spend some
> time tuning the encoding process with libva.
>
> Regards,
> Gwenole.
> _______________________________________________
> Libva mailing list
> [email protected]
> http://lists.freedesktop.org/mailman/listinfo/libva
