This looks good to me in general -- for Gecko, this combined with OffscreenCanvas and canvas/WebGL in workers is going to be the best way to get performant WebGL-based VR. It is also likely the better way to solve the custom-vsync-for-VR issue; while the large patch queue that I have does work, it adds significant complexity to Gecko's vsync handling and is unlikely to be used by any other system. You'll want to make sure that the gfx folks weigh in on this as well.
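For reference, the worker path I have in mind looks roughly like this. Treat it as a sketch only -- transferControlToOffscreen(), WebGL contexts in workers, and requestAnimationFrame in workers are all still in flux, so the exact names and availability here are assumptions (the worker filename is made up for illustration):

    // Main thread: hand rendering control of the canvas off to a worker.
    const canvas = document.querySelector('canvas');
    const offscreen = canvas.transferControlToOffscreen();
    const worker = new Worker('vr-render-worker.js');
    // The OffscreenCanvas is transferred, not copied, so the worker owns it.
    worker.postMessage({ canvas: offscreen }, [offscreen]);

    // vr-render-worker.js: render WebGL frames entirely off the main thread.
    onmessage = (event) => {
      const gl = event.data.canvas.getContext('webgl');
      function drawFrame() {
        gl.clearColor(0, 0, 0, 1);
        gl.clear(gl.COLOR_BUFFER_BIT);
        // ...draw the left and right eye viewports here...
        requestAnimationFrame(drawFrame); // assumes rAF is exposed to workers
      }
      requestAnimationFrame(drawFrame);
    };

With this shape, main-thread jank (layout, GC pauses) no longer stalls frame submission, which is the whole point of routing VR rendering through workers.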
Some comments -- if you want direct front-buffer rendering from canvas, this will be tricky; you'll have to add support to canvas itself, because right now the rendering context always allocates its own buffers. That is, you won't be able to render directly to the texture that goes into an Oculus compositor layer textureset, for example, even though that's what you really want to do. But I'd get the core working first, then work on eliminating that copy and sharing the textureset surfaces with the WebGL canvas. Same thing with support for Oculus Home as well as allowing for HTML layers; those should probably be later steps (HTML/2D layers will need to be rendered on the main thread and submitted from there, so timing them between worker-offscreen-canvas layers and the main thread could be tricky).

One more thing -- to make sure we're reading the explicit SubmitFrame contract the same way, I've put a short sketch of the submission loop below the quoted message.

- Vlad

On Tue, May 10, 2016 at 6:18 PM Kearwood "Kip" Gilbert <kgilb...@mozilla.com> wrote:
> Hello All,
>
> In order to support features in the WebVR 1.0 API (https://mozvr.com/webvr-spec/) and to improve performance for WebVR, I would like to implement an optimized path for submitting Canvas and OffscreenCanvas frames to VR headsets. The WebVR 1.0 API introduces "VR Layers", explicit frame submission, and presenting different content to the head-mounted display independently of the output to the regular 2D monitor. I would like some feedback on a proposed “VR Compositor” concept that would enable this.
>
> *What would be common between the “VR Compositor” and the regular “2D Compositor”?*
> - TextureHost and TextureChild would be used to transfer texture data across processes.
> - When a content process crashes, the VR Compositor would continue to run.
> - There is a parallel between regular layers created by layout and “VR Layers”.
> - There would be one VR Compositor serving multiple content processes.
> - The VR Compositor would not allow unprivileged content to read back frames submitted by other content or by chrome UX.
> - Both compositors would exist in the “Compositor” process, but in different threads.
>
> *What is different about the “VR Compositor”?*
> - The VR Compositor would extend the PVRManager protocol to include VR Layer updates.
> - The VR Compositor will not obscure the main 2D output window or require entering full screen to activate a VR headset.
> - In most cases, there will be no visible window created by the VR Compositor, as the VR frames are presented using VR-specific APIs that bypass the OS-level window manager.
> - The VR Compositor will not run synchronously with a refresh driver, as it can simultaneously present content with mixed frame rates.
> - Texture updates submitted for VR Layers would be rendered as soon as possible, often asynchronously with other VR Layer updates.
> - VR Layer textures will be pushed from both Canvas elements and OffscreenCanvas objects, enabling WebVR in Web Workers.
> - The VR Compositor will guarantee perfect frame uniformity, with each frame associated with a VR headset pose explicitly passed into VRDisplay.SubmitFrame. No frames will be dropped, even if multiple frames are submitted within a single hardware vsync.
> - For most devices (e.g., Oculus and HTC Vive), the VR Compositor will perform front-buffer rendering.
> - VR Layers move asynchronously with the user’s HMD pose between VR Layer texture updates, if given geometry and a position within space.
> - The VR Compositor implements latency-hiding effects such as Asynchronous Time Warp and Pose Prediction.
> - The VR Compositor will be as minimal as possible. In most cases, the VR Compositor will offload the actual compositing to the VR device runtimes. (Both Oculus and HTC Vive include a VR compositor.)
> - When the VR device runtime does not supply a VR compositor, we will emulate this functionality (e.g., for Cardboard VR).
> - All VR hardware API calls will be made exclusively from the VR Compositor’s thread.
> - The VR Compositor will implement focus handling, window management, and other functionality required for Firefox to be launched within environments such as Oculus Home and SteamVR.
> - To support backwards compatibility and fall-back views of 2D web content within the VR headset, the VR Compositor could provide an nsWidget / nsWindow interface to the 2D compositor. The 2D compositor’s output would be projected onto the geometry of a VR Layer and updated asynchronously with HMD poses.
> - The VR Compositor will not allocate unnecessary resources until either WebVR content is accessed or the browser is launched from within a VR-only environment such as Oculus Home, SteamVR, or GearVR.
>
> Early WIP patches implementing the WebVR 1.0 API are in Bug 1250244. I expect the first implementation to be minimal, but to lay the foundation for what will eventually become the VR Compositor.
>
> Thanks for giving this a review — I look forward to your feedback!
>
> Cheers,
>
> - Kearwood “Kip” Gilbert
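As promised above the quote, here's the explicit-submission loop as I read the 1.0 draft. Again a sketch, not tested against the WIP patches in bug 1250244; the getPose()/submitFrame() signatures were still moving in the spec at this point, so assume the shapes below are approximate:

    // Sketch of the WebVR 1.0 explicit frame submission loop (draft-era API).
    navigator.getVRDisplays().then((displays) => {
      if (displays.length === 0) return;
      const vrDisplay = displays[0];
      const canvas = document.querySelector('canvas');
      const gl = canvas.getContext('webgl');

      // A VR Layer: the VR compositor consumes this canvas as its source.
      vrDisplay.requestPresent([{ source: canvas }]).then(() => {
        function onVRFrame() {
          // Sample the headset pose once for this frame...
          const pose = vrDisplay.getPose();
          gl.clear(gl.COLOR_BUFFER_BIT | gl.DEPTH_BUFFER_BIT);
          // ...render the left and right eye viewports using that pose...

          // Hand the finished frame back together with the pose it was
          // rendered with; pairing them is what lets the compositor keep
          // frame/pose uniformity and apply time warp correctly.
          vrDisplay.submitFrame(pose);
          vrDisplay.requestAnimationFrame(onVRFrame);
        }
        vrDisplay.requestAnimationFrame(onVRFrame);
      });
    });

If submitFrame() can land mid-vsync without dropping frames, as the proposal describes, then nothing in this loop needs to know about the hardware refresh rate -- which is exactly the property my custom-vsync patches were trying to get at.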