Thanks, all.
I think that answers the question. Really, Wayland is not an appropriate
place for proper multi-GPU processing (which would be invisible to an
application). So, if it were to be done "the right way", it would be in the
compositor, and the compositor needs better support from DRM and ...

Hi,
On 2 October 2014 23:13, Dave Airlie wrote:
> On 1 October 2014 03:15, Daniel Stone wrote:
> > On 30 September 2014 16:44, Jasper St. Pierre wrote:
> >> It's a great question, with a complicated answer. Part of this is the
> >> fault of the DRM kernel interface, which is being improved.

Hi,
On 30 September 2014 16:44, Jasper St. Pierre wrote:
> It's a great question, with a complicated answer. Part of this is the
> fault of the DRM kernel interface, which is being improved. Part of it is
> the fault of GL/EGL, which really doesn't have proper multi-GPU support.
>
EGL_EXT_device...
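Daniel is presumably pointing at the EGL_EXT_device_* family of extensions
here. As a rough sketch (not from the original mail) of what per-GPU device
selection looks like with them, assuming the driver exposes
EGL_EXT_device_enumeration, EGL_EXT_device_query with EGL_EXT_device_drm,
and EGL_EXT_platform_device:

/* Hedged sketch: enumerate EGL devices and create a display on a specific
 * GPU. Which of these extensions a given driver actually exposes is an
 * assumption; real code should check the EGL client extension string. */
#include <stdio.h>
#include <EGL/egl.h>
#include <EGL/eglext.h>

int main(void)
{
    PFNEGLQUERYDEVICESEXTPROC queryDevices =
        (PFNEGLQUERYDEVICESEXTPROC) eglGetProcAddress("eglQueryDevicesEXT");
    PFNEGLQUERYDEVICESTRINGEXTPROC queryDeviceString =
        (PFNEGLQUERYDEVICESTRINGEXTPROC) eglGetProcAddress("eglQueryDeviceStringEXT");
    PFNEGLGETPLATFORMDISPLAYEXTPROC getPlatformDisplay =
        (PFNEGLGETPLATFORMDISPLAYEXTPROC) eglGetProcAddress("eglGetPlatformDisplayEXT");

    if (!queryDevices || !queryDeviceString || !getPlatformDisplay) {
        fprintf(stderr, "EGL_EXT_device_* entry points not available\n");
        return 1;
    }

    EGLDeviceEXT devices[8];
    EGLint num_devices = 0;
    if (!queryDevices(8, devices, &num_devices) || num_devices == 0) {
        fprintf(stderr, "no EGL devices found\n");
        return 1;
    }

    for (EGLint i = 0; i < num_devices; i++) {
        /* EGL_DRM_DEVICE_FILE_EXT maps the device back to its DRM node. */
        const char *node = queryDeviceString(devices[i], EGL_DRM_DEVICE_FILE_EXT);
        printf("EGL device %d: %s\n", (int) i, node ? node : "(no DRM node reported)");
    }

    /* Bind an EGLDisplay to one particular GPU rather than "the default". */
    EGLDisplay dpy = getPlatformDisplay(EGL_PLATFORM_DEVICE_EXT, devices[0], NULL);
    if (dpy == EGL_NO_DISPLAY)
        fprintf(stderr, "eglGetPlatformDisplayEXT failed\n");

    return 0;
}

Link with -lEGL; this only shows device selection, not context creation.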
It's a great question, with a complicated answer. Part of this is the fault
of the DRM kernel interface, which is being improved. Part of it is the
fault of GL/EGL, which really doesn't have proper multi-GPU support.
GPUs are exposed through the kernel as /dev/dri/card* devices, with the
first GPU being card0.
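For illustration, a minimal libdrm sketch of that enumeration; the loop
bound of eight nodes and the assumption that card numbering follows probe
order are assumptions, not something the kernel guarantees:

/* Hedged sketch: open each /dev/dri/card* node and ask libdrm which
 * driver sits behind it. */
#include <stdio.h>
#include <fcntl.h>
#include <unistd.h>
#include <xf86drm.h>

int main(void)
{
    char path[32];

    for (int i = 0; i < 8; i++) {
        snprintf(path, sizeof(path), "/dev/dri/card%d", i);

        int fd = open(path, O_RDWR | O_CLOEXEC);
        if (fd < 0)
            continue;  /* no such device node */

        drmVersionPtr ver = drmGetVersion(fd);
        if (ver) {
            printf("%s: driver %s\n", path, ver->name);
            drmFreeVersion(ver);
        }
        close(fd);
    }
    return 0;
}

Build against libdrm (pkg-config --cflags --libs libdrm).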
Xorg has been notorious for its lack of multi-monitor support. Xinerama
was, of course, shimmed into place to help make one X screen out of many.
Xinerama has several drawbacks, however, one of them being performance,
which is usually compensated for in the industry by purchasing beefier
graphics cards.