> Each monitor will only use the GPU that it is connected to.

FWIW, I find the terminology used in the PC graphics card industry very confusing. In my view, there are four different kinds of components in a graphics system:
- Memory: this can be dedicated "video RAM" or just a chunk of your normal RAM, shared with the rest of the system.
- Display engine (DE): the thing that reads a frame buffer from some memory and sends the corresponding data out to your monitor(s) via the output connector(s).
- Video processing unit (VPU): a specialized element with support for decoding/encoding some specific video and audio formats. It takes its input from some memory and writes its output to some other part of memory. E.g. when playing a video, it typically reads the compressed stream from system RAM and writes the decoded frames to some (part of a) frame buffer.
- GPU: a processor dedicated to doing the kind of number crunching used for 3D rendering. It takes its inputs (e.g. textures and scene descriptions) from some memory and outputs the rendered scene to some (part of a) frame buffer. Nowadays these processors have grown sufficient functionality that they can also be used for other kinds of number crunching.

Most "integrated graphics" consist of the last three components above and use the system's normal DRAM for their memory needs. Most discrete PC graphics cards include all four components, and they typically can't use the system's normal DRAM in the same way they use their own video RAM. This means you often can't use your discrete GPU to render a 3D scene directly into the frame buffer read by the DE of your integrated graphics.

Stefan
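On Linux, the pairing of monitors with a particular card's display engine is visible through the DRM subsystem: each connector appears under sysfs as `cardN-<connector>`, and its `status` file says whether a monitor is attached. A minimal sketch, assuming a Linux system with DRM (connector names and counts vary per machine, and the loop prints nothing if no DRM devices exist):

```shell
#!/bin/sh
# List each DRM connector and whether a monitor is plugged into it.
# Assumes Linux with the DRM subsystem mounted under /sys/class/drm.
for conn in /sys/class/drm/card[0-9]*-*; do
    # Skip if the glob matched nothing or the entry is not a connector.
    [ -e "$conn/status" ] || continue
    printf '%s: %s\n' "${conn##*/}" "$(cat "$conn/status")"
done
```

On a dual-GPU machine you would typically see connectors listed under both card0 and card1; a monitor shows "connected" only on the card whose DE actually drives it, which is the point made above.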