https://bugs.kde.org/show_bug.cgi?id=450110
Stefan Hoffmeister <stefan.hoffmeis...@econos.de> changed:

           What    |Removed |Added
----------------------------------------------------------------------------
                 CC|        |stefan.hoffmeister@econos.de

--- Comment #1 from Stefan Hoffmeister <stefan.hoffmeis...@econos.de> ---
I think it would be very helpful to describe the physical connector setup you
are running.

Case in point: on X11, I have a setup where

* the Intel iGPU (only) controls the internal display and HDMI
* the Nvidia dGPU (only) controls the USB-C output path (i.e. DisplayPort
  Alternate Mode et al.)
* Nvidia is in PRIME offload mode; Intel is primary

Now connect an external 4K screen to the Nvidia output path, i.e. USB-C /
DisplayPort. Now the *Xorg* process starts consuming between 25% and 40% of
one CPU on an otherwise totally idle system.

Remove the 4K screen from the Nvidia output path and attach it to the HDMI
port (Intel) instead. Now Xorg is totally fine.

Some sleuthing with perf suggests that all of that CPU is burnt on getting the
current system time (gettimeofday / clock_gettime) via the vDSO and kernel
calls, with this originating from the Nvidia driver (510).
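For reference, the sleuthing can be reproduced with standard perf tooling,
e.g. "perf top -p $(pidof Xorg)" or "perf record -g -p <pid>" followed by
"perf report", which shows the time queries as the hot frames. Below is a
minimal, hypothetical C sketch (not actual Nvidia driver code) of the kind of
busy time-polling loop that produces such a profile: each clock_gettime() call
is cheap because it is served by the vDSO, but issued in a tight loop it still
pins a CPU core.

/* Hypothetical sketch, not driver code: busy-poll the clock for ~1 second,
 * the pattern that makes clock_gettime()/gettimeofday() dominate a perf
 * profile on an otherwise idle system.
 *
 * Build: cc -O2 poll_clock.c -o poll_clock
 */
#include <stdio.h>
#include <time.h>

int main(void)
{
    struct timespec start, now;
    unsigned long calls = 0;

    clock_gettime(CLOCK_MONOTONIC, &start);

    /* Spin on the clock instead of sleeping, as a driver waiting on a
     * deadline without blocking effectively does. */
    do {
        clock_gettime(CLOCK_MONOTONIC, &now);
        calls++;
    } while (now.tv_sec - start.tv_sec < 1);

    printf("%lu clock_gettime() calls in ~1 s\n", calls);
    return 0;
}

Profiling this sketch with perf should give a similarly clock_gettime-heavy
picture to the one the idle Xorg process shows here.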