https://bugs.kde.org/show_bug.cgi?id=467182

--- Comment #3 from theplexg...@gmail.com ---
I would like to point out that neither Proportional nor Centered mode is a
usable workaround for this bug. They do technically prevent it from happening,
but they also prevent me from moving the mouse out of the way of whatever I'm
trying to read or interact with.

I would also like to point out that, contrary to what others have mentioned,
this is not a bug on other desktops.

Windows treats the edges of the monitor you're currently on as the bounding box
for the zoom area, even though Windows also renders multiple monitors as
virtual screens on one giant virtual workspace. The Windows Magnifier has
handled this properly ever since full-screen zoom was added in Windows 7.

macOS also handles things the same way Windows does. I think GNOME does as
well, but there's a more severe rendering bug there that makes my computer
literally unusable; I reported that on the GNOME GitLab months ago.

I've looked at the code for Push mode in the KWin desktop zoom effect. I would
love to attempt a fix, but I have no idea what I'm doing when it comes to
building and testing KWin from source.

<code>
case MouseTrackingPush: {
    // touching an edge of the screen moves the zoom-area in that direction.
    int x = cursorPoint.x() * zoom - prevPoint.x() * (zoom - 1.0);
    int y = cursorPoint.y() * zoom - prevPoint.y() * (zoom - 1.0);
    int threshold = 4;
    xMove = yMove = 0;
    if (x < threshold) {
        xMove = (x - threshold) / zoom;
    } else if (x + threshold > screenSize.width()) {
        xMove = (x + threshold - screenSize.width()) / zoom;
    }
    if (y < threshold) {
        yMove = (y - threshold) / zoom;
    } else if (y + threshold > screenSize.height()) {
        yMove = (y + threshold - screenSize.height()) / zoom;
    }
    if (xMove) {
        prevPoint.setX(std::max(0, std::min(screenSize.width(), prevPoint.x() + xMove)));
    }
    if (yMove) {
        prevPoint.setY(std::max(0, std::min(screenSize.height(), prevPoint.y() + yMove)));
    }
    xTranslation = -int(prevPoint.x() * (zoom - 1.0));
    yTranslation = -int(prevPoint.y() * (zoom - 1.0));
    break;
}
</code>

(Apologies, I don't know how to pre-format code in Bugzilla)

The bug is in the part of the code that detects whether the mouse has hit an
edge of the virtual screen. It allows the mouse cursor to go outside of your
physical screen and into the normally-hidden black bar, but doesn't detect this
happening. I'm aware you can work around it by moving the mouse to another
monitor and continuing to pan, and that's what I currently do, as shown in the
video.

A possible solution could be to track which physical screen the mouse is on,
from the perspective of the zoom area, and find the edges of that screen that
aren't adjacent to another screen. These would be the outer edges of the user's
actual workspace; edges that are adjacent to another screen would be inner
edges.

Then, if the mouse reaches an inner edge, it's allowed to pass through to the
adjacent screen without the zoom area panning. This lets you move your mouse to
other physical screens.

If the mouse reaches an outer edge, however, that's when panning happens. This
way the mouse always stays on a screen, and the user pans the zoom area by
pushing against the edge of one of their outer monitors instead of the edge of
the workspace itself. The mouse would never be able to reach those invisible
black bars.
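
Here's a rough sketch of what I mean by the inner/outer edge detection. It's
standalone C++ with plain Qt types, not the actual KWin effect API; the
function name and the adjacency test are just illustrative assumptions:

<code>
#include <QList>
#include <QRect>

// Which edges of one output are "outer", i.e. not touching another output.
struct OuterEdges {
    bool left = true;
    bool right = true;
    bool top = true;
    bool bottom = true;
};

// screen: geometry of the output the cursor is on, in workspace coordinates.
// allScreens: geometries of every output.
// Note: QRect::right()/bottom() return left()+width()-1 / top()+height()-1,
// hence the +1 in the adjacency comparisons.
OuterEdges findOuterEdges(const QRect &screen, const QList<QRect> &allScreens)
{
    OuterEdges edges;
    for (const QRect &other : allScreens) {
        if (other == screen) {
            continue;
        }
        // Does the other output share part of our vertical/horizontal span?
        const bool vOverlap = other.top() <= screen.bottom() && other.bottom() >= screen.top();
        const bool hOverlap = other.left() <= screen.right() && other.right() >= screen.left();
        if (vOverlap && other.right() + 1 == screen.left()) {
            edges.left = false;   // a screen sits directly to our left
        }
        if (vOverlap && other.left() == screen.right() + 1) {
            edges.right = false;  // a screen sits directly to our right
        }
        if (hOverlap && other.bottom() + 1 == screen.top()) {
            edges.top = false;    // a screen sits directly above us
        }
        if (hOverlap && other.top() == screen.bottom() + 1) {
            edges.bottom = false; // a screen sits directly below us
        }
    }
    return edges;
}
</code>

Push mode would then only apply xMove/yMove when the edge being pushed against
is an outer edge of the cursor's current output, and clamp prevPoint against
that output's geometry instead of the whole virtual screen; inner edges would
simply let the cursor through.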
