I had a similar problem in Ubuntu Edgy, so rather than filing a new bug I am subscribing to this one.

I have a Radeon X300 card driving a 30" LCD TV (Videoseven LTV30C) whose native 
resolution is 1280x768. 
The maximum reported resolution, 1280x1024, was detected correctly and worked 
fine with a fresh install (it was automatically selected as the default). 
The problem is with the native 1280x768 mode: it was detected as an available 
resolution (an improvement over previous Ubuntu versions), but its timings were 
wrong, so the screen flickers.

Reading the X log, I can see the additional mode data reported by the
monitor. It is correct, and it works fine when used in a Modeline.
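To illustrate the workaround described above, here is a sketch of an xorg.conf
Monitor section carrying an explicit Modeline. The identifier and the timing
numbers below are illustrative only (a generic CVT-style 1280x768 at 60 Hz);
the real values should be copied from the detailed-timing block the monitor
reports in /var/log/Xorg.0.log.

```
Section "Monitor"
    Identifier  "LTV30C"          # hypothetical name; match your Screen section
    # Illustrative timings only -- replace with the mode data printed
    # in the X log for this panel (e.g. output of `cvt 1280 768 60`).
    Modeline    "1280x768_60.00"  79.50  1280 1344 1472 1664  768 771 781 798 -hsync +vsync
EndSection
```

The new mode name ("1280x768_60.00" here) must then also be listed first in the
Modes line of the Display subsection of the Screen section so that X selects it.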

I think this is a bug (or a missing requirement): the timings of the
additional modes reported by the monitor should be used where they
currently are not, and corrected where they are used incorrectly.

I build and sell computers and try to get people to use Linux, but the first 
thing a person sees is the screen, so I think robust screen detection is very 
important. 
I use three widescreen monitors with different resolutions (a 17" laptop at 
1920x1200, a 19" wide at 1400x900, and the 30" wide at 1280x768), and for the 
last two I had to edit the configuration manually.

I am committed to helping improve the user experience. I routinely
install on many different kinds of hardware, so I can report on
fresh installations without problem.

Thank you

-- 
[mga] mode validation seems to treat ModeLines and DDC dtimings differently
https://launchpad.net/bugs/12502

-- 
ubuntu-bugs mailing list
ubuntu-bugs@lists.ubuntu.com
https://lists.ubuntu.com/mailman/listinfo/ubuntu-bugs