On Fri, Dec 05, 2003 at 12:14:18AM +0100, Felix Kühling wrote:
> Hi,
>
> as I'm trying to port the savage driver I stumbled over this. The mga
> and r128 drivers define different values for DEPTH_SCALE in xxx_tris.c.
> This is a parameter of t_dd_tritmp.h and specifies the minimum
> resolvable unit of depth coordinates for computing polygon offsets. In
> mgatris.c it is defined as mmesa->depth_scale, while in r128_tris.c it
> is defined as 1.0. I couldn't find any difference in the way the two
> drivers set up vertices (including the hw_viewport matrix **) that
> would explain the difference. I'm wondering which one is correct. My
> guess, based on my vague memory of how the projection transformation
> works, is mga, but I may be wrong. Any ideas?
At least on mga, glean's polygonOffset test fails with 1.0. It passes with
depth_scale and a 16-bit depth buffer. The test fails with 15-bit, 24-bit
and 32-bit depth buffers regardless of this value. I can make 15-bit pass
with a small adjustment to depth_scale...

-- 
Ville Syrjälä
[EMAIL PROTECTED]
http://www.sci.fi/~syrjala/
