I'm just about finished with the GLX enhancements that I've been working on for the past couple of weeks. There are only two little problems left. I've got a solution for one, but the other has me stumped.

One of the extensions that I'm implementing is GLX_OML_sync_control (link below). One of the functions in that extension is glXGetMscRateOML, which returns the current vsync rate. Since this data is available within X, it seems that it would be better to put the implementation in libGL.so (glxcmds.c, actually) rather than in each driver. My first try was to use something like:

GLXContext gc = __glXGetCurrentContext();
ScrnInfoPtr sInfo;
float refresh;

sInfo = xf86Screens[ gc->screen ];
refresh = sInfo->currentMode->VRefresh;

/* Do black magic with refresh to convert the float into the
 * numerator / denominator pair that glXGetMscRateOML returns.
 */

The problem with that is that xf86Screens lives in the X server's address space, so libGL.so can't get at it.
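Whichever way the refresh rate is obtained, the "black magic" conversion step above can be sketched roughly as follows. This is just a sketch, not the actual implementation: scale the float to 1/1000 Hz precision and reduce by the GCD (the helper names are made up).

```c
#include <stdint.h>

/* Reduce a fraction by Euclid's algorithm. */
static int32_t
gcd32(int32_t a, int32_t b)
{
    while (b != 0) {
        int32_t t = a % b;
        a = b;
        b = t;
    }
    return a;
}

/* Hypothetical helper: convert a floating-point refresh rate into the
 * numerator / denominator pair that glXGetMscRateOML reports.  Scaling
 * by 1000 keeps three decimal places, which is enough to distinguish
 * e.g. 59.94 Hz from 60 Hz. */
void
refresh_to_fraction(float refresh, int32_t *numerator, int32_t *denominator)
{
    int32_t n = (int32_t)(refresh * 1000.0 + 0.5);  /* 59.94 -> 59940 */
    int32_t d = 1000;
    int32_t g = gcd32(n, d);

    *numerator = n / g;       /* 59.94 -> 2997 */
    *denominator = d / g;     /* 59.94 -> 50   */
}
```

So a 59.94 Hz NTSC-style mode comes out as 2997/50, and a plain 60 Hz mode reduces to 60/1.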

My next try was to use the XVidMode extension (based on xvidtune.c). This was somewhat problematic until I realized that I had to link libGL.so with libXxf86vm.a. Is this a good idea? The only other libraries that libGL links with are "core" libraries (libm, libX11, libc, etc.). Is it okay for libGL to depend on an XFree86 extension library? Is there some better way to do this?
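For reference, the arithmetic the XVidMode path boils down to: XF86VidModeGetModeLine() hands back a dot clock in kHz plus a modeline whose htotal/vtotal fields give the total horizontal and vertical timing, and the vertical refresh follows from those. A sketch, written as a plain helper so it needs no X connection (the function name and the sample numbers are mine, not from the code):

```c
/* Vertical refresh from modeline timings, as one would compute it from
 * the values XF86VidModeGetModeLine() returns:
 *
 *     refresh_Hz = dotclock_kHz * 1000 / (htotal * vtotal)
 */
float
modeline_refresh(int dotclock_khz, int htotal, int vtotal)
{
    return (float) dotclock_khz * 1000.0f / ((float) htotal * (float) vtotal);
}
```

With a hypothetical 94500 kHz dot clock and 1600x984 total timings this lands right around 60 Hz, and the result would then feed into the float-to-fraction conversion.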

It works right now, but I won't commit anything until I get some sort of a thumbs up. When I do commit it, I will commit it to the texmem branch first.

http://oss.sgi.com/projects/ogl-sample/registry/OML/glx_sync_control.txt



_______________________________________________
Dri-devel mailing list
[EMAIL PROTECTED]
https://lists.sourceforge.net/lists/listinfo/dri-devel
