On Thu, 25 Mar 2004 18:07:29 -0800
Ian Romanick <[EMAIL PROTECTED]> wrote:

> Well boys and girls, it looks like I took just enough rope to shoot all 
> of us in the foot.  As part of the new interface work, I added a field 
> to the __GLXContextRec to track the fbconfig used to create it.  That 
> field is then only used in driBindContext2 (lib/GL/dri/dri_util.c).  The 
> problem is that this field is OUTSIDE the __DRIContextRec portion of the 
> __GLXContextRec.  When I created the new __DRIContextRec (in 
> include/GL/internal/dri_interface.h in the Mesa tree), I added a mode 
> field and some other fields.

I'm not sure I understand this. You added mode to GLXcontextRec in one
header and to DRIcontextRec in the other header. Is it that simple, or
am I missing a more subtle detail?

> 
> *_dri.so binaries compiled before dri_interface.h was used (it is 
> currently only used in my private tree) expect mode to be at offset N, 
> but a libGL.so compiled with dri_interface.h has it at offset N+12. 
> Basically, if the *_dri.so accesses anything outside the 
> __DRIContextRec part, it is impossible to add any fields to 
> __DRIContextRec.  Oops.

dri_interface.h doesn't seem to be used for libGL. In the generated
Makefile the only dependency in Mesa/include/GL/internal is glcore.h.
AFAICT libGL uses GLXcontextRec and DRIcontextRec from glxclient.h.
Should libGL really compile against include files from Mesa? My
understanding was that libGL defines its API and Mesa drivers use it. So
Mesa drivers should use libGL headers and not the other way around.

I'm assuming that dri_interface.h and glcore.h in Mesa/lib/GL/internal
are used to build Mesa drivers in the absence of libGL sources?

Your patch adds mode in DRIcontextRec in both header files. So the
binary layout of GLXcontextRec changes as a consequence. I assume that
GLXcontextRec is only used internally in libGL?

> 
> I did a bit of research, and this bug only exists in DRI CVS.  The 
> exposure is limited to our binary snapshots.  Attached is a patch that 
> fixes the problem.  I am able to detect the bad libGL version in 
> __driUtilCreateNewScreen.  When version 20031201 is detected, the driver 
> will refuse to load.  That prevents weird crashes when a broken libGL is 
> used with a fixed driver.  I don't see a way to fix things the other way 
> around.  If a fixed libGL is used with a broken driver, it will just 
> inexplicably crash.
> 
> There is a way to work around problems like this in the future.  Each 
> *_dri.so can export a new symbol, something like __driDesiredAPIVersion, 
> that contains the highest version the driver wants to use.  If this 
> symbol already existed, libGL could do the following to reject broken 
> drivers:
> 
>     /* dlsym() returns NULL for drivers that predate the symbol. */
>     const unsigned *desired_version =
>        (const unsigned *) dlsym(handle, "__driDesiredAPIVersion");
>     if ( desired_version != NULL && *desired_version == 20031201 ) {
>        ErrorMessageF("Driver uses broken API.  Upgrade.");
>        Xfree(driver);
>        dlclose(handle);
>        return NULL;
>     }
> 
> This also provides a mechanism to deprecate the "old" DRI API.  The 
> attached patch does not include any __driDesiredAPIVersion stuff.

Having a safe way to deprecate an old API sounds like a good thing.
Having a way to work around binary incompatibilities that shouldn't
exist in the first place doesn't sound that important to me, though,
since the exposure is very limited. We should make sure that a broken
API is never officially released anyway.

> 
> Comments & opinions are welcome, as always.
> 
> 

Felix

