Stephane Marchesin wrote:
Jon Smirl wrote:
I'm using Redhat xorg-x11-6.7.0-2
The addresses coming back from these get_proc_address calls don't look quite
right. When one is used in context->gl.get_program_iv_arb(), it segfaults. I'm
using an R128, which does not have hardware shaders.
Should these calls be returning values as if the extensions were supported? I get the
same behavior with software Mesa and the R128 DRI driver. They work fine on the ATI driver.
    context->gl.gen_programs_arb =
        (glitz_gl_gen_programs_arb_t)
        glitz_glx_get_proc_address (thread_info, "glGenProgramsARB");
    context->gl.delete_programs_arb =
        (glitz_gl_delete_programs_arb_t)
        glitz_glx_get_proc_address (thread_info, "glDeleteProgramsARB");
    context->gl.program_string_arb =
        (glitz_gl_program_string_arb_t)
        glitz_glx_get_proc_address (thread_info, "glProgramStringARB");
    context->gl.bind_program_arb =
        (glitz_gl_bind_program_arb_t)
        glitz_glx_get_proc_address (thread_info, "glBindProgramARB");
    context->gl.program_local_param_4d_arb =
        (glitz_gl_program_local_param_4d_arb_t)
        glitz_glx_get_proc_address (thread_info, "glProgramLocalParameter4dARB");
    context->gl.get_program_iv_arb =
        (glitz_gl_get_program_iv_arb_t)
        glitz_glx_get_proc_address (thread_info, "glGetProgramivARB");

    if (context->gl.get_program_iv_arb) {
        context->gl.get_program_iv_arb (GLITZ_GL_FRAGMENT_PROGRAM_ARB,
                                        GLITZ_GL_MAX_PROGRAM_TEX_INDIRECTIONS_ARB,
                                        &context->texture_indirections);
    }
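Note that the code above gates the call only on the pointer being non-NULL. A safer gate is the GL_EXTENSIONS string. Here is a minimal sketch of a token-aware check (the helper name is mine, not part of glitz); a plain strstr() is not enough, because "GL_ARB_fragment_program" would falsely match inside "GL_ARB_fragment_program_shadow":

```c
#include <string.h>

/* Return 1 if `name` appears as a complete, space-delimited token in an
 * OpenGL extension string (as returned by glGetString(GL_EXTENSIONS)),
 * 0 otherwise. */
static int
gl_has_extension (const char *extensions, const char *name)
{
    const char *p;
    size_t len;

    if (!extensions || !name || !*name)
        return 0;

    len = strlen (name);
    p = extensions;
    while ((p = strstr (p, name)) != NULL) {
        int starts_token = (p == extensions) || (p[-1] == ' ');
        int ends_token   = (p[len] == '\0') || (p[len] == ' ');
        if (starts_token && ends_token)
            return 1;
        p += len;  /* partial match; keep scanning */
    }
    return 0;
}
```

With this in place, the ARB_fragment_program lookups would only be performed (or only be trusted) when gl_has_extension(extensions, "GL_ARB_fragment_program") returns 1.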
Hi,
(disclaimer: maybe I'm totally off the mark here. I'm sorry if that's the case; please restrain your flames :)
On the SDL mailing list, someone reported a similar problem some time ago: using SDL_GL_GetProcAddress with the nvidia binary drivers, he could get pointers for most OpenGL functions but not for glGetProgramivARB (FYI, on Linux/X11, SDL_GL_GetProcAddress uses a dlopened glXGetProcAddressARB).
Here's a link to the beginning of the thread: http://twomix.devolution.com/pipermail/sdl/2004-February/059806.html We couldn't solve the issue, and I couldn't reproduce the problem myself.
Now, the problem is that I don't see a reason (with my partial view of the problem, though maybe others will) for these to be related, since:
- the nvidia binary drivers provide their own libglx module
- the OpenGL library used is different
Maybe there's some code in common somewhere, but you'll have to tell me that.
Here's the deal. glXGetProcAddress *NEVER* returns NULL. It returns a pointer to a dispatch function. If you request an unknown function, it will dynamically generate a dispatch stub for it. Try calling glXGetProcAddressARB((const GLubyte *) "glThisFunctionDoesntExist"). Getting a pointer back isn't enough: you have to look at the extension string to be sure the extension is supported. If I'm not mistaken, the GLX spec says that calling a function for an unsupported extension gives "undefined" behavior.
There is no way, short of great pain (and I'm not even sure then), that we could prevent the segfault. Using the pointer without checking for the extension is right up there with not checking the return value from malloc.
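That never-NULL behavior is easy to model. The following is a toy stand-in, NOT the real GLX implementation, just to illustrate why a non-NULL return proves nothing about support:

```c
#include <stddef.h>

/* Toy model of glXGetProcAddressARB's documented behavior: it hands back
 * a dispatch stub even for names it has never heard of, so the caller
 * always sees a non-NULL, callable pointer. */
typedef void (*gl_proc_t) (void);

static void
dispatch_stub (void)
{
    /* In the real library this would jump through the dispatch table;
     * for an unsupported extension the result is undefined. */
}

static gl_proc_t
model_get_proc_address (const char *name)
{
    (void) name;           /* the name doesn't affect the result */
    return dispatch_stub;  /* never NULL, even for nonsense names */
}
```

So model_get_proc_address("glThisFunctionDoesntExist") still hands back a callable pointer; only the extension string tells you whether calling through it is safe.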
--
_______________________________________________
Dri-devel mailing list
[EMAIL PROTECTED]
https://lists.sourceforge.net/lists/listinfo/dri-devel
