On Fri, Aug 9, 2013 at 8:28 PM, Ian Romanick wrote:
> On 08/09/2013 04:22 AM, Martin Andersson wrote:
>>
>> I think I have found an issue in the piglit test.
>>
>> Marek, could you take a look at the attached patch and see if it looks
>> correct. If so I will send it to the piglit list.
>
>
> Wow. That test is confusing. It would be a lot more obvious (and less
> error prone) …

If I understand the code correctly, value[3] should be 1.
If it were 0, the bias would be 1, therefore adding 1 and clamping the
alpha (because of the RGBA8 colorbuffer) would always return 1 no
matter what the texture fetch returned.
Anyway, if the texture fetch returned 0x…, the test would …
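
To make the arithmetic concrete, here is a worked sketch of that
argument. It is an illustration only: it assumes the test effectively
computes result = clamp(fetched + bias, 0, 1) with bias = 1 - value[3],
which follows the description above rather than the literal piglit code.

#include <stdio.h>

/* Clamp to [0, 1], as writing to an RGBA8 colorbuffer does. */
static float clampf(float x) { return x < 0 ? 0 : (x > 1 ? 1 : x); }

int main(void)
{
   /* Possible alpha values a (buggy or correct) fetch might return. */
   float fetched[] = { 0.0f, 0.5f, 1.0f };

   for (int i = 0; i < 3; i++) {
      /* value[3] = 0 gives bias = 1: every result clamps to 1, so the
       * alpha comparison can never fail, whatever the fetch returned. */
      printf("bias=1: fetched=%.1f -> result=%.1f\n",
             fetched[i], clampf(fetched[i] + 1.0f));

      /* value[3] = 1 gives bias = 0: the result actually depends on
       * what the texture fetch wrote to the alpha channel. */
      printf("bias=0: fetched=%.1f -> result=%.1f\n",
             fetched[i], clampf(fetched[i] + 0.0f));
   }
   return 0;
}
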
I think I have found an issue in the piglit test.
Marek, could you take a look at the attached patch and see if it looks
correct. If so I will send it to the piglit list.
//Martin

On Tue, Aug 6, 2013 at 11:20 PM, Marek Olšák wrote:
Sorry, I have no idea. You can try to remove support for RGBX integer
formats and see if it helps.
In is_format_supported, return FALSE for all R?G?B?X_?INT formats.
Marek
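
As a sketch of that suggestion (a hypothetical helper, not actual r600g
code), the check could key off Gallium's format description instead of
enumerating every R?G?B?X_?INT enum by hand:

#include "util/u_format.h"

/* Hypothetical helper: an RGBX pure-integer format is a four-channel
 * pure-integer format whose fourth (X) channel is unused padding. */
static boolean
is_rgbx_integer_format(enum pipe_format format)
{
   const struct util_format_description *desc =
      util_format_description(format);

   return util_format_is_pure_integer(format) &&
          desc->nr_channels == 4 &&
          desc->channel[3].type == UTIL_FORMAT_TYPE_VOID;
}

The experiment would then be a two-line change in the driver's
is_format_supported: if is_rgbx_integer_format(format) is true, return
FALSE.
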
On Sat, Aug 3, 2013 at 11:51 AM, Martin Andersson wrote:
Well, I should have been more clear. If I do this:
263: value[3] = 0;
290: expected[3] = 1.0;
The test always passes, but if I only do this:
290: expected[3] = 1.0;
The test fails with this error:
texture-integer: failure with format GL_RGB8I_EXT:
texture color = 92, 126, 14, 104
expected color = …

FragShaderText contains the shader code. Anyway, we have found the
issue: expected[3] should really be set to 1.0, because RGB formats
must return (r,g,b,1). It's a bug in the piglit test.
Marek
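
Spelled out as a stand-alone illustration (not the actual patch; the
texel values are examples echoing the failure output quoted in this
thread, and the cited line numbers are the ones discussed above):

#include <stdio.h>

int main(void)
{
   int   value[4]    = { 92, 126, 14, 0 };
   float expected[4] = { 0.25f, 0.5f, 0.75f, 0.0f };

   /* The fix: a format like GL_RGB8I_EXT stores no alpha, and sampling
    * it returns (r, g, b, 1), so the reference texel (around
    * texture-integer.c:263) and the expected color (around
    * texture-integer.c:290) both need 1 in the alpha channel. */
   value[3] = 1;
   expected[3] = 1.0f;

   printf("value[3] = %d, expected[3] = %g\n", value[3], expected[3]);
   return 0;
}
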
On Fri, Aug 2, 2013 at 11:44 PM, Martin Andersson wrote:
On Fri, Aug 2, 2013 at 2:52 PM, Marek Olšák wrote:
> The format doesn't have alpha. See what the texture fetch writes to
> the alpha channel.
I looked at the code but I can't figure out where the texture fetch
happens, could you point me in the right direction?
>
> You may try setting "texture-integer.c:290" to "expected[3] = 1.0;"

The format doesn't have alpha. See what the texture fetch writes to
the alpha channel.
You may try setting "texture-integer.c:290" to "expected[3] = 1.0;"
Marek

On Fri, Aug 2, 2013 at 2:15 PM, Martin Andersson wrote:
Hi,
I started to look at why the spec/!OpenGL 3.0/gl-3.0-texture-integer
sometimes fails on my AMD 6950, using mesa master. It fails with
errors like this:
texture-integer: failure with format GL_RGB8I_EXT:
texture color = 100, 9, 71, 0
expected color = 0.25, 0.5, 0.75, 0
result color = 0.2…