Depth-writing shaders should work. Although it is very hard to know
whether they work or not, given that the depth buffer can't be read back.
It should still be possible to verify them indirectly, by drawing with
depth testing enabled against known depth values.
If that doesn't work, then it sounds like a bug.
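For reference, a depth write of that era would look roughly like the
fragment shader below (a minimal sketch, not from this thread; the uniform
name u_depth is an assumption). Drawing a full-screen quad with this
shader, then drawing a differently colored probe quad at a known depth
with glDepthFunc(GL_LESS), shows whether the written depth took effect:

```glsl
// Hypothetical fragment shader that writes a constant depth value.
// u_depth is an assumed uniform name (0.0 .. 1.0, window-space depth).
uniform float u_depth;

void main()
{
    gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0); // marker color
    gl_FragDepth = u_depth;                  // the depth write under test
}
```

If the probe quad is occluded (or not) exactly where u_depth predicts,
the depth write reached the depth buffer.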
Jose
On Wed, 2011-01-26 at 02:25 -0800, stef..
Hi Jose,
thanks for the info, I got it working now:
- Instead of reading depth I just render 256 greylevel quads and
readback the color buffer (performance is not an issue here,
I just need an image that tells me that the depth buffer is ok)
- The GLSL depth-write shader doesn't work when t
The D3D9 API restricts blits to/from depth-stencil buffers as well. The
API is pretty much designed to ensure that depth-stencil buffers stay in
VRAM (probably in a hardware-specific layout) and never get out of
there.
Several vendors allow binding the depth buffer as a texture, but they
implicitly d
Hi Jose,
thanks for the quick reply: I'm using Win7 for both guest (32-bit)
and host (64-bit).
I do the depth buffer reads only for debugging / regression testing.
Would a copy depth-to-texture plus a shader blit into the color channels
work? Reading the color back buffer via glReadPixels is OK.
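If copying depth into a GL_DEPTH_COMPONENT texture is supported, the blit
stage could be as simple as the fragment shader below (a sketch;
u_depth_tex and v_texcoord are assumed names, and the texture's
GL_TEXTURE_COMPARE_MODE must be NONE so sampling returns the depth value
rather than a shadow-comparison result):

```glsl
// Hypothetical blit shader: replicate sampled depth into the color channels.
uniform sampler2D u_depth_tex;  // depth texture, compare mode NONE
varying vec2 v_texcoord;

void main()
{
    float d = texture2D(u_depth_tex, v_texcoord).r; // depth arrives in .r
    gl_FragColor = vec4(d, d, d, 1.0);              // grey = depth
}
```

The resulting greyscale image can then be fetched with an ordinary
glReadPixels on the color buffer.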
Re
On Tue, 2011-01-25 at 01:13 -0800, stef...@starnberg-mail.de wrote:
> Hi,
>
> I'm trying to get one of our testsuites running in VMware
> (VMware, Inc. Gallium 0.3 on SVGA3D; build: RELEASE; OGL 2.1 Mesa
> 7.7.1-DEVEL).
> With the GDI backend everything works fine (tested in 7.7, 7.8, 7.10).
>
>
Hi,
I'm trying to get one of our testsuites running in VMware
(VMware, Inc. Gallium 0.3 on SVGA3D; build: RELEASE; OGL 2.1 Mesa
7.7.1-DEVEL).
With the GDI backend everything works fine (tested in 7.7, 7.8, 7.10).
I have a GLSL shader that writes depth, like this:
void main()
{
    vec4 v = texture2D(tex