
Scale of cube affecting depth buffer value - weird!

Posted: Fri Apr 29, 2016 2:05 pm
by robmar
In my scene I have a mesh of a fish, and below it a cube set to be the flat bottom of the tank, with the y-scale set to 0.1 so it's flat.

I've noticed that as I scale the cube up on the x and y axes to cover more area, its depth buffer value falls, as if it were getting farther away!

Its top surface stays in exactly the same position; only the sides move outwards.

I'm viewing the depth buffer in a texture mapped to a quad - just for testing.

Why on earth would scaling a flat object on the X and Y axes change the value of its pixels in the depth buffer?

Re: Scale of cube affecting depth buffer value - weird!

Posted: Fri Apr 29, 2016 3:56 pm
by robmar
The scaled mesh is in the same position in the back buffer and in the depth buffer, but when I render the object's depth to the texture, the scaled object's pixels, although in the right position, have lower values, so they are darker than they should be for their z-position.

This problem only occurs if the objects are scaled. The more a mesh is scaled, the darker it gets when rendered for depth, without moving position though.

I must be missing something in the projection or whatnot that allows for scaling... but what?

Re: Scale of cube affecting depth buffer value - weird!

Posted: Fri Apr 29, 2016 4:34 pm
by robmar
All looks okay, standard proj * view * world.

In the vertex shader for depth I used Out.Depth = (Out.Position.z * Out.Position.w) / camFarZ;

I render that in the pixel shader into r, g and b for a grayscale image.

Has anyone any idea what might cause scaled objects to produce a reduced Out.Depth value, although they are still rendered in the same place?

I know that non-uniform scaling causes problems with normals, requiring an inverse-transpose matrix, but why would scaling affect the depth calculation?

Could Out.Position.w be affected, such as being reduced for scaled meshes?

Re: Scale of cube affecting depth buffer value - weird!

Posted: Fri Apr 29, 2016 11:25 pm
by mongoose7
Maybe divide by W? I don't know if it will help, but multiplying by W is certainly wrong.

Re: Scale of cube affecting depth buffer value - weird!

Posted: Sat Apr 30, 2016 7:29 pm
by robmar
Thanks, I totally missed that!! This is code from another example, where the divide is done as you mention.

Code:

VS_SHADOW_OUTPUT RenderShadowMapVS(float4 vPos: POSITION)
{
    VS_SHADOW_OUTPUT Out;
    Out.Position = GetPositionFromLight(vPos); 
    // Depth is Z/W.  This is returned by the pixel shader.
    // Subtracting from 1 gives us more precision in floating point.
    Out.Depth.x = 1-(Out.Position.z/Out.Position.w);    
    return Out;
}

Re: Scale of cube affecting depth buffer value - weird!

Posted: Sun May 01, 2016 11:57 am
by robmar
Thanks Mongoose7! Sometimes I go code-blind, you know, staring at the same thing so much I miss the obvious! :D

Using Out.Depth = (Out.Position.z * Out.Position.w) / camFarZ; where camFarZ was the depth camera's far depth, caused incorrect depth to be assigned to some objects, but it results in a very nice grayscale range (albeit wrong in places!).

If I divide Out.Position.z by Out.Position.w, I get such a small range of depth change across the visible objects that the human eye cannot see it.

If I use just Out.Position.z, then I can see the depth variations in the grayscale depth map.

But what are the consequences of not dividing by .w? Is it necessary for correct depth values in the texture?

As the range of change is so small over the visible objects' area, I need to be able to adjust the "contrast" of the depth map to make the details visible to the human eye.

I seem to be getting the depth data of interest in the 1.0 to 0.9 range. I thought I could adjust the camera near and far to spread that across 1.0 to 0.0, but it doesn't seem to work.

Re: Scale of cube affecting depth buffer value - weird!

Posted: Sun May 01, 2016 1:10 pm
by mongoose7
I don't know why you want the depth values. If it is for shadow mapping then they must be correct.

The projection matrix maps Z values to either [-1, 1] or [0, 1], I can't remember. But you can map the Z/W values to [0, 1] with (Z/W - Near)/(Far - Near).

Re: Scale of cube affecting depth buffer value - weird!

Posted: Sun May 01, 2016 5:19 pm
by robmar
For depth calculation, used for fish collision avoidance.

I've been trying to find an HLSL reference that gives a good overview of these values, the .z range from -1 to +1, and how .w is calculated, but so far I haven't found much.

depth = (z/w - near)/(far - near) sounds interesting. Are the near and far values the actual values used, like 1.0 for near and 2000 for far, or must they be converted to a 0 to 1 scale somehow?

So, if z/w was 0.5, near say 0 and far 1000, that's 0.5/1000, which will make the grayscale map very dark when viewed.

With a far of, say, 10,000, objects at 1,500 from the camera need a depth of 0.85.

I'd thought that the near and far depths set the range 0 to 1, but that doesn't seem to happen. *Edit: it does happen! Dividing by the far depth works, case closed!

Thanks for the help!

Re: Scale of cube affecting depth buffer value - weird!

Posted: Mon May 02, 2016 8:01 am
by robmar
Forgot to mention that dividing by .w resulted in a very dark grayscale range, so:

Out.Depth.x = 1.0 - (Out.Position.z / farDepth); // farDepth is the camera's far-plane distance

This produces a grayscale from white (at the camera) to black (at the far point).

So what value is stored in .w, and how should it be used if not to correct the perspective divide?

Re: Scale of cube affecting depth buffer value - weird!

Posted: Mon May 02, 2016 1:23 pm
by mongoose7
In the vertex shader, W starts at 1 and is not affected by the object and view transformations. When the projection matrix is applied, Z --> W, so W is generally not 1. The perspective divide is performed last; apparently modern GPUs carry W through to the end, and clipping is performed in clip space, before the perspective divide.

Re: Scale of cube affecting depth buffer value - weird!

Posted: Mon May 02, 2016 4:20 pm
by robmar
Well, W is definitely not 1 in the vertex shader on my Windows 10 PCs, one with NVIDIA and the other with AMD drivers, that's for sure!

"W is generally not 1." - generally, is that like sometimes? :)

Did you find any exact references on this topic for modern GPUs under HLSL 3.0 ?

Re: Scale of cube affecting depth buffer value - weird!

Posted: Tue May 03, 2016 12:43 am
by mongoose7
Well, I don't know where W is set, but it is not changed by the M and V transforms. P changes W to the former Z value, so it is *generally* not 1, 'generally' meaning mostly. I have no references.

Re: Scale of cube affecting depth buffer value - weird!

Posted: Tue May 03, 2016 7:29 am
by robmar
Strange that this information can't easily be found on the web; it makes programming much more trial and error than it should be, but that's computers for ya!