Setting gl_FragDepth causes lag?

I designed a very basic set of depth shaders for rendering depth into my shadow-map depth textures. This is the depth fragment shader I used:

#version 330 core

in vec3 FragPos;
uniform vec3 lightPos;
uniform float farPlane;

void main()
{
    float depth = length(FragPos - lightPos);
    depth /= farPlane;
    gl_FragDepth = depth;
}

This code isn't much: it simply calculates the distance between the fragment and the light source, normalizes the value by dividing it by the light's far-plane distance, and writes the result to gl_FragDepth.
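For context, a matching vertex shader might look like the following. This is a minimal sketch, not code from the question: the attribute location and the uniform names `model` and `shadowMatrix` are assumptions, and it only covers rendering one cube-map face at a time.

```glsl
#version 330 core
layout (location = 0) in vec3 aPos;

uniform mat4 model;        // assumed name: object-to-world transform
uniform mat4 shadowMatrix; // assumed name: light view-projection for the current cube-map face

out vec3 FragPos;

void main()
{
    // World-space position, matching the FragPos input of the fragment shader above
    vec4 worldPos = model * vec4(aPos, 1.0);
    FragPos = worldPos.xyz;
    gl_Position = shadowMatrix * worldPos;
}
```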

The code works without any errors. I was testing the renderer with just two objects in the scene and one point light source. Later, I loaded a big interior scene, and the FPS dropped from somewhere around 60-70 down to 30-40.

I did some GPU profiling with Nvidia Nsight and discovered that the glDrawElements call for my shadow pass was taking 4 ms. I narrowed the problem down to the final line of the fragment shader above, gl_FragDepth = depth.

What I found out was that if I removed the statement gl_FragDepth = depth, the FPS jumped back into the 70s, with the draw call taking just 1 ms. Note that everything else was untouched.

How could setting the gl_FragDepth value cause such low performance?

asked Oct 29 '25 by skeelz

1 Answer

Writing to gl_FragDepth disables early fragment tests. Normally the GPU can run the depth test before the fragment shader and skip shading occluded fragments entirely; once the shader computes its own depth, that optimization is off the table:

Therefore, an implementation is free to apply early fragment tests if the Fragment Shader being used does not do anything that would impact the results of those tests. So if a fragment shader writes to gl_FragDepth, thus changing the fragment's depth value, then early testing cannot take place, since the test must use the new computed value.
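If you do need a custom depth value, GLSL 4.2 (or the ARB_conservative_depth extension) lets you redeclare gl_FragDepth with a layout qualifier that promises the driver you will only move depth in one direction, which lets it keep part of the early depth test enabled. The sketch below shows the mechanism only; whether the `depth_greater` promise actually holds for this distance-based depth depends on your projection, so treat it as an assumption to verify, not a drop-in fix.

```glsl
#version 420 core
// Conservative depth (GLSL 4.2 / ARB_conservative_depth): the qualifier
// promises the written depth is >= the fixed-function gl_FragCoord.z,
// so the implementation may still reject fragments early.
// Caution: this promise must actually hold for your projection and
// depth mapping, otherwise rendering is undefined.
layout (depth_greater) out float gl_FragDepth;

in vec3 FragPos;
uniform vec3 lightPos;
uniform float farPlane;

void main()
{
    // Same distance-based depth as in the question's shader
    gl_FragDepth = length(FragPos - lightPos) / farPlane;
}
```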

answered Oct 31 '25 by genpfault