Deferred Lighting in WebGL


[Interactive demo: a WebGL canvas with Light Rotation X and Light Rotation Y sliders; a static image is shown where WebGL is unavailable.]

This post describes an attempt to build a simple form of deferred lighting in WebGL. To keep things manageable I'm going to stick to a single directional light and focus instead on the render-target setup and how we feed data to our lighting shader.

Populating a GBuffer

First we need a GBuffer. This is a buffer that holds the parameters needed by our lighting shaders. In its simplest form it is a set of render targets that together hold an unlit surface colour (albedo), a surface normal, and a depth value, and for now this gives us all we need. There are other properties that are useful, such as roughness, metalness, and emissive lighting, but I won't focus on them here.

We need a representation of depth within our GBuffer so we can reconstruct positions in our lighting shader. WebGL doesn't support depth render targets out of the box. One workaround is to set up a standard 32 bit RGBA target and use a shader to pack the depth value into the RGB channels as a 24 bit value. The following two functions would let us achieve that.

vec3 packFloat8bitRGB(float val) {
  // Pack a 0-1 depth value into 24 bits spread across the RGB channels.
  vec3 pack = vec3(1.0, 255.0, 65025.0) * val;
  pack = fract(pack);
  // Remove the part of each channel that is carried by the next channel up.
  pack -= vec3(pack.yz / 255.0, 0.0);
  return pack;
}
float unpackFloat8bitRGB(vec3 pack) {
  // Reassemble the three channels back into a single 0-1 float.
  return dot(pack, vec3(1.0, 1.0 / 255.0, 1.0 / 65025.0));
}
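In a depth pre-pass fragment shader the pack function could then be used like this (a minimal sketch; gl_FragCoord.z gives the current fragment's 0-1 window-space depth):

void main() {
  // Write the packed 24 bit depth into the RGB channels of the target.
  gl_FragColor = vec4(packFloat8bitRGB(gl_FragCoord.z), 1.0);
}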

Rather than do that though, I'm going to use a WebGL extension. WebGL supports an extension called WEBGL_depth_texture which provides access to a depth render-target that can be attached to FBOs and used like a standard depth buffer. Not every device supports it, but a lot do. We can access that extension like so...

var glDepthTextureExt = gl.getExtension("WEBGL_depth_texture");
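If the extension isn't available, getExtension returns null, so it's worth checking for that and falling back to the packing approach above:

if (glDepthTextureExt === null) {
  // No native depth textures; pack depth into an RGBA target instead.
}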

These two snippets show how we build a depth texture and then attach it to an FBO.

  texture = gl.createTexture();
  gl.bindTexture(gl.TEXTURE_2D, texture);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.NEAREST);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.NEAREST);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);
  gl.texImage2D(gl.TEXTURE_2D, 0, gl.DEPTH_STENCIL, width, height, 0,
    gl.DEPTH_STENCIL, glDepthTextureExt.UNSIGNED_INT_24_8_WEBGL, null);

  gl.framebufferTexture2D(gl.FRAMEBUFFER, gl.DEPTH_STENCIL_ATTACHMENT, gl.TEXTURE_2D, texture, 0);
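For completeness, here is a sketch of how the whole FBO might be assembled; colourTexture is assumed to be a texture created in the same way as the depth texture above, just with an RGBA format:

  var fbo = gl.createFramebuffer();
  gl.bindFramebuffer(gl.FRAMEBUFFER, fbo);
  // Colour and depth-stencil attachments for one GBuffer pass.
  gl.framebufferTexture2D(gl.FRAMEBUFFER, gl.COLOR_ATTACHMENT0, gl.TEXTURE_2D, colourTexture, 0);
  gl.framebufferTexture2D(gl.FRAMEBUFFER, gl.DEPTH_STENCIL_ATTACHMENT, gl.TEXTURE_2D, texture, 0);
  if (gl.checkFramebufferStatus(gl.FRAMEBUFFER) !== gl.FRAMEBUFFER_COMPLETE) {
    // This attachment combination isn't supported on this device.
  }
  gl.bindFramebuffer(gl.FRAMEBUFFER, null);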

With that in place we can look to populate our GBuffer. WebGL doesn't allow us to write to multiple render-targets at the same time (MRT is only available via the WEBGL_draw_buffers extension), so we are going to end up with two passes: one populating an RGBA target with colour data and one with normal data, each with an attached depth target. Colour values can simply be read from a texture and copied into the colour target; for now we don't need any more than that.

For the normals we need something a little more complex. First, we are going to store the normals in eye space (relative to the camera orientation), as this simplifies some of the work we'll do later. We are also going to sample a normal map and combine its normals with the interpolated vertex normals to produce the final normal. For this I'm going to use the following shader code, where a v_ prefix indicates a varying passed from the vertex shader and u_ indicates a uniform (shader constant).

  vec3 N = v_eyeNormal.xyz;
  // Fetch the tangent space normal and expand it from 0-1 to -1 to 1.
  vec4 normalMap = texture2D(normalSampler, v_tc0.xy);
  vec3 Npixel = normalMap.xyz * 2.0 - 1.0;
  // Build the binormal and transform the sampled normal into eye space.
  vec3 biNormal = cross(v_eyeNormal, v_eyeTangent);
  Npixel = (v_eyeTangent * Npixel.x + biNormal * Npixel.y + v_eyeNormal * Npixel.z);
  // Blend with the vertex normal, scaled by the material's normal map strength.
  N += (Npixel * u_normalMapScale);
  N = normalize(N);

Note that we also need to apply a scale and offset to the normal before storing it in the target, in order to pack the -1 to 1 range into the 0 to 1 range of our texture.
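In the fragment shader that packing is a one-liner (the alpha channel is spare here):

  gl_FragColor = vec4(N * 0.5 + 0.5, 1.0);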

The Lighting Shader

For our deferred directional light all we need to do is render a quad across the viewport, and then for each pixel pick up our GBuffer parameters and calculate the lighting contribution. For a directional light the only additional data we need are some properties of the light (direction and colour) and the inverse of the projection matrix, so that we can reconstruct a position from a depth value. The light direction needs to be transformed into eye space before we upload it to the shader, to match the GBuffer parameters, like this...

  // viewMtx3x3 holds just the rotation part of the view matrix, so only
  // the camera's orientation (not its translation) is applied.
  mat4.multiplyVec3(viewMtx3x3, worldSpaceLightDirection, eyeSpaceLightDirection);
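One way to build that rotation-only matrix is to copy the upper 3x3 of the full view matrix into a fresh mat4, leaving the translation column at zero (a sketch; viewMtx is an assumed name for the full view matrix, and glMatrix stores matrices column-major):

  var viewMtx3x3 = mat4.identity(mat4.create());
  viewMtx3x3[0] = viewMtx[0]; viewMtx3x3[1] = viewMtx[1]; viewMtx3x3[2]  = viewMtx[2];
  viewMtx3x3[4] = viewMtx[4]; viewMtx3x3[5] = viewMtx[5]; viewMtx3x3[6]  = viewMtx[6];
  viewMtx3x3[8] = viewMtx[8]; viewMtx3x3[9] = viewMtx[9]; viewMtx3x3[10] = viewMtx[10];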

The shader needs to be able to turn a depth value into an eye space position. To do this we reconstruct a clip space position and back-project it into eye space using the inverse projection matrix we just uploaded. That code looks like this...

  // Rebuild the clip space position from the stored depth (mapping 0-1 to -1 to 1),
  // then back-project through the inverse projection and divide by w.
  float clipDepth = -1.0 + texture2D(depthSampler, texCoord.xy).r * 2.0;
  vec4 eyePos = u_projInvMtx * vec4(clipPosXY, clipDepth, 1.0);
  eyePos.xyz = eyePos.xyz / eyePos.w;
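Here clipPosXY is the fragment's clip space xy. For a full-screen quad it can be derived from the same texture coordinate with the same 0-1 to -1 to 1 mapping:

  vec2 clipPosXY = texCoord.xy * 2.0 - 1.0;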

Even though I haven't really got any meaningful data to throw at it, I'm going to try to implement a PBR-compatible lighting shader. I won't go into the detail of how this works, but there is a lot of very detailed documentation available online covering the various parts of this implementation, for example here and here. My own code looks like this...

    // [10e-5 to avoid /0 for Fs where NdotV == 0]
  float NdotV = max(dot(N, V), 10e-5);
  float NdotL = max(dot(N, L), 0.0);
  vec3 BRDF = vec3(0.0, 0.0, 0.0);
  if (NdotL > 0.0)
  {
    vec3 F = F_schlick(VdotH, F0);
      // [0.005 to avoid sub pixel highlights where M == 0]
    float D = D_trowbridgeReitzGGX(NdotH, max(Msquared, 0.005));
    float G = G_smithSchlick(NdotL, NdotV, M);
    vec3 Fs = (F * D * G) / (4.0 * NdotL * NdotV);
    BRDF = (Kd + Fs) * lightColour * NdotL;
  }

In that code snippet L is the light direction, V is a vector pointing towards the viewer, N is the surface normal, H (behind the NdotH and VdotH terms) is the half vector between L and V, and M is the material roughness, with Msquared its square.
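For reference, those vectors and the remaining dot products might be set up like this (a sketch; u_eyeSpaceLightDir is an assumed uniform name holding the direction we uploaded earlier, and eyePos comes from the depth reconstruction above):

  // Flip the uniform so L points from the surface towards the light.
  vec3 L = normalize(-u_eyeSpaceLightDir);
  // In eye space the camera sits at the origin, so V is just the negated position.
  vec3 V = normalize(-eyePos.xyz);
  // Half vector between the light and view directions.
  vec3 H = normalize(L + V);
  float NdotH = max(dot(N, H), 0.0);
  float VdotH = max(dot(V, H), 0.0);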

For reference, the implementations of the three parts of the PBR lighting (the Fresnel, distribution, and geometry terms) look like this:

vec3 F_schlick(float VdotH, vec3 F0) {
  return F0 + (1.0 - F0) * pow(1.0 - VdotH, 5.0);
}
float D_trowbridgeReitzGGX(float NdotH, float Msquared) {
  float alpha = Msquared * Msquared;
  float t = ((NdotH * NdotH) * (alpha - 1.0) + 1.0);
  return alpha / (3.14159265359 * t * t);
}
float G_smithSchlick(float NdotL, float NdotV, float M) {
  float k = (0.8 + 0.5 * M);
  k *= k;
  k *= 0.5;
  float gv = NdotV / (NdotV * (1.0 - k) + k);
  float gl = NdotL / (NdotL * (1.0 - k) + k);
  return gv * gl;
}
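One loose end is Kd, the diffuse term added to the specular term in the BRDF above. A common Lambertian choice would be something like this (a sketch; albedo is assumed to be the colour read from the GBuffer):

  vec3 Kd = albedo / 3.14159265359; // Lambertian diffuse, normalised by pi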

One final thing to note is that the material textures used here were sourced from freepbr.com.
