Loading Textures In WebGL


This post covers loading a texture, and sampling it inside a shader using WebGL.

This is the function we are going to use to load the texture data.

function loadTexture(gl, imageURL) {

  var texture = {};

  /* check for extensions */
  var glTextureAnisoExt = gl.getExtension("EXT_texture_filter_anisotropic");
 
  /* create a texture object */
  texture.textureObject = gl.createTexture();
 
  /* the texture is going to be a flat colour (for now) */
  var pixel = new Uint8Array([255, 255, 255, 255]);
 
  /* setup state */
  gl.bindTexture(gl.TEXTURE_2D, texture.textureObject);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.REPEAT);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.REPEAT);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR_MIPMAP_LINEAR);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.LINEAR);
  if (glTextureAnisoExt != null) {
    gl.texParameterf(gl.TEXTURE_2D, glTextureAnisoExt.TEXTURE_MAX_ANISOTROPY_EXT, 2);
  }     
  gl.bindTexture(gl.TEXTURE_2D, null);
  
  /* fill with the flat colour for now - ensures we can use it before its loaded */
  gl.bindTexture(gl.TEXTURE_2D, texture.textureObject);
  gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, 1, 1, 0, gl.RGBA, gl.UNSIGNED_BYTE, pixel);
  gl.bindTexture(gl.TEXTURE_2D, null);
  
  /* hook a callback to process the loaded image */
  var image = new Image();
  image.onload = function(textureObject, image) {
    return function() {
      gl.bindTexture(gl.TEXTURE_2D, textureObject);
      gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, image);
      gl.generateMipmap(gl.TEXTURE_2D);
      gl.bindTexture(gl.TEXTURE_2D, null);
    }
  } (texture.textureObject, image);
   
  /* trigger the load */
  image.src = imageURL;

  return texture;
}

There are a few things that need explaining here.

First is that we actually populate the texture with data twice. We first set the texture up as a 1×1 white texture, and only later load the real image data into it. We do this because the image load is asynchronous: the real data will only be filled in some time after the function returns, whenever the load completes, and we may want to bind the texture while we are waiting so that we don’t delay the initial render of the canvas. Sampling a texture that has never had any data uploaded isn’t valid (WebGL treats it as incomplete and it reads back as black), but filling it with a placeholder this way makes it completely safe.
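
If you also want to know when the real image has arrived (to trigger a redraw, say), one small extension to the function above is a loaded flag. This is my own addition, not part of the original code:

/* inside loadTexture, before hooking the onload callback */
texture.loaded = false;

/* ...and at the very end of the onload callback, after the final bindTexture */
texture.loaded = true;   /* the render loop can watch this to redraw or report progress */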

You might also notice that we generate mipmaps for the texture, via the call to gl.generateMipmap. Mipmaps improve the quality of the texture when it’s drawn at a smaller size and are recommended for most textures. Generating them does require the source texture dimensions to be powers of two though, so once this is enabled you need to ensure the content is built to be mipmap capable. I’d recommend it though…
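
If you can’t guarantee power-of-two images, one option is to check the dimensions in the onload callback and fall back to plain filtering when they don’t qualify. A sketch, not part of the function above:

function isPowerOfTwo(value) {
  return (value & (value - 1)) === 0;
}

/* inside the onload callback, in place of the unconditional generateMipmap */
if (isPowerOfTwo(image.width) && isPowerOfTwo(image.height)) {
  gl.generateMipmap(gl.TEXTURE_2D);
} else {
  /* non-power-of-two textures can't be mipmapped or repeated, so fall back */
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);
}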

Finally, we also use an extension to allow for anisotropic filtering, which further improves the quality of the texture filtering. Google ‘anisotropic filtering’ if you want to see what it’s for.
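
The code above hard-codes an anisotropy of 2. If you’d rather use the best the hardware offers, you can query the maximum supported value from the extension instead; a sketch using the same glTextureAnisoExt variable:

if (glTextureAnisoExt != null) {
  /* ask the driver for the highest anisotropy it supports (commonly 16) */
  var maxAniso = gl.getParameter(glTextureAnisoExt.MAX_TEXTURE_MAX_ANISOTROPY_EXT);
  gl.texParameterf(gl.TEXTURE_2D, glTextureAnisoExt.TEXTURE_MAX_ANISOTROPY_EXT, maxAniso);
}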

Once that function is established we can load a texture like this.

texture = loadTexture(gl, 'goat_on_grass_194991_cropped.jpg');

We need a bit more code to actually make use of the texture though.

First the fragment shader needs to be modified to sample a texture. I’ve added a sampler called colourSampler here and a varying called v_texCoord where we will receive UV coordinates from the vertex shader.

precision mediump float;
varying vec4 v_texCoord;
uniform sampler2D colourSampler;
void main(void) {
	gl_FragColor = texture2D(colourSampler, v_texCoord.xy);
}

Then the vertex shader provides UV coordinates for the texture, in this case generated automatically by scaling and biasing the vertex position. Note that OpenGL (and therefore WebGL) puts the texture coordinate origin at the bottom-left, so V runs the opposite way to how images are stored; that’s what the negative Y scale below is compensating for.

attribute vec3 in_position;
varying vec4 v_texCoord;
uniform vec4 u_texCoordScale;
void main(void) {
	v_texCoord.xy = in_position.xy * u_texCoordScale.xy * vec2(0.5, -0.5) + 0.5;
	gl_Position = vec4(in_position, 1.0);
}
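
As an aside, rather than flipping V in the shader you can ask WebGL to flip the image as it’s uploaded, by setting a pixel-store flag before the texImage2D call in the onload handler. A sketch, not part of the code above:

/* flip the image vertically during upload so shaders don't need to flip V */
gl.pixelStorei(gl.UNPACK_FLIP_Y_WEBGL, true);
gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, image);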

Then we add this just after creating the shader to look up the locations of the colour sampler and texture coordinate scale uniforms. We store them for later use.

shader.colourSampler_loc = gl.getUniformLocation(shader.program, "colourSampler");
shader.texCoordScale_loc = gl.getUniformLocation(shader.program, "u_texCoordScale");

Finally we add this to the draw call code to bind the texture. WebGL is a bit odd in that there is an extra level of indirection for sampler bindings, so we have to bind the texture object to slot 0 and then bind slot 0 to the uniform. It would be nicer to just bind the object to the uniform directly, but that’s not how WebGL likes to do things.

gl.activeTexture(gl.TEXTURE0);
gl.bindTexture(gl.TEXTURE_2D, colourTexture.textureObject);
gl.uniform1i(shader.colourSampler_loc, 0);

For fun I’ve also added a time value to the code that’s updated while animating and used to control the scale for the draw call, like so.

gl.uniform4f(
  shader.texCoordScale_loc,
  0.9 + 0.1 * Math.sin(time),
  0.9 + 0.1 * Math.sin(time),
  0.0, 0.0);
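
For completeness, here’s a minimal sketch of the kind of animation loop that could drive that time value. The draw() function is a stand-in for whatever issues the draw call above; it isn’t part of the code in this post:

var time = 0;

function frame(timestampMs) {
  time = timestampMs * 0.001;    /* convert the browser timestamp to seconds */
  draw();                        /* hypothetical function that sets uniforms and draws */
  requestAnimationFrame(frame);
}

requestAnimationFrame(frame);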
