DEV Community

Lea Rosema (she/her)

Realtime video processing with WebGL

This is the making-of of a WebGL demo I recently made on CodePen. It grabs webcam data (or a fallback image from placekitten when webcam access is unavailable) and turns it into ASCII art in real time.

For some extra RETROness, I'm using an 8×8 pixel raster font that was common on DOS PCs in the '90s (you may still see this kind of font in some BIOSes).

To map the image content to a specific character, I chose the best match using a luminance map: for each character, I counted the lit pixels in each 4×4 square of its 8×8 glyph. Scroll down inside the pen to see the luminance map:
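In JavaScript, computing such a luminance map for one glyph could look roughly like this (a sketch with made-up names, assuming each 8×8 glyph is stored as 8 bytes, one bit per pixel):

```javascript
// Rough sketch of the luminance-map idea (names are made up):
// an 8x8 glyph is given as 8 bytes, one bit per pixel.
// For each 4x4 quadrant, count the lit pixels.
function glyphLuminance(rows) {
  const counts = [0, 0, 0, 0]; // top-left, top-right, bottom-left, bottom-right
  for (let y = 0; y < 8; y++) {
    for (let x = 0; x < 8; x++) {
      const lit = (rows[y] >> (7 - x)) & 1;
      const quadrant = (y >= 4 ? 2 : 0) + (x >= 4 ? 1 : 0);
      counts[quadrant] += lit;
    }
  }
  return counts; // compare against the image's 4x4 cells to pick the best character
}
```

A glyph whose top half is fully lit would yield `[16, 16, 0, 0]` with this scheme.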

I've also created an editor for these kinds of fonts: https://terabaud.github.io/pixelfedit/

Some WebGL Basics

I'm going to introduce some WebGL basics, but I'm only scratching the surface here. For a more in-depth guide, I recommend heading to https://webglfundamentals.org

A common misunderstanding about WebGL is that it is a 3D engine in the browser. Although WebGL is the technology that enables GPU-accelerated 3D content in the browser, WebGL itself is not a 3D engine. On top of WebGL, there are graphics libraries for GPU-accelerated 2D or 3D content (like Pixi for 2D or ThreeJS for 3D).

WebGL itself is quite low-level: it's an API for drawing points, lines, and triangles onto an HTML <canvas> element in a GPU-accelerated way.

The WebGL rendering context can be retrieved via getContext (similar to the 2D canvas API):

const canvas = document.querySelector('canvas');
const gl = canvas.getContext('webgl') || canvas.getContext('experimental-webgl');

A WebGL program consists of shaders. Shaders are pieces of code that run on the GPU. They are not written in JavaScript but in their own language, GLSL (GL Shading Language).

Quick GLSL facts

  • C-like; shader programs have a void main()
  • variable declarations also work like in C
  • Primitive data types: int, float, bool (there is no double in GLSL ES)
  • Vectors: vec2, vec3, vec4, ...
  • Matrices: mat2, mat3, mat4, ...
  • Type for accessing texture data: sampler2D
  • built-in vector/matrix arithmetic
  • many built-in functions, e.g. to get the length of a vector (length(v))
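A few of these features in one illustrative GLSL snippet:

```glsl
vec2 a = vec2(1.0, 2.0);
vec2 b = a * 2.0;          // component-wise arithmetic: vec2(2.0, 4.0)
float d = length(b - a);   // built-in length(): sqrt(1.0 + 4.0)
vec4 v = vec4(a, b);       // constructors can combine vectors
float red = v.x;           // swizzle access: .x/.y/.z/.w or .r/.g/.b/.a
```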

Types of shaders

There are two types of shaders in a WebGL program.

  • The vertex shader computes positions.
  • The fragment shader handles rasterization.

If your WebGL program wants to draw a triangle on screen, it passes the triangle's 3 vertex coordinates to the vertex shader. The task of the fragment shader is then to fill that triangle with pixels. This per-pixel processing is super fast, because it runs in parallel for each pixel on the GPU.

In my demo, I'm using 4 vertex coordinates to span a rectangle that fills the whole screen. All the work is done in the fragment shader.
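The four clip-space coordinates for such a fullscreen rectangle look like this (a sketch; clip space runs from -1 to 1 on both axes):

```javascript
// Four 2D clip-space coordinates forming a fullscreen triangle strip:
// (-1,-1) bottom-left, (1,-1) bottom-right, (-1,1) top-left, (1,1) top-right
const positions = new Float32Array([
  -1, -1,
   1, -1,
  -1,  1,
   1,  1,
]);
```

Drawn as a TRIANGLE_STRIP, these four vertices produce two triangles covering the whole canvas.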

Vertex shader

So, the vertex shader is there for the vertices. It gets a bunch of data from a buffer provided in the JavaScript code and computes a position in the canvas from this data.

The following piece of code pulls data from a buffer into an attribute variable and passes it to the gl_Position variable:

attribute vec3 position;

void main() {
  gl_Position = vec4(position, 1.0);
}

Fragment shader

precision highp float;

void main() {
  vec2 p = gl_FragCoord.xy;
  gl_FragColor = vec4(1.0, .5 + .5 * sin(p.y), .5 + .5 * sin(p.x), 1.0);
}

The fragment shader is run for each fragment (pixel) in parallel. In the example above, the fragment shader reads the current pixel coordinate from the gl_FragCoord variable and sets an output color via gl_FragColor, with some sin() magic.

gl_FragColor is a vec4 containing (red, green, blue, alpha), with values from 0 to 1 each.
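So, for example, an 8-bit color (0..255 per channel) has to be divided by 255 before it can go into gl_FragColor. A quick sketch of that conversion (the helper name is made up):

```javascript
// Convert 8-bit RGBA channels (0..255) to the 0..1 range gl_FragColor expects
function toGlColor([r, g, b, a]) {
  return [r / 255, g / 255, b / 255, a / 255];
}
```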

Types of GLSL variables

  • attribute: the vertex shader pulls a value from a buffer and stores it in an attribute variable
  • uniform: uniform variables are set from the JS side. For example, you can use uniforms to pass something like your current mouse/tap position to your shader. You can also use uniform variables to access texture data uploaded from JavaScript.
  • varying: pass values from the vertex to the fragment shader and interpolate values.
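For example, a varying can carry a texture coordinate from the vertex shader to the fragment shader, interpolated across the triangle (illustrative snippet):

```glsl
// vertex shader
attribute vec2 position;
varying vec2 vTexCoord;

void main() {
  vTexCoord = position * 0.5 + 0.5; // map clip space (-1..1) to (0..1)
  gl_Position = vec4(position, 0.0, 1.0);
}

// fragment shader
precision highp float;
varying vec2 vTexCoord; // interpolated per fragment
```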

Uploading image data to your WebGLRenderingContext

You can upload image data into textures in your WebGLRenderingContext and access it inside your shader. (See also: WebGL Fundamentals: Image Processing)

You can use the texImage2D method of the WebGLRenderingContext to upload image data into a texture.

// gl is the WebGLRenderingContext
const textureIndex = 0; // which texture unit to use
const texture = gl.createTexture();
gl.activeTexture(gl.TEXTURE0 + textureIndex);
gl.bindTexture(gl.TEXTURE_2D, texture);
gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, img);

// more info about these parameters on webglfundamentals.org
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.LINEAR);

The image data you pass to texImage2D can be an img element, a video element, ImageData, and more.

Because the image data of the video changes constantly, you have to update the texture inside the requestAnimationFrame animation loop. This is done via texSubImage2D:

gl.texSubImage2D(gl.TEXTURE_2D, 0, 0, 0, gl.RGBA, gl.UNSIGNED_BYTE, video);

Reading texture data inside your shader code

You can access the pixel data of the texture via the texture2D GLSL function.

Texture coordinates go from (0, 0) to (1, 1). The image comes out upside down, because the image data is uploaded top row first while texture coordinates start at the bottom left. I'm also mirroring the image horizontally (like a selfie cam does), so the shader flips both axes:

uniform sampler2D texture0;
uniform vec2 resolution;

void main() {
  vec2 coord = 1.0 - gl_FragCoord.xy / resolution;
  gl_FragColor = texture2D(texture0, coord);
}

Accessing the Webcam

To get image data from the webcam, we can use a video tag and the getUserMedia API:

function accessWebcam(video) {
  return new Promise((resolve, reject) => {
    const mediaConstraints = { audio: false, video: { 
        width: 1280, 
        height: 720,
        brightness: {ideal: 2} 
      }
    };

    navigator.mediaDevices.getUserMedia(
      mediaConstraints).then(mediaStream => {
        video.srcObject = mediaStream;
        video.setAttribute('playsinline', true);
        video.onloadedmetadata = (e) => {
          video.play();
          resolve(video);
        }
      }).catch(err => {
        reject(err);
      });
    }
  );
}

// USAGE:
// const video = await accessWebcam(document.querySelector('video'));
// or via promises:
// accessWebcam(document.querySelector('video')).then(video => { ... });

Note that getUserMedia only works in a secure context, i.e. the page has to be served via HTTPS (or from localhost).

Provide a Fallback Image

If the access to the webcam is blocked by the user, or no webcam is available, you can provide a fallback image to be used instead.

I also wrapped new Image() and its onload event into a promise.

function loadImage(url) {
  return new Promise((resolve, reject) => {
    const img = new Image();
    img.crossOrigin = 'Anonymous';
    img.onload = () => {
      resolve(img);
    };
    img.onerror = () => {
      reject(img);
    };
    img.src = url;
  });
}
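Since both helpers return promises, the webcam/fallback decision can be expressed as a generic catch-based fallback. A minimal sketch (withFallback is a made-up helper; in the demo, primary would be accessWebcam(video) and fallback loadImage(url)):

```javascript
// Generic promise fallback: try the primary source, use the fallback if it fails
// (e.g. when the user blocks webcam access or no camera is available).
function withFallback(primary, fallback) {
  return primary().catch(() => fallback());
}
```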

Putting it all together

To make things a little easier, I've put commonly used WebGL functions into a tiny helper library I made: GLea.

It initializes the WebGL context, compiles the WebGL shader code and creates attributes and buffers for the vertex shader:

By default, a position attribute is provided for the vertex shader, with a buffer containing 4 2D coordinates spanning a strip of 2 triangles that fills the whole screen.

import GLea from 'glea.js';

const frag = ` ... `; // fragment shader code
const vert = ` ... `; // vertex shader code
const glea = new GLea({
  shaders: [
    GLea.fragmentShader(frag),
    GLea.vertexShader(vert)
  ]
}).create();

function loop(time = 0) {
  const { gl, width, height } = glea;
  glea.clear();
  glea.uniV('resolution', [width, height]);
  glea.uni('time', time * 1e-3);
  gl.drawArrays(gl.TRIANGLE_STRIP, 0, 4);
  requestAnimationFrame(loop);
}

window.addEventListener('resize', () => {
  glea.resize();
});
loop(0);

Conclusion

That's basically it. I hope you enjoyed reading the article and got curious about exploring WebGL yourself.

If I haven't covered something, please feel free to comment =).

