
I'm trying to read a depth value from an FBO through gl.readPixels.

const canvas = document.getElementById("glcanvas");
const gl = canvas.getContext("webgl2");
 
const width = 64;
const height = 64;

const fbo = gl.createFramebuffer();
gl.bindFramebuffer(gl.FRAMEBUFFER, fbo);

// COLOR ATTACHMENT
const color = gl.createTexture();
gl.bindTexture(gl.TEXTURE_2D, color);
gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA8, width, height, 0, gl.RGBA, gl.UNSIGNED_BYTE, null);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.NEAREST);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.NEAREST);
gl.framebufferTexture2D(gl.FRAMEBUFFER, gl.COLOR_ATTACHMENT0, gl.TEXTURE_2D, color, 0);

// DEPTH ATTACHMENT
const depth = gl.createTexture();
gl.bindTexture(gl.TEXTURE_2D, depth);
gl.texImage2D(gl.TEXTURE_2D, 0, gl.DEPTH_COMPONENT32F, width, height, 0, gl.DEPTH_COMPONENT, gl.FLOAT, null);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.NEAREST);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.NEAREST);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_COMPARE_MODE, gl.NONE);
gl.framebufferTexture2D(gl.FRAMEBUFFER, gl.DEPTH_ATTACHMENT, gl.TEXTURE_2D, depth, 0);

// REQUIRED ON SOME PLATFORMS
gl.drawBuffers([gl.COLOR_ATTACHMENT0]);

// SANITY CHECK: the FBO is complete at this point
console.assert(gl.checkFramebufferStatus(gl.FRAMEBUFFER) === gl.FRAMEBUFFER_COMPLETE);


// CLEAR DEPTH + COLOR
gl.clearDepth(0.5);
gl.clearColor(0, 0, 0, 1);
gl.clear(gl.COLOR_BUFFER_BIT | gl.DEPTH_BUFFER_BIT);

// READ DEPTH PIXEL
const depthPixel = new Float32Array(1);
gl.readPixels(0, 0, 1, 1, gl.DEPTH_COMPONENT, gl.FLOAT, depthPixel);

I have tried this minimal example on a range of graphics cards and browsers, and gl.readPixels invariably fails with:

WebGL: INVALID_ENUM: readPixels: invalid format
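For reference, WebGL2 only guarantees readPixels with RGBA/UNSIGNED_BYTE from a color attachment, plus one implementation-chosen format/type pair per framebuffer; DEPTH_COMPONENT is not in the allowed list, so the INVALID_ENUM appears to be conforming behavior. A small sketch of how the extra pair can be queried (queryReadFormat is a hypothetical helper; it needs a live WebGL2 context and a bound framebuffer to do anything useful):

```javascript
// Sketch: query the one extra readPixels format/type combination the
// implementation supports for the currently bound framebuffer.
// RGBA/UNSIGNED_BYTE always works for color; DEPTH_COMPONENT never does.
function queryReadFormat(gl) {
  return {
    format: gl.getParameter(gl.IMPLEMENTATION_COLOR_READ_FORMAT),
    type: gl.getParameter(gl.IMPLEMENTATION_COLOR_READ_TYPE),
  };
}
```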

Moreover, on some rather modern cards, even gl.getExtension("WEBGL_depth_texture") returns null.
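As an aside, WEBGL_depth_texture is a WebGL1-era extension; WebGL2 folds depth textures into core, and browsers generally do not re-expose promoted extensions on a WebGL2 context, so getExtension returning null there seems expected rather than a driver quirk. A hedged capability check could look like this (depthTexturesSupported is a hypothetical helper, and the instanceof test assumes a browser-like host):

```javascript
// Sketch: depth textures are core in WebGL2; only WebGL1 needs the extension.
function depthTexturesSupported(gl) {
  if (typeof WebGL2RenderingContext !== "undefined" &&
      gl instanceof WebGL2RenderingContext) {
    return true; // core feature, no extension required
  }
  return gl.getExtension("WEBGL_depth_texture") !== null;
}
```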

Is this behavior known/expected?

I might be suffering from false memories, but I remember being able to read depth from the backbuffer back in 2012, on WebGL 1.0, with no issues on all kinds of platforms and GPUs.

  • To my knowledge this is illegal/unsupported, yes. I asked a similar question in 2021: stackoverflow.com/questions/67710054 . However, you can still read the depth values through an additional render pass, where you read the depth texture from a shader and write it into a colour output. That output you can then read into JS using readPixels. (Commented Jun 16 at 9:12)
  • @Berthur Right, it seems we've come full circle: I remember when, with WebGL 1.0, it was simply impossible to read from the depth buffer, and the trick was rendering depth to a separate render target. Then an extension allowed it. And now, with WebGL 2.0, we're back to unreadable depth, although it's in the spec. Too bad. (Commented Jun 17 at 10:34)
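The extra-pass workaround described in the comments can be sketched roughly as follows: a fullscreen pass samples the depth texture and packs the [0, 1) depth value into the four bytes of an RGBA8 color target, which readPixels can legally fetch as RGBA/UNSIGNED_BYTE. The packing constants and helper names (packFragSrc, packDepth, unpackDepth) are illustrative assumptions, and the program/quad wiring is omitted; this is a sketch, not a tested implementation:

```javascript
// Fragment shader for the extra pass: read the depth texture and pack the
// [0,1) depth value into the 4 bytes of an RGBA8 color attachment.
const packFragSrc = `#version 300 es
precision highp float;
uniform sampler2D uDepth;
out vec4 outColor;
void main() {
  float d = texelFetch(uDepth, ivec2(gl_FragCoord.xy), 0).r;
  vec4 enc = fract(vec4(1.0, 255.0, 65025.0, 16581375.0) * d);
  enc -= enc.yzww * vec4(1.0 / 255.0, 1.0 / 255.0, 1.0 / 255.0, 0.0);
  outColor = enc;
}`;

// JS mirror of the shader's packing, for depth values in [0, 1)
// (handy for testing the round trip without a GL context).
function packDepth(d) {
  const enc = [d, d * 255.0, d * 65025.0, d * 16581375.0]
    .map(x => x - Math.floor(x)); // fract()
  enc[0] -= enc[1] / 255.0;
  enc[1] -= enc[2] / 255.0;
  enc[2] -= enc[3] / 255.0;
  return enc.map(x => Math.round(x * 255)); // what lands in the RGBA8 bytes
}

// Decode the 4 bytes returned by readPixels(RGBA, UNSIGNED_BYTE) back to depth.
function unpackDepth([r, g, b, a]) {
  return r / 255 + g / 65025 + b / 16581375 + a / 4228250625;
}
```

After drawing a fullscreen quad with this shader into the RGBA8 FBO, gl.readPixels(x, y, 1, 1, gl.RGBA, gl.UNSIGNED_BYTE, out) followed by unpackDepth(out) should recover the depth value; in a quick JS round trip the error stays well under 1e-5.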
