I'm trying to read a depth value back from an FBO with gl.readPixels.
const canvas = document.getElementById("glcanvas");
const gl = canvas.getContext("webgl2");
if (!gl) throw new Error("WebGL2 not available");
const width = 64;
const height = 64;
const fbo = gl.createFramebuffer();
gl.bindFramebuffer(gl.FRAMEBUFFER, fbo);
// COLOR ATTACHMENT
const color = gl.createTexture();
gl.bindTexture(gl.TEXTURE_2D, color);
gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA8, width, height, 0, gl.RGBA, gl.UNSIGNED_BYTE, null);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.NEAREST);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.NEAREST);
gl.framebufferTexture2D(gl.FRAMEBUFFER, gl.COLOR_ATTACHMENT0, gl.TEXTURE_2D, color, 0);
// DEPTH ATTACHMENT
const depth = gl.createTexture();
gl.bindTexture(gl.TEXTURE_2D, depth);
gl.texImage2D(gl.TEXTURE_2D, 0, gl.DEPTH_COMPONENT32F, width, height, 0, gl.DEPTH_COMPONENT, gl.FLOAT, null);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.NEAREST);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.NEAREST);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_COMPARE_MODE, gl.NONE);
gl.framebufferTexture2D(gl.FRAMEBUFFER, gl.DEPTH_ATTACHMENT, gl.TEXTURE_2D, depth, 0);
// REQUIRED ON SOME PLATFORMS
gl.drawBuffers([gl.COLOR_ATTACHMENT0]);
// CLEAR DEPTH + COLOR
gl.clearDepth(0.5);
gl.clearColor(0, 0, 0, 1);
gl.clear(gl.COLOR_BUFFER_BIT | gl.DEPTH_BUFFER_BIT);
// READ DEPTH PIXEL
const depthPixel = new Float32Array(1);
gl.readPixels(0, 0, 1, 1, gl.DEPTH_COMPONENT, gl.FLOAT, depthPixel);
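To rule out an incomplete framebuffer, I also log the FBO status right before the read. The statusToString helper below is just mine, mapping the WebGL2 framebuffer-status enum values (from the spec) to readable names:

```javascript
// Map a checkFramebufferStatus result to its enum name.
// Numeric values are the standard WebGL2/OpenGL ES 3.0 constants.
function statusToString(status) {
  const names = {
    0x8CD5: "FRAMEBUFFER_COMPLETE",
    0x8CD6: "FRAMEBUFFER_INCOMPLETE_ATTACHMENT",
    0x8CD7: "FRAMEBUFFER_INCOMPLETE_MISSING_ATTACHMENT",
    0x8CDD: "FRAMEBUFFER_UNSUPPORTED",
    0x8D56: "FRAMEBUFFER_INCOMPLETE_MULTISAMPLE",
  };
  return names[status] || "UNKNOWN(0x" + status.toString(16) + ")";
}

// In the repro above, right before readPixels:
// console.log(statusToString(gl.checkFramebufferStatus(gl.FRAMEBUFFER)));
```

So as far as I can tell the FBO setup itself is not the issue; the failure is specific to the readPixels call.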
I have tried this minimal example on a range of cards and browsers, and gl.readPixels invariably fails with:
WebGL: INVALID_ENUM: readPixels: invalid format
Moreover, on some fairly modern cards, even gl.getExtension("WEBGL_depth_texture") returns null.
Is this behavior known/expected?
I might be suffering from false memories, but I remember reading depth from the backbuffer as far back as 2012, on WebGL 1.0, with no issues across all kinds of platforms and GPUs.
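The only workaround I can think of is a fullscreen pass that packs the depth sample into the RGBA8 color attachment, followed by a plain gl.RGBA/gl.UNSIGNED_BYTE readPixels. To illustrate the round trip, here is a JS mirror of the usual fract-based GLSL packing together with the decode I would run on the bytes returned by readPixels (packDepth only exists to simulate what the shader would write; it is not part of the repro):

```javascript
// JS mirror of the classic GLSL packing:
//   vec4 enc = fract(depth * vec4(1.0, 255.0, 65025.0, 16581375.0));
//   enc -= enc.yzww * vec4(1.0/255.0, 1.0/255.0, 1.0/255.0, 0.0);
// Each channel then lands in one byte of the RGBA8 attachment.
// Note: depth exactly 1.0 packs to 0 with this scheme (fract(1.0) == 0).
function packDepth(depth) {
  const enc = [1, 255, 65025, 16581375].map(s => (depth * s) % 1);
  enc[0] -= enc[1] / 255;
  enc[1] -= enc[2] / 255;
  enc[2] -= enc[3] / 255;
  return enc.map(v => Math.round(v * 255)); // bytes as readPixels would return them
}

// Decode the 4 bytes from readPixels(..., gl.RGBA, gl.UNSIGNED_BYTE, ...) back to a float.
function unpackDepth(rgba) {
  return (rgba[0] + rgba[1] / 255 + rgba[2] / 65025 + rgba[3] / 16581375) / 255;
}
```

This works, but it costs an extra pass and a shader just to recover a value the driver already has, so I'd still like to know whether a direct depth read is supposed to be possible.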