
As far as I know, it is possible to use a Uint8Array as a texture in WebGL. But how can a large Float32Array or Float64Array be passed to the shader efficiently? The float values are not in the (0.0, 1.0) range, and may be negative too.

I know some devices do not support high-precision floats, and it is not a problem if some precision is lost in the process.


1 Answer


Try to enable floating point textures with code like

var ext = gl.getExtension("OES_texture_float");
if (!ext) {
  alert("sorry, no floating point textures");
  return;
}

Now you can create textures with Float32Array using

gl.texImage2D(target, level, internalformat, width, height, 0, 
              format, gl.FLOAT, someFloat32Array);
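To make this concrete, here is a minimal sketch of laying a flat Float32Array out as an RGBA float texture. The helper name `textureSizeForFloats` is hypothetical (not from the answer above); it just captures the constraint that the data array must hold exactly width × height × 4 floats when the format is gl.RGBA:

```javascript
// Hypothetical helper: pick texture dimensions that can hold `count`
// floats, with 4 floats stored per RGBA texel.
function textureSizeForFloats(count) {
  const texels = Math.ceil(count / 4);         // 4 floats per RGBA texel
  const width = Math.ceil(Math.sqrt(texels));  // roughly square texture
  const height = Math.ceil(texels / width);
  return { width, height, length: width * height * 4 };
}

// Sketch of the upload with a WebGL1 context (requires OES_texture_float;
// in WebGL1 internalformat must match format, here gl.RGBA):
//
// const { width, height, length } = textureSizeForFloats(myFloats.length);
// const data = new Float32Array(length);      // zero-padded to fit
// data.set(myFloats);
// gl.bindTexture(gl.TEXTURE_2D, tex);
// gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, width, height, 0,
//               gl.RGBA, gl.FLOAT, data);
```

Float64Array has no texture format in WebGL, so double data would need to be converted to a Float32Array first (which matches the acceptable precision loss mentioned in the question).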

Note that being able to filter a floating point texture is a separate extension, so if you need filtering you'll have to check for that too:

var ext = gl.getExtension("OES_texture_float_linear");
if (!ext) {
  alert("sorry, can't filter floating point textures");
  return;
}

Otherwise you need to set the texture filtering to NEAREST.
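For example, a small sketch of disabling filtering (the wrapper function is just for illustration; the two texParameteri calls are the standard WebGL1 API, applied to the currently bound texture):

```javascript
// Disable filtering so the float texture is usable without
// OES_texture_float_linear. Assumes the texture is already bound.
function setNearestFiltering(gl) {
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.NEAREST);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.NEAREST);
}
```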

Note: in WebGL2 you can read from floating point textures by default, but you must use a sized internal format such as gl.RGBA32F. Filtering is still optional, so you still need to try to enable OES_texture_float_linear.

To render to them (i.e. attach them to a framebuffer), in WebGL1 you need to check for WEBGL_color_buffer_float; in WebGL2 you need to check for EXT_color_buffer_float.
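If OES_texture_float turns out to be unavailable, one common fallback (an assumption on my part, not part of the answer above) is to pack each float into the four bytes of an RGBA texel in a Uint8Array texture and decode it in the shader. A simple sign / biased-exponent / 16-bit-mantissa scheme looks like this; the same arithmetic would be mirrored in GLSL to decode. It handles negative values and values outside (0.0, 1.0), at the cost of some precision, which the question says is acceptable:

```javascript
// Pack a float into 4 bytes: [sign, biased exponent, mantissa hi, mantissa lo].
// Values with |x| outside roughly 2^-128..2^127 lose accuracy (exponent is
// clamped to one byte). This is a sketch, not IEEE 754 encoding.
function packFloat(value) {
  if (value === 0) return [0, 0, 0, 0];
  const sign = value < 0 ? 1 : 0;
  const abs = Math.abs(value);
  // biased base-2 exponent, clamped to one byte
  const exp = Math.max(0, Math.min(255, Math.floor(Math.log2(abs)) + 128));
  // mantissa in [1, 2), stored as 16 bits above the implicit leading 1
  const mant = Math.round((abs / Math.pow(2, exp - 128) - 1) * 32768);
  return [sign, exp, (mant >> 8) & 255, mant & 255];
}

// Inverse of packFloat; in practice this math runs in the fragment shader.
function unpackFloat(bytes) {
  const [sign, exp, hi, lo] = bytes;
  if (sign === 0 && exp === 0 && hi === 0 && lo === 0) return 0;
  const mant = 1 + ((hi << 8) | lo) / 32768;
  const value = mant * Math.pow(2, exp - 128);
  return sign ? -value : value;
}
```

With 16 mantissa bits the relative round-trip error is below about 2^-16, comparable to a half-float, which fits the "some precision loss is fine" constraint.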
