
I'm using https://webgpu.github.io/webgpu-samples/.

I want to integrate a simple texture into the https://webgpu.github.io/webgpu-samples/?sample=shadowMapping example.

The texture loader, loadTex0 (it fills this.texture0):

  async loadTex0(texturesPaths, device) {
    // Fetch the image and decode it into an ImageBitmap.
    // An async function already returns a Promise, so no
    // explicit `new Promise` wrapper is needed.
    const response = await fetch(texturesPaths[0]);
    const imageBitmap = await createImageBitmap(await response.blob());

    this.texture0 = device.createTexture({
      size: [imageBitmap.width, imageBitmap.height, 1],
      format: 'rgba8unorm',
      usage:
        GPUTextureUsage.TEXTURE_BINDING |
        GPUTextureUsage.COPY_DST |
        GPUTextureUsage.RENDER_ATTACHMENT,
    });

    // Upload the decoded pixels into the GPU texture.
    device.queue.copyExternalImageToTexture(
      {source: imageBitmap},
      {texture: this.texture0},
      [imageBitmap.width, imageBitmap.height]
    );
  }
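For reference, this is roughly how I call it before building the pipeline and bind groups (the init wrapper and the texture path are my own, not part of the sample):

  // Hypothetical call site: the texture must be fully loaded
  // before any bind group that references this.texture0 is created.
  async init(device) {
    const texturesPaths = ['./assets/img/brick.png']; // example path
    await this.loadTex0(texturesPaths, device);
    // ...now it is safe to create bglForRender, the bind groups, etc.
  }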

To begin, I just put this into the render bind group:

      this.sceneBindGroupForRender = this.device.createBindGroup({
        layout: this.bglForRender,
        entries: [
          {
            binding: 0,
            resource: {
              buffer: this.sceneUniformBuffer,
            },
          },
          {
            binding: 1,
            resource: this.shadowDepthTextureView,
          },
          {
            binding: 2,
            resource: this.device.createSampler({
              compare: 'less',
            }),
          },
          // ADDED: the new texture entry
          {
            binding: 3,
            resource: this.texture0.createView(),
          },
        ],
      });

The shaders also need to be updated... I'm not sure whether it is possible to somehow inject the texture into the existing texture_depth_2d bindings:

@group(0) @binding(1) var shadowMap: texture_depth_2d;
@group(0) @binding(2) var shadowSampler: sampler_comparison;

Or do I need to add a new texture sampler and texture, and then in the main function do something like color.rgb += textureColor? Here is my attempt at that second option:

      this.sceneBindGroupForRender = this.device.createBindGroup({
        layout: this.bglForRender,
        entries: [
          {
            binding: 0,
            resource: {
              buffer: this.sceneUniformBuffer,
            },
          },
          {
            binding: 1,
            resource: this.shadowDepthTextureView,
          },
          {
            binding: 2,
            resource: this.device.createSampler({
              compare: 'less',
            }),
          },
          {
            binding: 3,
            resource: this.sampler, // a 'filtering' sampler created elsewhere
          },
          {
            binding: 4,
            resource: this.texture0.createView(),
          },
        ],
      });
And the updated fragment shader:

export const fragmentWGSL = `override shadowDepthTextureSize: f32 = 1024.0;

struct Scene {
  lightViewProjMatrix : mat4x4f,
  cameraViewProjMatrix : mat4x4f,
  lightPos : vec3f,
}

@group(0) @binding(0) var<uniform> scene : Scene;
@group(0) @binding(1) var shadowMap: texture_depth_2d;
@group(0) @binding(2) var shadowSampler: sampler_comparison;

@group(0) @binding(3) var meshSampler: sampler;
@group(0) @binding(4) var meshTexture: texture_2d<f32>;

struct FragmentInput {
  @location(0) shadowPos : vec3f,
  @location(1) fragPos : vec3f,
  @location(2) fragNorm : vec3f,
}

const albedo = vec3f(0.9);
const ambientFactor = 0.2;

@fragment
fn main(input : FragmentInput) -> @location(0) vec4f {
  // Percentage-closer filtering. Sample texels in the region
  // to smooth the result.
  var visibility = 0.0;
  let oneOverShadowDepthTextureSize = 1.0 / shadowDepthTextureSize;
  for (var y = -1; y <= 1; y++) {
    for (var x = -1; x <= 1; x++) {
      let offset = vec2f(vec2(x, y)) * oneOverShadowDepthTextureSize;

      visibility += textureSampleCompare(
        shadowMap, shadowSampler,
        input.shadowPos.xy + offset, input.shadowPos.z - 0.007
      );
    }
  }
  visibility /= 9.0;

  let lambertFactor = max(dot(normalize(scene.lightPos - input.fragPos), normalize(input.fragNorm)), 0.0);
  let lightingFactor = min(ambientFactor + visibility * lambertFactor, 1.0);

  // For now, sample the mesh texture with the shadow-map coordinates as UVs.
  let textureColor = textureSample(meshTexture, meshSampler, input.shadowPos.xy);

  return vec4(textureColor.rgb * lightingFactor * albedo, 1.0);
}`

Log:

Binding doesn't exist in [BindGroupLayout].
 - While validating that the entry-point's declaration for @group(0) @binding(3) matches [BindGroupLayout]
 - While validating the entry-point's compatibility for group 0 with [BindGroupLayout]
 - While validating fragment stage ([ShaderModule], entryPoint: ).
 - While validating fragment state.
 - While calling [Device].CreateRenderPipeline([RenderPipelineDescriptor]).

Last log:

The sample type in the shader is not compatible with the sample type of the layout.

None of the supported sample types (Float|UnfilterableFloat) of [Texture] match the expected sample types (Depth).
 - While validating entries[3] as a Texture.
Expected entry layout: { binding: 3, visibility: ShaderStage::(Vertex|Fragment), texture: { sampleType: TextureSampleType::Depth, viewDimension: TextureViewDimension::e2D, multisampled: 0 } }
 - While validating [BindGroupDescriptor] against [BindGroupLayout]
 - While calling [Device].CreateBindGroup([BindGroupDescriptor]).

Any suggestions?

1 Answer

I followed the warnings and fixed the types in the bind group layout call. In the WebGPU spec (https://www.w3.org/TR/webgpu/) I found:

enum GPUSamplerBindingType {
    "filtering",
    "non-filtering",
    "comparison",
};
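The matching texture sample types are defined in the same spec:

enum GPUTextureSampleType {
    "float",
    "unfilterable-float",
    "depth",
    "sint",
    "uint",
};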

So the sampler needs type 'filtering' and the texture needs sampleType 'float'. The corrected layout entries look like this:

          {
            binding: 3,
            visibility: GPUShaderStage.VERTEX | GPUShaderStage.FRAGMENT,
            sampler: {
              type: 'filtering',
            },
          },
          {
            binding: 4,
            visibility: GPUShaderStage.VERTEX | GPUShaderStage.FRAGMENT,
            texture: {
              sampleType: 'float',
            },
          },
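For completeness, here is roughly what the whole bglForRender layout looks like with these entries added. The first three entries are reconstructed from the original shadowMapping sample and the validation message above, so treat this as a sketch:

      this.bglForRender = this.device.createBindGroupLayout({
        entries: [
          {
            binding: 0,
            visibility: GPUShaderStage.VERTEX | GPUShaderStage.FRAGMENT,
            buffer: { type: 'uniform' },
          },
          {
            binding: 1,
            visibility: GPUShaderStage.VERTEX | GPUShaderStage.FRAGMENT,
            texture: { sampleType: 'depth' },
          },
          {
            binding: 2,
            visibility: GPUShaderStage.VERTEX | GPUShaderStage.FRAGMENT,
            sampler: { type: 'comparison' },
          },
          // New entries for the mesh texture:
          {
            binding: 3,
            visibility: GPUShaderStage.VERTEX | GPUShaderStage.FRAGMENT,
            sampler: { type: 'filtering' },
          },
          {
            binding: 4,
            visibility: GPUShaderStage.VERTEX | GPUShaderStage.FRAGMENT,
            texture: { sampleType: 'float' },
          },
        ],
      });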

In the shader I use input.shadowPos.xy as a vec2f to satisfy textureSample:

let textureColor = textureSample(meshTexture, meshSampler, input.shadowPos.xy);

And it works fine!
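One caveat: input.shadowPos.xy is the fragment's position in light (shadow-map) space, not a real UV for the mesh, so the texture is effectively projected from the light's point of view. For proper texturing you would forward a dedicated UV attribute from the vertex shader, along these lines (a sketch; it assumes your vertex buffer carries a uv attribute):

struct FragmentInput {
  @location(0) shadowPos : vec3f,
  @location(1) fragPos : vec3f,
  @location(2) fragNorm : vec3f,
  @location(3) fragUV : vec2f, // forwarded from the vertex shader
}

// ...and in main():
// let textureColor = textureSample(meshTexture, meshSampler, input.fragUV);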
