jpp
November 13, 2025, 10:54am
I’m trying to use three.js with WebGPU instead of WebGL, but I’m having a problem with lights.
If I create light sources by calling THREE.AmbientLight() and THREE.DirectionalLight(), it works fine:
https://www.dei.isep.ipp.pt/~jpp/webGPU/cube1/cube.html
But if I create them via my own classes, AmbientLight (which extends THREE.AmbientLight) and DirectionalLight (which extends THREE.DirectionalLight):
https://www.dei.isep.ipp.pt/~jpp/webGPU/cube2/cube.html
the scene doesn’t get lit and I get the following warning:
three.core.js:1770 THREE.LightsNode.setupNodeLights: Light node not found for AmbientLight
I’m using the most recent three.js revision (r181).
Could you please help? Thanks in advance.
Mugen87
November 13, 2025, 10:12pm
You should find the solution for this issue here:
(GitHub issue, labels: Question, WebGPU; opened and closed 8 May 2025)
### Description
Sometimes we inherit from three.js's light classes to create a … subclass with additional custom functions. This works fine with WebGL, but causes issues with the WebGPU renderer.
The WebGPU renderer relies on the light's constructor to locate the corresponding light node, and when we create a new subclass via inheritance, this lookup breaks.
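The failure mode can be illustrated without three.js at all. The renderer keeps a registry that maps a light class to its light node class, and the lookup is keyed on the light's exact constructor, so a subclass (whose constructor is a different function object) is not found. A minimal sketch of that mechanism (the `Map`-of-strings registry here is an illustration, not three.js's actual code):

```javascript
// Stand-ins for THREE.AmbientLight and a user subclass of it.
class AmbientLight {}
class MyAmbientLight extends AmbientLight {}

// Registry keyed on the exact constructor, as in the WebGPU node library.
const lightNodes = new Map();
lightNodes.set(AmbientLight, 'AmbientLightNode');

// Lookup for the base class succeeds...
const base = lightNodes.get(new AmbientLight().constructor);

// ...but the subclass's constructor is MyAmbientLight, not AmbientLight,
// so the same lookup misses, even though the object IS an AmbientLight.
const sub = lightNodes.get(new MyAmbientLight().constructor);
```

This is why the warning says the light node was "not found": `instanceof` would match the subclass, but an exact-constructor map lookup does not.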
Is this considered a bug?
### Version
175
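The practical takeaway from that issue is to register the subclass with the renderer's node library so the exact-constructor lookup succeeds again. If I recall the API correctly, recent WebGPU builds expose this as `renderer.library.addLight( lightNodeClass, lightClass )`, e.g. `renderer.library.addLight( AmbientLightNode, MyAmbientLight )` with `AmbientLightNode` imported from `three/webgpu` — please verify the names against r181. A dependency-free sketch of the registration idea:

```javascript
// Stand-ins for THREE.AmbientLight and a user subclass of it.
class AmbientLight {}
class MyAmbientLight extends AmbientLight {}

// Registry keyed on the exact constructor, as in the WebGPU node library.
const lightNodes = new Map([[AmbientLight, 'AmbientLightNode']]);

// Fix: register the subclass under its own constructor, mapping it to the
// same light node as its base class. This mirrors what a call like
// renderer.library.addLight( AmbientLightNode, MyAmbientLight ) would do.
lightNodes.set(MyAmbientLight, 'AmbientLightNode');

// The exact-constructor lookup now succeeds for the subclass too.
const node = lightNodes.get(new MyAmbientLight().constructor);
```

With the subclass registered, the renderer can resolve the light node and the "Light node not found" warning goes away.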