I understand that this would be true for any other regular LED.
The doc states that it accepts +3.5~+5.3 V.
I was wondering if the drive is somehow internally regulated so that it operates at the same level as long as the power supply is within the nominal range. Will 0xAA0000 at 3.5 V give the same brightness as 0xAA0000 at 5 V?
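For reference, 0xAA0000 here is a 24-bit packed color value, one byte per channel. A minimal sketch of unpacking it (plain Python, no LED library assumed — wire order on the actual strip may differ, e.g. GRB on many addressable LEDs):

```python
def unpack_rgb(color):
    # Split a 24-bit 0xRRGGBB value into per-channel bytes (0-255).
    r = (color >> 16) & 0xFF
    g = (color >> 8) & 0xFF
    b = color & 0xFF
    return r, g, b

print(unpack_rgb(0xAA0000))  # (170, 0, 0): red at 170/255, green and blue off
```

So the question is whether that 170/255 red level produces the same light output regardless of where the supply sits in the 3.5-5.3 V range.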
2 Answers
I'm not sure whether it's too optimistic to assume the chip designers were aware of the importance of driving LEDs with constant current. Anyway, from the datasheet:
It mentions a programmable constant-current control to ensure consistent brightness (I don't understand the 12 V part, though; maybe a typo?).
Comment: I guess the only way to know is probably to test it. — somerandomusername, May 17, 2024 at 9:07
The ones I have used have always changed brightness with supply voltage. As the other answer notes, the datasheet discusses this, but it's not clear exactly what it means. I think it could mean that the part maintains equivalent brightness across the red, green and blue LEDs, so that the output appears white over the permissible supply voltage range.
For an empirical view of this, I wrote an article about my recent testing from 2.3 V to 5.0 V: Instructables: Testing an RGB LED Matrix With Different Supply Voltages. I've included a few screenshots of the testing below, which show the voltage across the LEDs on the left and the current on the right. These use 0x3a3a3a on 13 (of the 64) pixels at 2.3 V (out of spec; the blue LED's forward voltage can't be overcome), 3.3 V (out of spec), 4.0 V and 5.0 V. You can see the current (and brightness) changes significantly across all of these values. The current figures include the microcontroller etc. The camera was set to a fixed exposure and daylight white balance.
[Screenshots from the testing: LED voltage (left) and current (right) at 2.3 V, 3.3 V, 4.0 V and 5.0 V]
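If brightness really does track supply voltage, one software workaround is to scale the color values by a per-voltage correction factor measured empirically. A hypothetical sketch — the table values below are placeholders, not measurements from the article:

```python
# Placeholder relative-brightness factors (1.0 = brightness at 5.0 V).
# In practice you would measure these for your own strip and supply.
RELATIVE_BRIGHTNESS = {5.0: 1.00, 4.0: 0.80, 3.3: 0.55}

def compensate(color, vsupply, table=RELATIVE_BRIGHTNESS):
    """Scale a 24-bit 0xRRGGBB value up to offset a dimmer supply."""
    scale = 1.0 / table[vsupply]
    channels = [(color >> shift) & 0xFF for shift in (16, 8, 0)]
    r, g, b = [min(255, round(c * scale)) for c in channels]
    return (r << 16) | (g << 8) | b

print(hex(compensate(0x808080, 4.0)))  # 0x80 * 1.25 = 0xa0 per channel
```

This only helps below full brightness, of course; channels already near 255 clip, and it does nothing about a forward voltage the supply can't reach.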