I'm suspicious of "DC dimming". If you could just lower the current to an LED to dim it, everyone would. Someone will know better than me, but I believe there is a kind of threshold voltage for a (solid-state) LED.
I am not aware of LED bulbs (and here I am talking about home lighting, not phones or laptops) that dim by shutting down some of the (multiple) LEDs.
Most home lighting bulbs appear to have several LED elements. A circuit could enable dimming by simply shutting some of them off — running the rest full-on. 50% dim would of course shut half the LEDs off. No PWM required.
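A quick sketch of the idea above: with N identical LED elements that are each either full-on or off (no PWM), only N+1 discrete brightness steps are available. The numbers here are illustrative, not from any real bulb.

```python
# Dimming by switching whole LED elements off gives only coarse,
# discrete brightness levels (assumes identical elements, no PWM).
def discrete_dim_levels(n_leds):
    """Fraction of full output achievable by running k of n_leds full-on."""
    return [k / n_leds for k in range(n_leds + 1)]

# With 4 elements the only options are 0%, 25%, 50%, 75%, and 100%.
levels = discrete_dim_levels(4)
```

This also shows the limitation: smooth dimming would need either many elements or some other mechanism on top.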
DC dimming LEDs is relatively easy, and somewhat common. The problem is that it's expensive compared to PWM dimming. It requires more expensive current-adjustable circuitry.
Additionally, bulbs used in regular household fixtures basically need a way to convert TRIAC-chopped 50/60 Hz AC into constant current... which makes things even more expensive. Smart bulbs that are supplied constant, non-chopped AC can do it more easily, but DC dimming is still expensive.
I guess there is some threshold below which the LED turns off, so the voltage/current -> light function needs to be set accordingly.
When I was in high school we were messing around with liquid nitrogen and overvolting LEDs, and noticed the odd effect that the color of the LED would change when you overvolt it. It was years before I found out why.
Voltage, yes. Current, no not really. You can drive extremely low currents and still get photon emissions from LEDs. That said, it's highly non-linear, so you basically need to assign set points. Doubling the current won't double the lumen output.
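The set-point idea above can be sketched as a lookup table with interpolation. The current-to-lumen numbers below are purely illustrative assumptions, not from any datasheet; the point is just that doubling the current does not double the output.

```python
# Hypothetical calibration set points: (current in mA, relative lumen
# output). Illustrative values only; real LEDs need measured data.
SET_POINTS = [(1, 0.04), (5, 0.15), (20, 0.45), (50, 0.80), (100, 1.00)]

def relative_output(current_ma):
    """Linearly interpolate relative lumen output between set points."""
    if current_ma <= SET_POINTS[0][0]:
        return SET_POINTS[0][1]
    for (i0, l0), (i1, l1) in zip(SET_POINTS, SET_POINTS[1:]):
        if current_ma <= i1:
            return l0 + (l1 - l0) * (current_ma - i0) / (i1 - i0)
    return SET_POINTS[-1][1]
```

With these (made-up) numbers, going from 50 mA to 100 mA only takes you from 80% to 100% of full output, i.e. far from linear.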
You can just lower the current. Not everyone does because it generally requires more expensive components, e.g. inductors. There is a threshold voltage ("forward voltage") needed for LEDs to turn on but there's no threshold for minimum radiant flux. LEDs are actually more efficient at low current (although this might be counteracted by greater losses in the power supply).
You can dim LEDs running on DC (it takes more than a potentiometer, I'd guess; probably a buck circuit controlled by a pot) or on AC. I have scant idea how the AC ones work, although variacs have existed for a real long time. Either way, you have to buy special LED bulbs that can handle being on a dimming circuit.
This is different from a bulb like a Hue etc. that has the ability to dim itself through whatever mechanism.
Traditional dimmers used TRIACs. Those don't dim LEDs well; they produce very visible flicker. TRIACs turn the AC off for part of the waveform, essentially a very slow version of PWM. With an incandescent filament that flicker isn't as noticeable, since the filament takes some time to cool down and stop glowing, which visibly smooths the flicker; it just stabilizes around a lower temperature. With LEDs, the turn-off is nearly instant, so you visibly see the flicker at the AC mains frequency.
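The phase-cut behavior described above has a simple closed form for a resistive (incandescent) load: if the TRIAC blocks the first part of each half-cycle up to firing angle alpha, the delivered power fraction is 1 - alpha/pi + sin(2*alpha)/(2*pi). A small sketch:

```python
import math

# Power fraction delivered by a leading-edge TRIAC dimmer into a purely
# resistive load, as a function of firing angle alpha (radians, 0..pi).
# Derived from integrating sin^2 over the conducting part of a half-cycle.
def phase_cut_power_fraction(alpha):
    return 1 - alpha / math.pi + math.sin(2 * alpha) / (2 * math.pi)

# Firing 90 degrees into each half-cycle delivers exactly half power.
half_power = phase_cut_power_fraction(math.pi / 2)
```

An LED bulb behind such a dimmer sees this chopped waveform, which is why "dimmable" LED bulbs need extra circuitry to interpret it.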
There are two ways to dim an LED: supply less current at the same voltage, or PWM dim it with a fast enough switching speed that you don't notice the flicker (this being slower than it needs to be is what the article is about). A current source is pretty easy to build, and doesn't flicker, but it does dissipate all the excess energy as heat. That's not what you want inside the dimmer switch in your wall, it can be quite a lot of heat and would be a fire hazard in such a confined area. It does work for things like photography lamps which can have exterior heat sinking.
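A back-of-envelope illustration of the heat problem with a dissipative (linear) current source, using made-up example values: everything not dropped across the LEDs is burned off in the pass element.

```python
# Waste heat in a linear (dissipative) current source: the voltage not
# taken up by the LED string is dropped across the regulator at full
# LED current. Values below are illustrative assumptions.
def linear_regulator_heat(v_supply, v_led_string, current_a):
    """Watts dissipated in the pass element of a linear current source."""
    return (v_supply - v_led_string) * current_a

# e.g. a 24 V supply, 9 V of LEDs, 1 A drive: 15 W of heat to get rid of.
heat_w = linear_regulator_heat(24.0, 9.0, 1.0)
```

Fifteen watts is trivial for a photography lamp with a heat sink, and a real problem inside a sealed wall box.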
> but it does dissipate all the excess energy as heat.
No. That's only true for a linear regulator, which is just one, very terrible, implementation of a current source that's only used for very low power applications. Linear regulators are never used for things like room illumination.
The alternative, and what's used for all commercially available DC LED drivers (plentiful and cheap), is to just use a regular AC->DC switching supply in current mode (current feedback rather than voltage feedback). The only flicker is the ripple left in the filtered output.
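A toy sketch of the current-mode idea: the control loop regulates measured LED current toward a setpoint instead of regulating output voltage. The proportional gain and the crude duty-to-current model below are illustrative assumptions, not a real converter design.

```python
# One step of a toy current-mode control loop: nudge the switch duty
# cycle until sensed LED current matches the dimming setpoint.
def current_mode_step(duty, i_sensed, i_setpoint, gain=0.05):
    """Proportional update of duty cycle, clamped to [0, 1]."""
    duty += gain * (i_setpoint - i_sensed)
    return min(1.0, max(0.0, duty))

# Crude plant model (assumption): LED current proportional to duty.
duty, amps_per_duty = 0.0, 0.7
for _ in range(200):
    duty = current_mode_step(duty, amps_per_duty * duty, i_setpoint=0.35)
# The loop settles where amps_per_duty * duty is approximately 0.35 A.
```

Dimming is then just changing `i_setpoint`; the LED stays continuously lit, so there is no PWM-style flicker, only output ripple.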
Why aren't these used? Because most dimmer switches use tech from the incandescent age and just chop off parts of the AC sine wave, so the bulbs are designed around the switches you can buy in the store. And why do dimmer switches chop? Because that's what the bulbs you can buy at the store expect; dimming them any other way can sometimes damage them.
You can buy in-wall DC dimmer switches from any LED supply store, but they require DC lighting, also only found at LED supply stores. It's entirely a momentum problem, and a recent one that's slowly going away.
You're correct. You can't use a linear regulator for dimming. A current-mode switching DC/DC converter works, but needs sufficient filtering (or high enough switching frequency) to avoid the flicker issue.
Linear regulators are in fact used for room lighting, and efficiency can be reasonably good. Typical design is AC input -> bridge rectifier -> passive low-pass filter -> long string of LEDs with a single linear regulator in series. Voltage drop across the regulator is much lower than across the string of LEDs so there's not a whole lot of heat generated.
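With illustrative numbers (a roughly 160 V filtered DC bus and a string of fifty 3 V LEDs, i.e. 150 V of string), the efficiency of that series-regulator design can be sketched as:

```python
# Efficiency of the "long LED string + series linear regulator" design:
# since the same current flows through both, efficiency is just the
# fraction of the bus voltage dropped across the LEDs themselves.
# Bus and string voltages below are illustrative assumptions.
def string_regulator_efficiency(v_bus, v_string):
    return v_string / v_bus

# 150 V of LEDs on a 160 V bus: ~94% efficient, only ~6% lost as heat.
eff = string_regulator_efficiency(160.0, 150.0)
```

The key design choice is making the string voltage as close to the bus voltage as the regulator's required headroom allows.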
You can dim LEDs running on AC by converting to DC and then adjusting the current limit of the switching power supply. No flicker, but more expensive components.
It takes one MOSFET to turn an LED on/off from an MCU GPIO, but if you want to do DC dimming, you now have to either add more passive components or turn to a special IC; both cost more.
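A tiny sketch of why the one-MOSFET PWM route is so cheap: the average LED current (and hence perceived brightness) just scales with the duty cycle, so the MCU only needs to toggle one pin fast enough to hide the flicker.

```python
# Time-averaged LED current under PWM: full on-current scaled by the
# fraction of time the MOSFET is switched on. Example values are
# illustrative, not from any particular part.
def pwm_average_current(i_on_ma, duty):
    """Average LED current (mA) for a given on-current and duty cycle."""
    return i_on_ma * duty

# 20 mA when on, at 25% duty: 5 mA average.
avg_ma = pwm_average_current(20.0, 0.25)
```

No current-setting components are needed beyond the usual series resistor; all the "dimming" lives in firmware timing.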