My take is that PWM dimmers are dramatically more energy efficient than the old rheostat dimmers people used to use. A transistor operated digitally, either fully on or fully off, is close to 100% efficient; operated in linear mode at half output, it delivers half the drawn power to the load and dissipates the other half as heat, effectively acting as a resistor. That's why CMOS logic displaced bipolar logic, switching power supplies replaced linear power supplies, a Class D amplifier can be a fraction of the size of a Class A amplifier, etc.
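A back-of-the-envelope sketch of that argument in Python (the 12 V supply and 12 Ω load are made-up numbers, purely for illustration):

```python
# Hypothetical 12 V supply into a 12-ohm lamp, illustrating the
# efficiency argument above; all numbers are made up.
V_SUPPLY = 12.0      # volts
R_LOAD = 12.0        # ohms -> 12 W at full brightness

# Linear (rheostat-style) dimming to 50% load voltage:
# the pass element drops the other 6 V at the same current.
v_load = V_SUPPLY / 2
i = v_load / R_LOAD
p_load = v_load * i               # 3.0 W reaches the lamp
p_pass = (V_SUPPLY - v_load) * i  # 3.0 W wasted as heat
print(f"linear: {p_load:.1f} W out, {p_pass:.1f} W lost "
      f"({p_load / (p_load + p_pass):.0%} efficient)")

# PWM dimming at 50% duty cycle: the switch is either fully on
# (near-zero voltage drop) or fully off (near-zero current), so
# almost nothing is dissipated in the switch itself.
duty = 0.5
p_load_pwm = duty * V_SUPPLY**2 / R_LOAD  # 6.0 W average to the lamp
print(f"pwm: {p_load_pwm:.1f} W out, ~0 W lost in the switch")
```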
You could probably still reduce the flicker by either increasing the switching frequency or putting some kind of filter network between the switch and the load.
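As a rough sketch of the filter-network option, here's how you might size a simple LC low-pass so its cutoff sits well below the switching frequency (all component values are hypothetical):

```python
import math

# Hypothetical values: 1 kHz PWM, with an LC filter whose cutoff
# lands far below the switching frequency.
f_pwm = 1_000.0   # Hz, switching frequency
L = 100e-3        # henries
C = 220e-6        # farads

f_cutoff = 1 / (2 * math.pi * math.sqrt(L * C))
print(f"cutoff ~ {f_cutoff:.1f} Hz")  # ~34 Hz, far below 1 kHz

# A second-order filter rolls off at ~40 dB/decade, so the ripple
# at f_pwm is attenuated by roughly 40 * log10(f_pwm / f_cutoff) dB.
atten_db = 40 * math.log10(f_pwm / f_cutoff)
print(f"ripple attenuation at {f_pwm:.0f} Hz: ~{atten_db:.0f} dB")
```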
For sure, they're definitely way more efficient. They just unfortunately give me migraines. I'd be open to trying some that have a filter network or some other smoothing on the flicker.
But I've also never lived in a house that has dimmers (they've all been old homes in the northeastern US) and I never use overhead lighting, so it's not something I need or would miss.
Aside from that Wikipedia article, where one source is unavailable and the other is in Finnish, there's pretty much nothing online.
I googled "G4 LED tube PWM" and got products that say they are G4 LED tubes that use PWM.
Pretty sure 100% of LED products sold anywhere use PWM if you don't use them at full brightness. I sometimes walk around lighting stores with a slo-mo camera and see PWM in every price bracket.
It is always PWM under the hood; the question is how much was spent (or not) on the filtering network after the PWM stage. Is the output closer to a buck converter's smooth DC, or is it straight-up flicker?
Since these things have lots of LEDs, my first thought was to stagger them with a range of tiny delays to induce destructive interference, so that the off parts of one LED's flicker line up with the on parts of another's, smoothing out the overall output (a quick simulation of this is sketched below).
Actually that's not true, my first thought was "just use a layer of phosphor excited by the LEDs", but fluorescent tubes do that and people used to make the same complaints about flicker, so.
Looks like "flicker index" is a useful(?) search term, anyway.
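A minimal numerical sketch of that staggering idea, which also computes the percent flicker and flicker index metrics (the frequency, duty cycle, and LED counts are arbitrary):

```python
import numpy as np

def pwm(t, freq, duty, phase):
    """1.0 while the pulse is on, 0.0 while off."""
    return (((t * freq + phase) % 1.0) < duty).astype(float)

def flicker_metrics(wave):
    """Percent flicker and flicker index over the sampled waveform."""
    lo, hi, avg = wave.min(), wave.max(), wave.mean()
    percent = 100 * (hi - lo) / (hi + lo) if (hi + lo) > 0 else 0.0
    # Flicker index: area above the average level / total area.
    above = np.clip(wave - avg, 0, None).sum()
    index = above / wave.sum() if wave.sum() > 0 else 0.0
    return percent, index

t = np.linspace(0, 0.01, 100_000, endpoint=False)  # 10 ms window
freq, duty = 1_000.0, 0.3                          # 1 kHz, 30% brightness

# N identical PWM channels, each shifted by 1/N of a period, summed
# into one combined light output.
for n_leds in (1, 2, 4, 8):
    phases = np.arange(n_leds) / n_leds            # evenly staggered
    total = sum(pwm(t, freq, duty, p) for p in phases)
    percent, index = flicker_metrics(total)
    print(f"{n_leds} LEDs: percent flicker {percent:5.1f}%, "
          f"flicker index {index:.3f}")
```

With these parameters, 1 or 2 staggered LEDs still swing all the way to zero (100% flicker), but 4 and 8 LEDs keep the combined output from ever going fully dark, so both metrics drop sharply.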
Have you ever tested various PWM frequencies? 50/60 Hz is very noticeable, but what if the PWM is switching at 1000 Hz? 5 kHz? There is presumably a rate at which it is imperceptible to you?
Apparently Philips Hue uses 500-1000 Hz. I wonder if there are manufacturers that use a much higher rate.
Beyond the Hz, the depth of the modulation matters. I am sensitive to poor PWM implementation, but Hue bulbs luckily don't bother me.
On an old iPhone with basic slow-mo recording capabilities, typical Hue bulbs don't "blink" when the video plays back, but the PWM-dimmed iPhone in the same video recording was blinking/flashing like crazy.
Another example of the PWM details mattering: I can't use any iPhone with OLED (anything from the X to current), but I am able to use a Note9 which has OLED with DC-style PWM.
I was referring to the extent of brightness variation in the flicker -- while many focus on a higher frequency (Hz) to reduce eyestrain, the key factor is how the screen behaves during the off cycle.
Some PWM implementations ramp the brightness up and down gradually (easier on the eyes), while other manufacturers flip the switch on and off harshly (like strobing).
The shorter the screen stays dark between pulses, the shorter the off-pulse duration -- and pulse duration and modulation depth matter more than the raw Hz.
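To put numbers on that, here's a small sketch comparing two hypothetical waveforms at the same frequency and the same average brightness: a hard full-depth square wave versus a ramped, shallow modulation.

```python
import numpy as np

# Two illustrative dimming waveforms over one PWM period, both with
# an average brightness of 0.5. Depth, not frequency, drives the
# difference in flicker here.
t = np.linspace(0, 1, 10_000, endpoint=False)

hard = (t < 0.5).astype(float)  # full-depth on/off square wave

# Ramped + shallow: brightness glides on a triangle between 0.25
# and 0.75 instead of slamming between 0 and 1.
ramped = 0.5 + 0.25 * (2 * np.abs(2 * t - 1) - 1)

for name, wave in (("hard square", hard), ("ramped shallow", ramped)):
    lo, hi = wave.min(), wave.max()
    percent = 100 * (hi - lo) / (hi + lo)
    print(f"{name}: percent flicker {percent:.0f}%")
# hard square -> 100%, ramped shallow -> 50%, at the same frequency
```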
Old dimmers are triac-based phase-cut dimmers, with the potentiometer simply setting the trigger point (the firing angle), not carrying the load current itself. These were in fact very efficient.
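A minimal sketch of why that's efficient, using the standard resistive-load formula for phase-cut dimming (the firing angles here are chosen arbitrarily):

```python
import math

# Phase-cut (triac) dimming into a resistive load: the triac fires at
# angle alpha each half-cycle, and the fraction of full power delivered
# is (pi - alpha + sin(2*alpha)/2) / pi.
def power_fraction(alpha_deg):
    a = math.radians(alpha_deg)
    return (math.pi - a + math.sin(2 * a) / 2) / math.pi

for alpha in (0, 45, 90, 135, 180):
    print(f"firing angle {alpha:3d} deg -> {power_fraction(alpha):.0%} power")
# 0 deg -> 100%, 90 deg -> 50%, 180 deg -> 0%: the pot just moves alpha,
# so almost nothing is dissipated in the dimmer itself.
```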
Personally, I don't care for more energy efficiency if my head is hurting after 30 minutes under that light. I can really see all the flickering when I blink or move my head.
Similarly, I prefer a Class A amplifier if I have the space, but I won't open that can of worms here.