Hector Martin
2024-06-23 07:56:36

Are there any double-blind studies on sensitivity to temporal dithering in LCDs?

There's a whole subcommunity of people that swear that it hurts their eyes, and some of them have asked whether we can do something about it in Asahi. Maybe yes, maybe no. But here's the thing: Their stories invariably don't add up. At all.

Sure, there are people who have found ways to toggle it on and off on some systems (with high-FPS camera proof). And they'll swear it makes a difference. But I've seen nobody do a proper self-blinded test. Instead you get stories that don't make sense, like how OS updates somehow make a difference (unlikely, since dithering is *required* to achieve color depth targets, not something a manufacturer would just change on a whim), or how machines with by all appearances identical displays have wildly different effects on them, or people who think they can control dithering on an external monitor with embedded video processing from their GPU/display settings (you can't) and that it solves their problem.

I've seen people report on eyestrain effects of over a dozen machine/OS version combinations and the resulting matrix is just completely random and uncorrelated with any dimension it would make sense for it to be correlated with.

To me, it all sounds like the audiophile "I can hear ultrasound frequencies" or the "radio waves make me ill" crowd. Sure, there are myriad causes of eye strain, and I have no problem believing that things like crappy low-frequency PWM backlights are bad for some people. But single-bit dithering of pixel values, really? That's a minute brightness difference, and it's usually checkerboarded so the average screen brightness is constant. Eyestrain is real, and there are a lot of people who swear it has to do with dithering... but I've yet to see any proof by scientific standards that the two are related.
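To put a number on how small that signal is, here's a rough sketch (mine, in Python/NumPy; not lifted from any real display controller) of 1-bit spatiotemporal dithering: an 8-bit target rendered on a 6-bit panel by bumping the least significant bit on a checkerboard that inverts every frame. Consecutive frames differ per pixel by a single 6-bit step, and each frame's average brightness is identical.

```python
import numpy as np

def dithered_frame(target_8bit: np.ndarray, frame: int) -> np.ndarray:
    """Render an 8-bit target on a 6-bit panel with 1-bit spatiotemporal
    dithering.  Simplified sketch: only the half-LSB step is handled;
    real controllers cover every fractional step, but the brightness
    delta involved is the same single 6-bit LSB either way."""
    base = target_8bit >> 2                    # truncate to 6 bits (0..63)
    needs_bump = (target_8bit & 0x3) == 2      # remainder of exactly half an LSB

    # Checkerboard that inverts every frame: half of the affected pixels
    # get the extra LSB this frame, the other half get it next frame.
    y, x = np.indices(target_8bit.shape)
    checker = ((x + y + frame) & 1) == 1

    return base + (needs_bump & checker)

# A flat mid-gray field: consecutive frames differ per pixel by at most
# one 6-bit step, and every frame has exactly the same average brightness.
target = np.full((4, 4), 130, dtype=np.uint8)  # 130/4 = 32.5 six-bit units
f0, f1 = dithered_frame(target, 0), dithered_frame(target, 1)
print(f0.mean(), f1.mean())                    # identical frame averages
```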

(Then there are the people claiming that manufacturers are somehow lying about color depth because they use dithering, which is of course nonsense. Every modern audio amp in a smart device is a Class D amp with *1-bit* dithered output. Dithering is a legitimate signal processing technique, and just because an amp's output is 100% dithering doesn't mean it doesn't get to claim 16 to 24 bits of resolution.)
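The audio claim is easy to demonstrate with a toy first-order delta-sigma modulator (again my own sketch, not any particular amplifier's design): the output stream is strictly one bit, yet after a crude low-pass/decimation filter it tracks the input far more finely than one bit.

```python
import numpy as np

def delta_sigma_1bit(signal: np.ndarray) -> np.ndarray:
    """Toy first-order delta-sigma modulator: input in [-1, 1], output
    strictly +/-1 (one bit).  Quantization error is pushed to high
    frequencies, where the output filter (or the speaker) removes it."""
    out = np.empty_like(signal)
    integrator = 0.0
    prev = 0.0
    for i, s in enumerate(signal):
        integrator += s - prev                    # accumulate error vs. last output
        prev = 1.0 if integrator >= 0 else -1.0   # 1-bit quantizer
        out[i] = prev
    return out

# 64x-oversampled 1 kHz tone at half amplitude, 10 ms worth.
fs = 64 * 48_000
t = np.arange(fs // 100) / fs
x = 0.5 * np.sin(2 * np.pi * 1000 * t)

bits = delta_sigma_1bit(x)                        # only ever +1 or -1

# Crude decimation: average each run of 64 one-bit samples and compare
# with the same average of the original signal.
recon = bits.reshape(-1, 64).mean(axis=1)
ref = x.reshape(-1, 64).mean(axis=1)
print(np.max(np.abs(recon - ref)))                # a few percent of full scale at worst
```

A real Class D or DAC design adds higher-order noise shaping and proper filtering to push this well past 16 bits; the point is just that a stream that is 100% dithering legitimately carries far more resolution than its sample depth.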

It would *not* be hard to design a self-administered test for this. Just use a machine where the dithering is known to be controllable, use a high-detail, pre-spatially-dithered image as a test subject (so you can't cheat by looking for banding), and always cycle the screen to off while toggling dithering to ensure no hints due to glitches during the setting change. Then after each cycle you have to say whether dithering is on or off (looking at the screen from normal viewing distance, no getting up close or using a magnifier!). You could easily write a program to self-administer a test like this. I bet most people would do no better than random chance.
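A minimal sketch of such a harness, in Python, assuming some way to toggle dithering programmatically: the set_dithering function below is a placeholder, not a real API, and the DPMS calls via xset are just one way to blank the panel during the change.

```python
import random
import subprocess
import time

TRIALS = 40

def set_dithering(enabled: bool) -> None:
    """Placeholder: wire this up to whatever actually controls dithering
    on the machine under test (panel/GPU specific; there is no standard
    knob for it)."""
    raise NotImplementedError

def screen_power(on: bool) -> None:
    """Force the panel on/off so the setting change happens in the dark
    (DPMS via xset is one option; swap in whatever fits the setup)."""
    subprocess.run(["xset", "dpms", "force", "on" if on else "off"], check=False)

def run_test() -> None:
    # The image left on screen should be high-detail and already spatially
    # dithered, so hunting for banding doesn't help; the subject stays at
    # normal viewing distance, no magnifiers.
    correct = 0
    for trial in range(1, TRIALS + 1):
        state = random.random() < 0.5
        screen_power(False)             # go dark before touching the setting
        set_dithering(state)
        time.sleep(2.0)                 # let any transition glitches pass
        screen_power(True)
        guess = input(f"[{trial}/{TRIALS}] dithering on or off? ")
        if (guess.strip().lower() == "on") == state:
            correct += 1
    # Pure guessing lands near 20/40; roughly 28 or more correct (about
    # p < 0.01, one-sided binomial) is where a claim starts to be credible.
    print(f"{correct}/{TRIALS} correct")

if __name__ == "__main__":
    run_test()
```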