
I don’t understand what’s wrong with using red, green, and blue LEDs at the edge of the monitor. It should be more energy efficient, since phosphors are inherently lossy. It’ll drastically reduce the accuracy of colors rendered under the “white” light the screen gives off when displaying white images, but I don’t think users will care much.


The red, green, and blue LEDs will age at different rates. I remember reading that RGB mixing was used in the early days of LED interior lighting, and that software compensated for the uneven aging while also letting users choose the white balance (a rough sketch of that kind of correction follows the link below).

A paper that discusses these things: http://focs.eng.uci.edu/papers%20for%20GaN/White%20LED/Red,%...
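Conceptually the compensation is simple: boost each channel’s drive inversely to its remaining flux, then rescale the whole triplet so nothing exceeds full drive. A minimal sketch in C, assuming the per-channel lumen-depreciation factors are already known from an aging model (the struct, names, and values are hypothetical, not from the paper):

    #include <stdint.h>

    /* Hypothetical per-channel depreciation factors (fraction of original
     * luminous flux remaining); in practice these would come from an aging
     * model or a lookup table indexed by operating hours and temperature. */
    typedef struct {
        float r, g, b; /* 1.0 = brand new, e.g. 0.8 = 20% of flux lost */
    } led_depreciation_t;

    /* Scale the nominal 8-bit drive levels so the mixed color keeps the
     * chosen white point despite uneven aging. The most-degraded channel
     * cannot be boosted past full drive, so the triplet is normalized to it. */
    void compensate_white(const led_depreciation_t *d,
                          uint8_t nom_r, uint8_t nom_g, uint8_t nom_b,
                          uint8_t *out_r, uint8_t *out_g, uint8_t *out_b)
    {
        /* Find the weakest channel; it sets the brightness ceiling. */
        float weakest = d->r;
        if (d->g < weakest) weakest = d->g;
        if (d->b < weakest) weakest = d->b;

        /* Boost each channel inversely to its remaining flux, scaled by
         * the weakest factor so no channel exceeds its nominal drive.
         * Each channel then emits flux proportional to nominal * weakest,
         * preserving the color ratio. */
        *out_r = (uint8_t)(nom_r * (weakest / d->r));
        *out_g = (uint8_t)(nom_g * (weakest / d->g));
        *out_b = (uint8_t)(nom_b * (weakest / d->b));
    }

The trade-off is visible in the math: the white point is preserved, but overall brightness sags to whatever the weakest channel can still deliver.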


There are deployed lighting systems that internally measure the LED output to compensate for aging. Lumenetix makes one, for example.
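Presumably something like the following, where a per-channel photodiode closes the loop. A sketch only: read_sensor and set_duty are hypothetical hardware hooks, and the proportional gain is arbitrary.

    /* One iteration of a per-channel proportional controller: nudge each
     * duty cycle toward whatever value makes the measured flux match the
     * setpoint, regardless of how far the LED has aged. */
    extern float read_sensor(int channel);       /* measured flux, arbitrary units */
    extern void  set_duty(int channel, float d); /* 0.0 .. 1.0 */

    void closed_loop_step(const float setpoint[3], float duty[3])
    {
        const float kp = 0.05f; /* small gain keeps the loop stable */
        for (int ch = 0; ch < 3; ch++) {
            float error = setpoint[ch] - read_sensor(ch);
            duty[ch] += kp * error;
            if (duty[ch] < 0.0f) duty[ch] = 0.0f;
            if (duty[ch] > 1.0f) duty[ch] = 1.0f;
            set_duty(ch, duty[ch]);
        }
    }

Run periodically, the duty cycles settle wherever measured flux matches the setpoint, so aging is compensated continuously instead of being predicted from a model.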


Edit: I read the paper you linked. It’s pretty old (2002), and it indeed discusses closed-loop control. Perhaps the technology is just a bit too expensive to put in a monitor.

P.S. The paper also suggests PWM-driving an LED at 120 Hz. This is a horrible thing to do; the frequency needs to be much higher, well into the kilohertz range, to avoid giving people headaches.
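For scale: on a typical microcontroller timer the PWM frequency is f_clk / (prescaler * counts_per_period), so going well above 120 Hz usually costs nothing but a smaller prescaler. A sketch assuming a 16 MHz timer clock and 8-bit resolution (both assumptions, not from the paper):

    #include <stdio.h>

    int main(void)
    {
        const double f_clk = 16e6;   /* 16 MHz timer clock (assumed) */
        const double counts = 256.0; /* 8-bit PWM resolution */

        const unsigned prescalers[] = {8, 64, 512};
        for (unsigned i = 0; i < 3; i++) {
            double f_pwm = f_clk / (prescalers[i] * counts);
            printf("prescaler %4u -> %8.1f Hz\n", prescalers[i], f_pwm);
        }
        /* prescaler 512 gives ~122 Hz (the paper's regime);
         * prescaler 8 gives ~7.8 kHz, comfortably above the range
         * where PWM flicker bothers most people. */
        return 0;
    }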



