Screens keep getting faster. Can you even tell? | CES saw the launch of several 360Hz and even 480Hz OLED monitors. Are manufacturers stuck in a questionable spec war, or are we one day going to wonder how we ever put up with ‘only’ 240Hz displays?
I’m sticking with IPS until MicroLED matures enough for me to afford it.
OLED was never designed to be used as a computer monitor, and I don’t want a monitor that only lasts a couple of years.
Researchers just designed a special two-layer structure (thicker than current OLEDs) that doubles the lifespan to 10,000 hours at 50% brightness without degrading.
I’m totally with you on good HDR though. When it works, it’s as night-and-day as 60Hz -> 144Hz felt for me.
Burn-in is a non-issue for regular all-day use. As long as you aren’t displaying a static image at 100% brightness for literally years while actively stopping the screen from running its preventative measures, you’ll be fine.
Can desktop computers do those preventative measures? I haven’t seen any desktop interface for the mitigations Samsung puts on its phones.
Desktops also display static images nearly 100% of the time, unless you change your usage behavior and run everything full screen.
Why would OLED only last two years?
It doesn’t only last two years, but the panel begins to degrade after about a year of illuminating blue, which reduces color accuracy.
OLEDs are also quite bad at color accuracy across their brightness range; typically at lower brightness, their accuracy goes out the window.
This isn’t as bad on smartphones (smartphones also apply additional mitigations such as subpixel rotation), but desktop computers typically display static images for much longer and don’t use these mitigations, AFAIK.
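For what it’s worth, the basic idea behind those mitigations isn’t complicated. Here’s a minimal sketch of what a software-side “pixel shift” could look like: the compositor periodically nudges the whole image by a pixel or two so no subpixel renders the same static content indefinitely. The shift pattern and interval below are made-up illustrative values, not any vendor’s actual implementation.

```python
# Hypothetical pixel-shift burn-in mitigation sketch.
# Every SHIFT_INTERVAL_S seconds the image moves to the next
# offset in a small orbit, spreading wear across neighboring pixels.

SHIFT_PATTERN = [(0, 0), (1, 0), (1, 1), (0, 1)]  # small clockwise orbit
SHIFT_INTERVAL_S = 180  # shift every 3 minutes (illustrative value)

def pixel_shift_offset(elapsed_s: float) -> tuple[int, int]:
    """Return the (dx, dy) offset the compositor should apply right now."""
    step = int(elapsed_s // SHIFT_INTERVAL_S) % len(SHIFT_PATTERN)
    return SHIFT_PATTERN[step]

print(pixel_shift_offset(0))    # -> (0, 0)
print(pixel_shift_offset(600))  # 10 minutes in -> (0, 1)
```

The catch on desktops is exactly what was said above: the compositor would need to own this, and I’m not aware of mainstream desktop environments exposing anything like it.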