Serious question, why not? Do they produce some harmful flicker or something?
Burn-in with static UI elements - the idea is that regularly showing very different images reduces the risk.
Static UI elements you say?
Like games have?
Yea, but people don’t play the same game non-stop for 8 hours… actually, never mind.
Some screens have pixel shifting to mitigate this (rough sketch below). Also, burn-in doesn’t happen as quickly as you might think.
Now, you wouldn’t want to go using an OLED for a billboard.
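For anyone curious, the pixel-shift idea is simple enough to sketch in a few lines of Python. This is purely illustrative - the orbit pattern, the 3-minute interval, and the function names are made up for the example, not any vendor’s actual firmware. The point is just that static content gets nudged by a pixel or two on a timer, so no subpixel carries the exact same load for hours on end:

    import time

    # Tiny orbit of offsets (in pixels); real panels use similarly small
    # shifts so the movement goes unnoticed.
    ORBIT = [(0, 0), (1, 0), (1, 1), (0, 1), (-1, 1),
             (-1, 0), (-1, -1), (0, -1), (1, -1)]
    SHIFT_INTERVAL_S = 180  # assumed: move to the next offset every 3 minutes

    def current_offset(elapsed_s: float) -> tuple[int, int]:
        """Return the (dx, dy) nudge to apply to static content right now."""
        step = int(elapsed_s // SHIFT_INTERVAL_S) % len(ORBIT)
        return ORBIT[step]

    if __name__ == "__main__":
        start = time.monotonic()
        # A real compositor would apply this offset to the HUD/OSD layer
        # every frame; here we just print the current shift.
        dx, dy = current_offset(time.monotonic() - start)
        print(f"shift static UI by ({dx}, {dy}) px")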
So the same issue we’ve always had with CRT, LCD, and plasma.
Actually a different issue. With CRTs and plasma, burn-in would actually burn an image into the phosphors of the screen.
OLEDs will slowly burn out with continued use, usually starting with the blue subpixels, which gives a similar appearance but isn’t exactly the same (for example, the image persistence only shows while the screen is displaying something, because the worn pixels are just dimmer).
LCDs don’t have either issue, but sometimes crystals can get stuck in a specific orientation, leading to (usually temporary) image retention. It gets worse in very cold weather. This is rare, though.
Doesn’t QD OLED fix this? Because the actual OLED part of it is just a blue light source, and the quantum-dot layer produces the other colors.
I don’t have personal experience with it, so I don’t want to talk out of my ass.