Steam Deck OLED sees burn-in after 1,500-hour stress test — reducing brightness recommended to avoid damage

Steam Deck OLED
(Image credit: Valve)

The Steam Deck OLED showed minor burn-in after a 1,500-hour stress test conducted by YouTuber Wulff Den. OLED displays are inherently susceptible to burn-in, and the Steam Deck OLED is no exception. While 1,500 hours (roughly 63 days) is a long time, the OLED model of the Deck developed burn-in much more quickly than the Switch OLED, which Wulff Den's earlier test showed only starting to burn in at around 3,500 hours.

Image retention, known colloquially as burn-in, has been a critical issue with OLED displays since their inception. After all, static images and static portions of moving content (such as UI elements and TV channel logos) are common. The latest OLED panels, however, come with newer designs that are supposed to mitigate the risk of burn-in.

Wulff Den decided to test the Steam Deck OLED and compare it to a Switch OLED that he had previously tested. The comparison wasn't made purely because they're similar products: at the sub-pixel level, the two displays are almost identical and may even come from the same manufacturer.

For the test, an in-game screenshot of The Legend of Zelda: Breath of the Wild was put on the screen (the same image used for the Switch OLED test), with color test bars added at the top since there was extra room on the Deck's larger display. Additionally, he set the brightness to 100%, and since the screenshot and color bars were in SDR, that works out to roughly 600 nits, though the Deck can reach 1,000 nits in HDR.

The test was stopped at the 1,500-hour mark, and there were faint but certainly noticeable signs of burn-in. The black-and-white bar pattern was the most visible due to its high contrast, which isn't surprising. At the sub-pixel level, the colors quickest to burn in were red and blue, leaving green as the most resilient. Size is an important factor in burn-in risk, and since the red sub-pixels are the smallest on the Deck's OLED display, the fact that they burned in isn't surprising. However, the blue sub-pixels are the largest, yet they burned in the most, according to Wulff Den.

As for the UI elements in the Zelda screenshot, the only ones that seemed to stick were the hearts in the top-left corner. Since the hearts are pure red, it makes sense that they would be the one part of the UI to cause burn-in. That said, this burn-in was pretty minor and much harder to discern than the burn-in caused by the color bars.

It's unclear why the Steam Deck OLED developed burn-in so much more quickly than the Switch OLED, but Wulff Den speculates that brightness could be the cause. The Switch OLED caps out at 400 nits, and while the OLED Deck is technically only 50% brighter, higher brightness demands disproportionately more power from the panel. Wulff Den also cites a test conducted by fellow YouTuber The Phawx, who ran HDR content at 1,000 nits and saw clear burn-in after just 750 hours: only 67% brighter than the Deck's SDR output, yet worse burn-in in half the time.
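
To make that comparison concrete, here's a minimal back-of-the-envelope sketch in Python; the nit and hour figures are the ones reported above, and the percentages are simply derived from them rather than being additional measurements.

```python
# Back-of-the-envelope arithmetic for the brightness vs. burn-in comparison.
# The nit and hour figures are the ones reported in the tests above; the
# percentages and ratios below are derived from them, not extra data.

def pct_brighter(nits: float, baseline_nits: float) -> float:
    """Relative brightness increase over a baseline, in percent."""
    return (nits - baseline_nits) / baseline_nits * 100

# (peak brightness in nits, approximate hours until visible burn-in)
switch_sdr = (400, 3500)   # Switch OLED, SDR (Wulff Den's earlier test)
deck_sdr = (600, 1500)     # Steam Deck OLED, SDR (this test)
deck_hdr = (1000, 750)     # Steam Deck OLED, HDR (The Phawx's test)

print(f"Deck SDR vs. Switch: {pct_brighter(deck_sdr[0], switch_sdr[0]):.0f}% brighter, "
      f"burn-in in {switch_sdr[1] / deck_sdr[1]:.1f}x less time")
print(f"Deck HDR vs. Deck SDR: {pct_brighter(deck_hdr[0], deck_sdr[0]):.0f}% brighter, "
      f"burn-in in {deck_sdr[1] / deck_hdr[1]:.1f}x less time")
```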

Wulff Den reached out to Valve for comment on whether the OLED Deck employed any anti-burn-in technologies, such as adjusting the UI incrementally over time, and Valve said it did not. However, Valve said it wasn't aware of any burn-in issues experienced by users "under normal use" and that the one-year warranty would cover burn-in.
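
For context, the mitigation Wulff Den asked about usually amounts to nudging the whole UI layer by a pixel or two on a timer so that no sub-pixel sits under the same static content indefinitely. The sketch below is a generic, hypothetical illustration of that idea, not Valve's or any vendor's actual implementation (as noted above, the Deck doesn't do this).

```python
# Generic illustration of incremental UI shifting as a burn-in mitigation.
# Hypothetical sketch only; per Valve, the Steam Deck OLED does not do this.

# A small ring of (x, y) pixel offsets the UI layer cycles through.
OFFSETS = [(0, 0), (1, 0), (1, 1), (0, 1), (-1, 1), (-1, 0), (-1, -1), (0, -1)]

def ui_offset(elapsed_seconds: float, period_seconds: float = 300.0) -> tuple:
    """Return the (x, y) offset to apply to UI rendering at this moment,
    advancing one step around the ring every `period_seconds`."""
    step = int(elapsed_seconds // period_seconds) % len(OFFSETS)
    return OFFSETS[step]

# Example: ~25 minutes in, the UI would be drawn shifted 1 px to the left.
print(ui_offset(elapsed_seconds=25 * 60))   # -> (-1, 0)
```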

Although burn-in can be concerning, Wulff Den concludes that the risk can easily be mitigated by reducing brightness and disabling HDR, saying users should "be careful blasting your brightness, this thing can get bright. 600 nits is no joke, 1,000 nits in HDR is just ridiculous." Playing on an external monitor is another option, especially for users who are sinking hundreds of hours into the same game.

Matthew Connatser

Matthew Connatser is a freelancing writer for Tom's Hardware US. He writes articles about CPUs, GPUs, SSDs, and computers in general.

  • Alvar "Miles" Udell
    Or just don't worry about it.

    In January, an article was published stating burn-in was seen at 750 hours. Given that both cases are far shorter than the OLED Switch's time, I would definitely worry about the expected lifespan of a $95-$150 replacement part.

    https://www.tomsguide.com/gaming/steam-deck-oled-is-vulnerable-to-burn-in-but-only-if-you-use-it-wrong
  • newtechldtech
    OLED is bad... not only does it have the burn-in problem, it is not bright enough, and it displays fewer colours when you change the brightness. The future is Micro LED. That's why Sony ditched OLED altogether for their new TV lineup.
  • redgarl
    There is a reason why Nintendo didn't plan to release the Switch 2 with an OLED display.
  • anonymousdude
    newtechldtech said:
    OLED is bad... not only does it have the burn-in problem, it is not bright enough, and it displays fewer colours when you change the brightness. The future is Micro LED. That's why Sony ditched OLED altogether for their new TV lineup.

    You mean their Crystal LED TV wall that they showed at CES? That's Micro LED, which is still far from being mainstream.

    Rumor has it Sony is pivoting back to mini LED for their next flagship TV. They're probably confident enough that they can deal with blooming, which honestly means they just upped the number of dimming zones significantly and figured out the accompanying processing. They're also probably tired of buying panels from LG and Samsung because it likely isn't good for their margins. It probably has much less to do with characteristics like brightness, colors, contrast, response times, etc., which are still just tradeoffs between OLED and mini LED.
  • Armbrust11
    anonymousdude said:
    You mean their Crystal LED TV wall that they showed at CES? That's Micro LED, which is still far from being mainstream.

    Rumor has it Sony is pivoting back to mini LED for their next flagship TV. They're probably confident enough that they can deal with blooming, which honestly means they just upped the number of dimming zones significantly and figured out the accompanying processing. They're also probably tired of buying panels from LG and Samsung because it likely isn't good for their margins. It probably has much less to do with characteristics like brightness, colors, contrast, response times, etc., which are still just tradeoffs between OLED and mini LED.
    Micro LED is the future. But probably 10 years until mainstream and another 10 to become ubiquitous. For devices like the Switch, people are likely to want to keep using them for decades, unless Nintendo commits to backwards compatibility. For other devices, OLED's consumable nature is less of an issue, but still a drawback that some people underestimate and others overestimate. Personally, I'm in the latter camp.
  • dan42078
    redgarl said:
    There is a reason why Nintendo didn't plan to release the Switch 2 with an OLED display.
    Two reasons actually, and neither is burn-in related. Cost is one, and mid-generation upgrade potential is two. Burn-in is a non-issue on the Switch OLED that currently exists, and there's no reason it would become an issue on the next Switch.
  • purposelycryptic
    newtechldtech said:
    OLED is bad... not only does it have the burn-in problem, it is not bright enough, and it displays fewer colours when you change the brightness. The future is Micro LED. That's why Sony ditched OLED altogether for their new TV lineup.
    OLED is great - just not for all use cases.

    I certainly wouldn't get an OLED monitor for productivity or design work, for example; too many static UI elements, too much pure white for too long, etc. It's just a bad match.

    With gaming, it depends - most newer games have had their UI designed with OLED in mind, so static elements are frequently set to around 50% transparency, which, combined with the built-in burn-in countermeasures in the display, is usually enough to make it a non-issue. But if you are playing an older game, like, say, Baldur's Gate, where the screen is just chock-full of static UI elements, it's a bad fit.

    In general, I, personally, just wouldn't buy an OLED monitor - I just don't like having to worry about burn-in on any level, nor having to adapt my usage to my display.

    I had a final-generation Plasma TV that did a lot of monitor duty in the living room, and the fear of burn-in was always in the back of my mind. And it did end up with significant burn-in over its eight years of service (it was on at least 10 hours a day, much of it in Windows, as I worked from home), but it was still only visible on pure color calibration images.

    Still, I switched to a 75" MiniLED TV when I finally upgraded, rather than an OLED, simply because I knew there was zero chance of burn-in, and its full-screen max brightness is seriously impressive even in a bright living room. Newer OLEDs can get bright, sure, but only for 10% of the screen, for a short period. Great for HDR content, but they can't compete with MiniLED if you need your whole screen to be pretty bright because of the lighting in the room.

    I do have a 48" LG C1 OLED in my study as one of my monitors, but it is used strictly for gaming. It used to be my bedroom TV, but I moved into a bigger house with a bigger bedroom, and it was just too small for the room size, so I moved one of my 65" MiniLED TVs in there instead.

    For watching TV, movies, or playing current-gen console games in a darker room, though, an OLED screen still blows MiniLED screens out of the water. I love the MiniLED sets I have and wouldn't trade them for OLEDs, but used as a pure "TV" TV, as long as the room isn't too bright, OLEDs really are great.

    To address your point on MicroLED screens - they are absolutely wonderful... if you can afford to pay the price of a brand-new car for an early-adopter luxury TV that won't look nearly as good as later consumer models that will cost a fraction of the price.

    MicroLED is the future, for sure. But that future isn't here yet, and won't fully arrive for a decent while yet.

    Comparing MicroLED to OLED today is like comparing current EV battery technology to one of the next-generation technologies every manufacturer is frantically researching to get out the door at a price people will pay. One of those technologies is actually available, and you can simply go to a store and walk out with a unit that uses it. The other is still very much in the development stage.

    Of course future technology will be better - that's the whole point of developing it in the first place. But it also isn't an option unless you have unreasonably large amounts of money to essentially throw away. Saying current technology can't measure up to future technology is like saying water is wet; we all know this, but it's still a completely pointless thing to say, because it won't make that future technology arrive any sooner.
  • newtechldtech
    Armbrust11 said:
    Micro LED is the future. But probably 10 years until mainstream and another 10 to become ubiquitous. For devices like the Switch, people are likely to want to keep using them for decades, unless Nintendo commits to backwards compatibility. For other devices, OLED's consumable nature is less of an issue, but still a drawback that some people underestimate and others overestimate. Personally, I'm in the latter camp.

    Not 10 years, it's expected to be 75% cheaper starting from 2027...

    https://www.tomsguide.com/news/microled-tvs-are-finally-dropping-in-price-heres-when-you-might-be-able-to-get-one
  • Armbrust11
    newtechldtech said:
    Not 10 years, it's expected to be 75% cheaper starting from 2027...

    https://www.tomsguide.com/news/microled-tvs-are-finally-dropping-in-price-heres-when-you-might-be-able-to-get-one
    75% cheaper than $100K is still $25,000. It will have to get another 75% cheaper to be accessible outside professional displays, another 75% cheaper to become affordable enough for a splurge purchase, and 75% cheaper still to render all other display technology effectively obsolete (the resulting price points are sketched below).

    2018 (when it was first unveiled) to 2027 is almost 10 years, so I'd say my rough ballpark estimate was pretty on point, assuming that pace is maintained. I just hope Micro LED breaks out of the professional market, since some technology never becomes affordable enough for mainstream users.
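
    A quick sketch of how those successive 75% cuts would compound, in Python; the $100K starting point and the 75% step size are just the figures from this thread, so the resulting tiers are illustrative rather than forecasts.

    ```python
    # Compound the successive 75% price cuts described above.
    # Starting price and step size come from this thread; purely illustrative.

    price = 100_000  # today's rough price for a 100-inch Micro LED set, in USD
    tiers = [
        "after the projected 2027 cut",
        "accessible outside professional displays",
        "affordable as a splurge purchase",
        "cheap enough to make other display tech obsolete",
    ]

    for tier in tiers:
        price *= 0.25  # each step knocks 75% off the previous price
        print(f"{tier}: ~${price:,.0f}")
    ```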
  • newtechldtech
    Armbrust11 said:
    75% cheaper than 100k is still $25,000. It will have to get another 75% cheaper to be accessible outside professional displays and another 75% cheaper to become affordable enough for a splurge purchase. And 75% cheaper still to render all other display technology effectively obsolete.

    2018 (first unveiled) to 2027 is almost 10 years so I'd say my rough ballpark estimate was pretty on point, assuming that pace is maintained. I just hope MicroLED breaks out of the professional market since some technology never becomes affordable enough for mainstream users.
    The $100K is for a 100-inch Micro LED... even by today's standards, a 100-inch normal OLED TV costs around $25K ($20K after discount).

    For example, the LG 97-inch OLED TV here:

    https://www.lg.com/us/tvs/lg-oled97g2pua-oled-tv

    So a $25K 100-inch Micro LED TV would be the same price as today's OLED and far superior.

    If you read carefully, a 10-inch Micro LED panel today is $5K. At 75% cheaper, that would be $1.25K per 10-inch module, and for a 30-inch gaming Micro LED monitor you would need to stack six of them (two rows by three columns), for a total cost of around $7.5K.
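
    As a sanity check on that arithmetic, here is a tiny Python sketch; the $5K module price, the 75% reduction, and the six-module (2x3) layout are the figures quoted above, so treat the result as illustrative math rather than real pricing.

    ```python
    # Illustrative math only: module price, cut, and layout are the figures
    # quoted above, not actual Micro LED pricing data.

    module_price_today = 5_000    # quoted price of a 10-inch Micro LED module, USD
    price_cut = 0.75              # projected 75% price reduction
    modules_needed = 2 * 3        # 2 rows x 3 columns of 10-inch modules

    module_price_future = module_price_today * (1 - price_cut)
    monitor_cost = module_price_future * modules_needed

    print(f"Module price after the cut: ${module_price_future:,.0f}")  # $1,250
    print(f"Rough 30-inch monitor cost: ${monitor_cost:,.0f}")         # $7,500
    ```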