Samsung's 500Hz Odyssey G6 OLED joins stacked 2025 gaming monitor lineup
Samsung's new Odyssey G6 OLED is purportedly the world's first 500Hz OLED monitor
Samsung has announced several new monitors ahead of CES 2025, spanning gaming- and entertainment-focused models. The highlights of the lineup are two new Odyssey G-series gaming monitors, one of which purportedly features the world's first OLED panel with a 500 Hz refresh rate.
The Odyssey G6 and G8 are Samsung's new high-performance gaming monitors, with the former featuring a blisteringly fast 500 Hz refresh rate. The G8 variant, the G81SF, is a 27-inch OLED monitor equipped with a 4K panel running at 240 Hz; at that size, Samsung quotes a pixel density of 165 PPI. The G6 trades resolution for speed, dropping to 1440p but more than doubling the refresh rate of its 4K counterpart.
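Pixel density follows directly from resolution and diagonal size, so Samsung's figure is easy to sanity-check. A minimal sketch in Python (the raw math lands a touch below the quoted 165 PPI, and the G6's panel size is an assumption here, since the announcement doesn't state it):

    import math

    def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
        # Pixels per inch: diagonal pixel count divided by diagonal size
        return math.hypot(width_px, height_px) / diagonal_in

    print(round(ppi(3840, 2160, 27)))  # 163 -- close to Samsung's quoted 165 PPI
    print(round(ppi(2560, 1440, 27)))  # 109, assuming the G6 is also 27 inches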
It's worth mentioning that 500 Hz monitors, such as Alienware's AW2524HF, are already on the market. However, Samsung's G6 is purportedly the world's first to reach that speed with an OLED panel.
Both monitors offer near-instantaneous response times of 0.03 ms thanks to OLED technology, and they support Nvidia's and AMD's adaptive refresh rate technologies (G-Sync Compatible and FreeSync Premium Pro, respectively). VESA DisplayHDR True Black 400 is also supported, giving the new G6 and G8 models HDR capabilities in games and video content that support HDR.
At CES, Samsung will also showcase a 3D-focused display, the Odyssey 3D G90XF. As the name suggests, this panel targets 3D gaming and entertainment, but it does not require glasses: the Odyssey 3D uses a lenticular lens attached to the front panel to achieve the 3D effect. Samsung has also equipped this particular Odyssey model with AI processing to convert 2D video into 3D.
Hopping on the AI train, Samsung is introducing a new AI-focused monitor, the Smart Monitor M9, featuring "industry-first" AI features that purportedly enhance entertainment and interactivity through smart AI search functions and adaptation.
It is a 32" 4K OLED monitor with VESA DisplayHDR True Black 400 support, just like its Odyssey counterparts. The monitor features two AI functions: AI Picture Optimizer and AI Upscaling Pro. AI Picture Optimizer purportedly analyses input signals and optimizes the display's settings to optimize the visual experience for whatever content is being displayed. This includes games, productivity applications, and videos.
AI Upscaling Pro upscales lower-resolution content to 4K. Essentially, the feature is similar to Nvidia DLSS, but it runs on the monitor itself and works with any content being displayed. Samsung says it uses a neural network to upgrade lower-resolution content to 4K quality.
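Samsung hasn't published details of the network behind AI Upscaling Pro, but the basic operation, resampling a lower-resolution frame onto the panel's native 4K grid, can be sketched with a plain bilinear resize. The learned upscaler's job is to reconstruct the detail that naive interpolation smears out; Pillow is used below purely as an illustrative baseline, not as Samsung's implementation:

    from PIL import Image  # Pillow, standing in as a baseline (non-neural) scaler

    def naive_upscale_to_4k(frame):
        # Resample onto the panel's native 3840x2160 grid; bilinear filtering
        # averages neighbors, whereas a neural upscaler predicts missing detail
        return frame.resize((3840, 2160), Image.BILINEAR)

    lowres = Image.new("RGB", (2560, 1440))  # a stand-in 1440p frame
    print(naive_upscale_to_4k(lowres).size)  # (3840, 2160) -- a 1.5x scale per axis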
Rounding out the lineup is the ViewFinity S8 S80UD, a productivity-focused monitor with an unorthodox (but large) 37-inch screen. Samsung did not announce the panel type, but it likely uses an IPS panel, given the absence of any mention of OLED technology.
It features a 4K 16:9 panel with 99% sRGB color coverage. A built-in KVM switch lets users operate several systems through the monitor without swapping peripherals, and a 90W USB Type-C port charges supported devices.
Aaron Klotz is a contributing writer for Tom's Hardware, covering news related to computer hardware such as CPUs and graphics cards.
A Stoner
It really is unbelievable, the rate of change in monitors these days.
Refresh rates are great up to a point, and I am pretty certain that we are well beyond the point of not only diminishing returns, but beyond the point of any returns if we are past 300 Hz.
I am much more interested in varying aspect ratios and curves, which will fill far more people's needs than the eventual 1200 Hz refresh monitors that have to falsify 90% of the frames they produce.
I already burned $3,000 on monitors recently, and about the only thing that would cause me to replace my monitor setup anytime soon would be a 128:27, 75-inch, 800R monitor.
UnforcedERROR
A Stoner said: "Refresh rates are great up to a point, and I am pretty certain that we are well beyond the point of not only diminishing returns, but beyond the point of any returns if we are past 300 Hz."
It's less about the perception and more about the input lag once you get to a certain point. The returns are beyond minimal, but running something at 8,000 Hz polling is only really beneficial with high refresh rates.
Actually, my concern is whether these 500 Hz panels will use DP 2.0 with UHBR20 certification. DSC sucks if you tab out of your games a lot, and it's basically required at these speeds (and with anything Nvidia at the moment).
DougMcC
A Stoner said: "Refresh rates are great up to a point, and I am pretty certain that we are well beyond the point of not only diminishing returns, but beyond the point of any returns if we are past 300 Hz."
There are pretty credible sources for there being value in responses up to about 2 kHz. My guess is that monitor refresh rates will top out at 2,400 Hz, because that will sync with all the interesting lower multiples.
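For what it's worth, 2,400 does divide evenly by nearly every common content frame rate, which is presumably the appeal of that figure. A quick check:

    # Which common frame rates map to a whole number of refresh cycles at 2400 Hz?
    for fps in (24, 25, 30, 48, 50, 60, 120, 144, 240):
        even = "even" if 2400 % fps == 0 else "uneven"
        print(f"{fps:>3} fps -> {2400 / fps:g} cycles per frame ({even})")
    # Everything here divides evenly except 144 fps (2400 / 144 = 16.67)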
thestryker
The high refresh rates are a pretty big deal, and they should keep getting higher for a while. This is where frame generation technology should really shine.
The real issue is the connectivity hoops required to drive those refresh rates. UHBR20 is mandatory for 1440p/500Hz native (same as 4K/240), with or without HDR.
I'm curious about the smart monitor mentioned here, because it honestly sounds like the same scaler technology used in Samsung's smart TVs. I would love for a reviewer to test a monitor/TV with a good scaler to see whether native 1440p upscaled to 4K looks as good as, or better than, FSR/DLSS/XeSS.
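The arithmetic behind that UHBR20 requirement is easy to sanity-check. Ignoring blanking intervals, which push the real figures somewhat higher, uncompressed 10-bit RGB works out as follows (the effective link rates below are approximate):

    def raw_gbps(width, height, hz, bits_per_channel=10):
        # Uncompressed RGB data rate in Gbit/s, excluding blanking overhead
        return width * height * hz * bits_per_channel * 3 / 1e9

    UHBR13_5 = 52.2  # approx. payload of 4 lanes x 13.5 Gbps after 128b/132b coding
    UHBR20 = 77.4    # approx. payload of 4 lanes x 20 Gbps

    print(raw_gbps(2560, 1440, 500))  # ~55.3 -- past UHBR13.5 even before blanking
    print(raw_gbps(3840, 2160, 240))  # ~59.7 -- likewise needs UHBR20 (or DSC)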
A Stoner
DougMcC said: "There are pretty credible sources for there being value in responses up to about 2 kHz. My guess is that monitor refresh rates will top out at 2,400 Hz, because that will sync with all the interesting lower multiples."
I'd be interested in study information on this. I am not even slightly aware of any method of testing it; I would assume it would involve direct eye-motion sensing or some kind of direct brain scanning. No human moves fast enough or consistently enough that you can measure hand-eye response down to 0.0005 seconds.
I have my doubts that any human could really pick up the difference between getting one frame every 0.002 seconds versus every 0.001 seconds. I do understand that the gap between receiving information and executing on that information is real, but I just do not think reactions happen consistently enough that you could measure a human's response time accurately enough to even tell the difference. I can be persuaded otherwise.
thestryker
A Stoner said: "I'd be interested in study information on this. I am not even slightly aware of any method of testing it. ..."
I don't believe it's about the human side of the equation, but rather about overcoming technological issues. This is an old article, but it's the first one I remember discussing very high refresh rates. It might be a bit out of date by now, but it was easy for me to find: https://blurbusters.com/blur-busters-law-amazing-journey-to-future-1000hz-displays-with-blurfree-sample-and-hold/
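The Blur Busters argument linked above reduces to simple arithmetic: on a sample-and-hold display, perceived blur for eye-tracked motion is roughly the distance an object travels during one frame, so blur keeps shrinking as the native refresh (and frame) rate climbs. A rough illustration:

    def sample_and_hold_blur_px(speed_px_per_s, refresh_hz):
        # Approximate eye-tracked motion blur: distance traveled in one frame
        return speed_px_per_s / refresh_hz

    for hz in (60, 240, 500, 1000):
        print(f"{hz:>4} Hz -> ~{sample_and_hold_blur_px(1000, hz):.1f} px of blur")
    # 60 Hz: ~16.7 px; 240 Hz: ~4.2 px; 500 Hz: ~2.0 px; 1000 Hz: ~1.0 px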
JRStern
I think that 60 Hz is already beyond human perception, but there are secondary effects of transition that can affect the quality of the image, except these may get worse, not better, with speed: you end up in transition mode more than viewing mode.
And if you're playing realtime games and are crazy about millisecond latency, well, there you go: just maybe a couple of milliseconds faster and you can beat the competition, human or robot. It's highly unlikely, but it is possible.
So a screen capable of 240 Hz may be a better screen even if you run it at 60 Hz, etc.
DougMcC
A Stoner said: "I'd be interested in study information on this. ..."
https://www.nature.com/articles/srep07861
https://en.wikipedia.org/wiki/Flicker_fusion_threshold#:~:text=In%20some%20cases%2C%20it%20is,the%20%22phantom%20array%22%20effect.
TJ Hooker
DougMcC said: "https://www.nature.com/articles/srep07861 ..."
I'm not sure your links are super relevant. They are more related to display brightness modulation, and how fast it has to be before you no longer see flicker and instead perceive a fixed brightness/shade of gray. They also cover display technologies like stereoscopic 3D, where the display rapidly switches between two different images and your brain has to composite them into one. Neither of these is really applicable to normal LCD/OLED displays (although they may be relevant if you're using backlight strobing/BFI).
But for typical media, as per your first source: "Traditional TVs show a sequence of images, each of which looks almost like the one just before it and each of these images has a spatial distribution of light intensities that resembles the natural world. The existing measurements of a relatively low critical flicker fusion rate are appropriate for these displays."
The "relatively low critical flicker fusion rate" being referred to is described in the article as 50-90 Hz.