Our HDR benchmarking uses Portrait Displays’ Calman software. To learn about our HDR testing, see our breakdown of how we test PC monitors.
Exceptional HDR doesn't usually come cheap, and only a full-array local-dimming monitor can truly maximize the standard's potential. Though I've tested screens with more dimming zones than the U27M90's 96, it holds its own against some very expensive displays.
HDR Brightness and Contrast
Sony claims DisplayHDR 600 compliance for the U27M90, but I measured a full-field white pattern at over 850 nits. Only the X27 is brighter, and it costs more than double. To measure the black level, I had to display a small info bug to activate a dimming zone near the light meter; fully black zones shut off their backlight LEDs entirely. Only the Acer's 384 zones rendered more contrast than the Sony, and no other $900 4K/HDR monitor competes with the U27M90. As far as HDR goes, this monitor delivers a lot for the money.
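For context, a static contrast ratio is simply the measured white luminance divided by the measured black luminance, which is why an active dimming zone matters: a zone that's fully off reads as zero. Here's a minimal sketch of the arithmetic in Python; the 0.05-nit black level is a hypothetical illustration, not this review's exact measurement.

```python
# Static contrast from two meter readings (values are hypothetical,
# not this review's exact measurements). On a full-array local-dimming
# panel, a fully black zone turns its LED off, so the black reading
# must come from a zone kept active -- hence the on-screen info bug.

def contrast_ratio(white_nits: float, black_nits: float) -> float:
    """Return the white:black static contrast ratio."""
    if black_nits <= 0:
        return float("inf")  # zone fully off: effectively infinite contrast
    return white_nits / black_nits

# ~850-nit full-field white with an assumed 0.05-nit active-zone black:
print(f"{contrast_ratio(850.0, 0.05):,.0f}:1")  # 17,000:1
```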
Grayscale, EOTF and Color
The U27M90 locks out its image controls in HDR mode, like most monitors of this type. That's not a problem, because grayscale tracking is near-perfect, as is the EOTF tracking. There's a blue tint barely visible in the brightest highlights, but you'll be hard-pressed to see it. The EOTF transitions to tone mapping a little early, but that too is a minor error. If you connect a PS5, it will use a dynamic tone-mapping feature that enhances HDR10 content further with more highlight detail. This won't work with PC games, but it's available in many PS5 titles.
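For readers curious what "EOTF tracking" is measured against: HDR10 uses the SMPTE ST 2084 (PQ) curve, which maps the encoded signal directly to an absolute luminance target. The Python sketch below implements the published reference formula; a display tracks well when its measured output follows these targets until it has to tone-map near its peak brightness.

```python
# SMPTE ST 2084 (PQ) EOTF: maps a 0-1 encoded HDR10 signal to absolute
# luminance in nits. "EOTF tracking" compares a display's measured output
# against this curve; tone mapping shows up as a roll-off below the curve
# as the signal approaches the panel's peak brightness.

M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_eotf(signal: float) -> float:
    """Luminance in cd/m^2 (nits) for an encoded PQ signal in [0, 1]."""
    e = signal ** (1 / M2)
    return 10000 * (max(e - C1, 0) / (C2 - C3 * e)) ** (1 / M1)

# A display that transitions to tone mapping early starts falling below
# these targets sooner than it needs to. Note 75% signal is ~1000 nits:
for pct in (25, 50, 75):
    print(f"{pct}% signal -> {pq_eotf(pct / 100):7.1f} nits target")
```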
In the gamut test, the U27M90 tracked DCI-P3 with a little over-saturation. Only green comes up short, but all colors are on or close to their hue targets and saturation tracks linearly, meaning that detail is clear at all brightness levels and in all colors. Rec.2020 tracks well until the display runs out of color. At red and green levels over 80%, hue is altered to help punch up the most saturated shades. This is excellent performance.
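To illustrate what "saturation tracks linearly" means: gamut sweeps place measurement targets at even steps between the white point and each primary, and a well-behaved display lands on each step rather than bunching up near full saturation. The sketch below generates such targets in CIE xy for a DCI-P3 red sweep. It's a simplified approximation for illustration only: real test patterns are generated in RGB, and the 20% step size is an assumption, not necessarily the exact sweep used in this review.

```python
# Saturation-sweep targets for a DCI-P3 (D65) red sweep, as a simplified
# sketch. Interpolating chromaticity from the white point toward the
# primary shows what "linear saturation tracking" means: each 20% step
# should land evenly along that line.

D65 = (0.3127, 0.3290)      # D65 white point chromaticity (CIE xy)
P3_RED = (0.6800, 0.3200)   # DCI-P3 red primary (CIE xy)

def sweep_targets(white, primary, steps=5):
    """Evenly spaced xy targets from white toward a primary."""
    return [
        (
            white[0] + (primary[0] - white[0]) * i / steps,
            white[1] + (primary[1] - white[1]) * i / steps,
        )
        for i in range(1, steps + 1)
    ]

for pct, (x, y) in zip(range(20, 101, 20), sweep_targets(D65, P3_RED)):
    print(f"{pct:3d}% red -> x={x:.4f}, y={y:.4f}")
```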
Christian Eberle is a Contributing Editor for Tom's Hardware US. He's a veteran reviewer of A/V equipment, specializing in monitors. Christian began his obsession with tech when he built his first PC in 1991, a 286 running DOS 3.0 at a blazing 12 MHz. In 2006, he undertook training from the Imaging Science Foundation in video calibration and testing and thus started a passion for precise imaging that persists to this day. He is also a professional musician with a degree from the New England Conservatory as a classical bassoonist, which he used to good effect as a performer with the West Point Army Band from 1987 to 2013. He enjoys watching movies and listening to high-end audio in his custom-built home theater and can be seen riding trails near his home on a race-ready ICE VTX recumbent trike. Christian enjoys the endless summer in Florida where he lives with his wife and Chihuahua and plays with orchestras around the state.
waltc3 Here's another one that's fairly unimpressive with HDR nits performance. My 4K Philips provides ~700 nits SDR--certified 1000 nits HDR--and actually supports three separate HDR modes. My last BenQ 4K provided ~300 nits SDR and about 320 nits HDR (it wasn't certified, just as this Sony isn't certified), and the difference is night and day. Amazing when I consider I paid less for the Philips than this Sony is retailing for, and the Philips is a much bigger monitor...!
I also had several Sony CRTs...great monitors--I remember my last, a 20" "flat-screen" Trinitron that supported my Voodoo3's 1600x1200 res ROOB....;) As an aside, the ATi Fury I bought at the time to test (the original ATi Fury, not AMD's) would not do 1600x1200 stock! I had to call ATi and ask them about it, and one of the driver programmers I spoke with (in those days you could dial up practically anyone and actually talk to them!) asked me why I wanted to run at 1600x1200...;) I had to actually add the simple instructions into their driver structure at the time to enable 1600x1200--'cause my Trinitron supported it and I wanted to use it!...;)
Sony made great monitors in those days--they were good enough for me and x86 in those years. The Trinitron brand is well known even today, as you mentioned; originally, it was the Trinitron TV brand. I'm sure this monitor is a good one, I'm just not enamored of the specs. Those high nits make all the difference in the display, imo.
anonymousdude Makaveli said: "Do people find 27 inch and 4K usable? I tried it and found 32 inch to be much better."
For gaming it's fine. Productivity, not so much.
Soul_keeper Makaveli said: "Do people find 27 inch and 4K usable? I tried it and found 32 inch to be much better."
It really is personal preference. I've been using a Samsung U24E590D 23.6" 4K display for a few years now.
Personally, I wanted maximum pixel density, good power usage, and nothing too bulky.
Now that my eyes aren't as good as they once were, though, I might get a 27" 4K in the future, maybe something like the one reviewed here.
27" could be the "sweet spot" for 4K. And greater than 60 Hz refresh is a plus.
Also, your distance from the screen and usage style play a big role in the decision.
I lean forward and have my face 1' from the screen to read things, for example.
wr3zzz LOL $1000 27" monitor for "PC" gaming, Sony won't even bother supporting adaptive sync for its TVs until this year.
edzieba Quoting the review: "In the 1990s, Sony marketed a line of Trinitron CRT screens. Their main draw was that they only curved on the vertical axis, which meant they were the closest thing to a flat-screen you could buy at the time."
The main draw of Trinitron tubes was their fine phosphor pitch, aligned phosphor grid (which matters more for bitmapped graphics and characters than for TV, where it produced crisper horizontal and vertical lines), and lack of a shadow mask, producing an overall brighter image. Trinitron tubes were available with the normal 'bi-curved' surface and as completely flat glass-front tubes--as were shadow-mask tubes--so tube curvature was not a deciding factor in their preference.
hotaru251 4K at 27 inches is legit pointless.
It's a waste of energy to power (which costs more on the power bill), generates more heat (not what most people want outside of winter), and lowers frame rate for a near non-discernible image quality gain.
1440p @ 240+ refresh rate would have been a MUCH more interesting product.
Blacksad999 wr3zzz said: "LOL $1000 27" monitor for "PC" gaming, Sony won't even bother supporting adaptive sync for its TVs until this year."
That was my thought, also. This is a pretty tough sell at its price point. It should be $200-300 cheaper, realistically. Otherwise, there are significantly better monitors for the price. Hell, you can get a 48" LG C1 right now for less than this thing, and it's 4K, has amazing HDR, and 120 Hz.
gg83 Why isn't Sony using their OLED tech for gaming monitors? Is it difficult to make OLEDs smaller than 55 inches?
husker Aside from the resolution-to-size ratio, they made a huge mistake with the stand. The way the front center leg extends in front of the screen is going to cause some problems and is just an awful design decision. The stand should be functional and out of the way, not sticking out in your face as if to say, "Look at me! I kinda look like a PS5 but I'm just the leg of your stand! Sorry if I'm in the way! Buy a PS5!". Also, apparently computer stands don't know how to judiciously use exclamation marks.