I’ve categorized HDR monitors into three tiers that usually, though not always, follow their price points. The least expensive are compatible with HDR signals but don’t add any contrast. The middle tier adds dynamic contrast with either zone-dimming edge backlights (8-16 zones) or backlight modulation.
The best HDR monitors have full-array local-dimming (FALD) backlights. At the tiptop are Mini LED screens with 1,152 zones and eye-watering price tags of $3,000 or more. The Sony Inzone U27M90 is something of an anomaly: it delivers a FALD backlight for $899. That's pretty awesome.
Gaming performance is flawless with either FreeSync or G-Sync (see FreeSync vs G-Sync to decide), and it works with the PS5's VRR feature. It runs at 144 Hz over DisplayPort 1.4 or 120 Hz over the HDMI 2.1 inputs. It also incorporates a KVM feature for users running both a console and a PC. As a cover-it-all display, it excels.
Picture quality is the star here for sure. Though a 96-zone backlight sounds modest next to 1,152-zone Mini LED panels, the image looks fantastic in practice. I compared it side by side with an Acer X27 and found little difference between them. The U27M90 has rich, vibrant color and deep contrast with a very well-engineered dynamic contrast feature. And you can use that local dimming option to enhance SDR content too, which truly sets this monitor apart.
Sony has been out of the monitor game for some time, but it didn’t pull any punches with this new display. The Inzone U27M90 is a superb 4K gaming monitor for PC or console play. If you’ve coveted a FALD monitor but couldn’t afford the price of entry, Sony may have just solved your problem. Gamers looking for resolution, contrast, color and performance at a reasonable price should definitely check it out.
I also had several Sony CRTs...great monitors--I remember my last--a 20" "flat-screen" Trinitron that supported my Voodoo3's 1600x1200 res ROOB...;) As an aside, the ATi Fury I bought at the time to test (the original ATi Fury, not AMD's) would not do 1600x1200 stock! I had to call ATi and ask them about it, and one of the driver programmers I spoke with (in those days you could dial up practically anyone and actually talk to them!) asked me why I wanted to run at 1600x1200...;) I had to actually add the simple instructions into their driver structure at the time to enable 1600x1200--'cause my Trinitron supported it and I wanted to use it!...;)
Sony made great monitors in those days--they were good enough for me and x86 in those years. The Trinitron brand is well known even today, as you mentioned; originally it was the Trinitron TV brand. I'm sure this monitor is a good one, I'm just not enamored of the specs. Those high nits make all the difference in a display, imo.
For gaming it's fine. Productivity not so much.
It really is personal preference. I've been using a Samsung U24E590D 23.6" 4K display for a few years now.
Personally I wanted maximum pixel density, good power usage, not too bulky.
Now that my eyes aren't as good as they once were, I might get a 27" 4K display in the future, maybe something like the one reviewed.
27" could be the "sweet spot" for 4K. And greater-than-60 Hz refresh is a plus.
Also your distance from the screen and usage style play a big role in the decision.
I lean forward and have my face 1' from the screen to read things for example.
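The pixel-density point above is easy to put numbers on. This is a quick back-of-the-envelope sketch using the standard PPI formula (diagonal pixel count divided by diagonal size in inches); the screen sizes are the ones mentioned in the comments:

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch: diagonal resolution divided by diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

# 4K UHD on the two sizes discussed above
print(round(ppi(3840, 2160, 23.6)))  # ~187 PPI (a 23.6" panel like the U24E590D)
print(round(ppi(3840, 2160, 27.0)))  # ~163 PPI (a 27" panel like the U27M90)
```

Even at 27", 4K is still well above the ~110 PPI of a 27" 1440p screen, which is why it holds up at close viewing distances.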
It's a waste of energy to power (which costs more on the power bill), generates more heat (not what most people want outside of winter), and lowers frame rate for a nearly indiscernible image-quality gain.
1440p @ 240+ Hz refresh rate would have been a MUCH more interesting product.
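The frame-rate side of that tradeoff comes down to raw pixel counts. A rough sketch (real GPU scaling varies with the workload, but per-frame pixel work grows roughly with resolution):

```python
# Total pixels per frame at each resolution
uhd = 3840 * 2160   # 4K UHD: 8,294,400 pixels
qhd = 2560 * 1440   # 1440p QHD: 3,686,400 pixels

# 4K pushes 2.25x the pixels of 1440p per frame
print(uhd / qhd)    # 2.25
```

So at identical settings, a GPU driving 4K has roughly 2.25x the pixel workload of 1440p, which is the basis of the frame-rate complaint above.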