For details on our color gamut testing and volume calculations, please click here.

Overall color accuracy is pretty good. We do see issues in the red and blue primaries, though. Red is under-saturated across the board, while blue becomes progressively more over-saturated as the level rises. It’s also a bit off in hue. The only compensation comes at the 100-percent point where blue’s luminance is lowered.
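The hue error noted for blue can be expressed as a hue-angle difference in the CIELAB a*b* plane. A minimal sketch of that calculation, using hypothetical Lab readings for illustration (not the review's actual measurements):

```python
import math

def lab_hue_angle(a, b):
    # Hue angle in degrees in the CIELAB a*b* plane, normalized to 0-360
    return math.degrees(math.atan2(b, a)) % 360

# Hypothetical blue-primary readings: the measured hue sits roughly a
# degree off the sRGB reference
reference_hue = lab_hue_angle(79.2, -107.9)  # approx. sRGB blue in a*b*
measured_hue = lab_hue_angle(73.0, -103.0)   # example off-hue reading
hue_error = measured_hue - reference_hue
print(round(hue_error, 2))
```

A negative result means the measured blue is rotated clockwise (toward magenta) relative to the reference; a CMS hue control would rotate it back.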

Calibration improves the results quite a bit. All of the luminance values are nearly spot-on and blue is still properly compensated at the 100-percent saturation point. Plus, the hue errors are now almost non-existent for the secondary colors. Fixing the remaining problems for blue and red would require a CMS (color management system). That's not something one normally sees in a $500 monitor.
Now we return to the comparison group:

Color gamut accuracy is probably where calibration has the most impact. The average error goes from 2.93 to 1.83 Delta E. That may not seem like much, but if your application demands precision, it’s a worthwhile gain.
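The average-error figures here are CIE Delta E color-difference values. For reference, a minimal sketch of the simple CIE76 variant (Euclidean distance in L*a*b*); the Lab values below are hypothetical examples, not the review's data:

```python
import math

def delta_e_76(lab1, lab2):
    # CIE76 color difference: Euclidean distance in L*a*b* space
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(lab1, lab2)))

# Hypothetical reference vs. measured values for a red saturation point
reference = (53.2, 80.1, 67.2)  # approx. sRGB red in L*a*b*
measured = (54.0, 74.5, 62.0)   # an example under-saturated reading
print(round(delta_e_76(reference, measured), 2))  # 7.68
```

A Delta E around 1.0 is generally considered the threshold of visibility, which is why dropping the average from 2.93 to 1.83 matters for color-critical work.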
Gamut Volume: Adobe RGB 1998 And sRGB

The under-saturated red primary prevents the P2815Q from rendering 100 percent of the sRGB gamut. In most productivity or entertainment applications, this isn't a problem. However, photographers may want better performance in the gamut volume metric.
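A coverage shortfall like this can be approximated in two dimensions as a ratio of triangle areas between the measured and reference primaries in CIE xy. This is a simplified sketch, not the review's full volume calculation, and the measured coordinates below are hypothetical:

```python
def triangle_area(p1, p2, p3):
    # Shoelace formula for the area of a triangle in CIE xy space
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

# sRGB primaries in CIE 1931 xy coordinates
srgb = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]
# Hypothetical measured primaries with an under-saturated red
measured = [(0.620, 0.335), (0.300, 0.600), (0.150, 0.060)]

coverage = triangle_area(*measured) / triangle_area(*srgb)
print(f"{coverage:.1%}")
```

Even a small pull-in of the red corner shaves a few percent off the covered area, which is the kind of gap photographers would notice in saturated reds.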
So instead of buying a $430 monitor, you suggest people buy a $2000+ TV. This is beyond stupid...
No I don't. I can always choose not to use the tech until they get it right, and if they never do, eh.. oh well!
High input lag makes this a particularly poor choice. Input lag impacts every task, not just gaming. Forget it.
Gamers are really in an "interesting" place this year. You can't get a video card to drive UHD even with the newest chips, and buying a monitor is a minefield. Sure, you can do SLI to get to UHD, that'll get you most of the way there... except certain games (AC), and immediately after any game's release (Titanfall), and sometimes you'll need lower settings to accommodate VRAM issues (Evil Within). This of course bodes poorly for games to be released in the upcoming year if you're buying now. It's the wait for proper support that's really disappointing (usually good support, but look at Titanfall and CoD Ghosts as long waits).
On the monitor side, you can go to 1440p, and watch as your tech is outdated quickly (as 4K/UHD gets its act together...maybe) - and be permanently stuck with a resolution that doesn't scale 1:1 with 1080p (again, hope you're running good GPUs). In all monitor tech, you can get low response times, or great colors, or take a risk on a foreign vendor's product that MIGHT be tricked into doing both but will still have some blur/ghosting. You can get Variable Refresh tech that'll work with one brand of GPU but not the other. Lightboost/ULMB or 3D support is up for consideration, but can't be used with AS/GSync.
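The 1:1 scaling point comes down to simple arithmetic: 1440p is a non-integer multiple of 1080p, while UHD is exactly 2x per axis. A minimal sketch:

```python
from fractions import Fraction

def scale_factor(native, content):
    # Per-axis ratio of panel resolution to content resolution
    return Fraction(native, content)

# 1080p content on a 1440p panel: 4/3 per axis, a non-integer scale,
# so every frame has to be interpolated (soft/blurry output)
qhd = scale_factor(2560, 1920)
# 1080p content on a UHD panel: exactly 2 per axis, so each source
# pixel maps cleanly onto a 2x2 block of panel pixels
uhd = scale_factor(3840, 1920)
print(qhd, uhd)  # 4/3 2
```

That clean 2x mapping is why 1080p content can be shown on a UHD panel without interpolation blur, and why 1440p panels can't offer the same fallback.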
I can't help but think it's all a gigantic mess right now.
-------
I wouldn't be caught dead with this useless monitor in the article. Either go for
- Quality UHD monitor: Dell's 32-inch IPS UP3214Q. ~$1,400.
- Cheap UHD but not junk: the Asus 287, which matches the 28-inch Samsung 590's performance but with a much better stand. If wall-mounting, get the Samsung and save some cash. ~$500.
- Quality gaming: Asus's 1440p/144Hz super-gamer monitor. ~$1,200.
- Desktop real estate and best overall choice: Samsung UE50HU6900, 8 ms B2B, UHD@60Hz over HDMI 2.0 (requires a GTX 970/980). ~$750.
I'd pick the TV.
Outdated quickly? PC display resolution takes about a decade to step up between mainstream standards.
Unless all you do with your PC is watch movies, not scaling 1:1 with 1080p is usually a "don't-care" item - people who are bothered by that would not buy into those sort of resolutions in the first place.
If my eyesight were perfect, I might be able to make use of 4K at 32" (or perhaps a little smaller), but the way mine is, 39" rocks!
What in the world are you talking about? The majority of households have only recently moved to 1080p monitors (within the past few years), and the majority of gamers game at 1080p, not QHD, according to many gaming-site polls. It will be years before 1440p becomes mainstream in households; it is still considered a luxury buy in the PC market and will be for some time. Further, once 1080p monitors had been out for a couple of years, prices dropped sharply. That has not happened with QHD monitors outside of the cheap Korean Apple rejects.
It's going to be several years before I feel the need, or even the want, to plunk down cash for not only a decent 4K monitor, once they actually come out and are reasonably affordable (<$800 US), but also the GPU(s) to power it at decent frame rates.
Your American pricing ("price conversion") for UHD TVs is wrong. The cheapest Samsung 50" 4K is around $1300 USD; a second-tier brand 50" 4K is around $1000 USD. They are definitely not as cheap as you think.
The advent of dirt-cheap 1080p screens relegated practically all other resolutions to niche markets, so I seriously doubt QHD will ever become a significant mainstream resolution - the same way inexpensive 1080p practically wiped out 1200p.
About eight years ago, 1080p and 1200p were both available around $300 but today, 1080p is down to $100-150 while 1200p is still $300-500.
4k will be the next major mainstream resolution about five years from now.
...um... no... I'm running 290X CrossFire on a Samsung 4K, liquid cooling and all, and every game including Titanfall, BF4, ESO, etc. runs at max settings between 60-90 fps. The only game that has issues is Watch Dogs, and we all know why that is happening. Call it what you want; once you go 4K (done right), you know it is the true PC gamer master race!
The only $300 1440p displays that are "available" are Asian imports, many of which come with vague (if any) performance guarantees. For people who want to stick to something officially sold in North America, prices start in the neighborhood of $500. Some of the cheaper 4K displays are getting close to that.
With 4k displays entering the $500-700 range, 1440p is going to get relegated exclusively to niche status and the price tag is going to rise due to low volume.
Hmmm. I'm not sure that's really the same comparison. I have both 1080p and 1200p monitors (two 24" 1080s, one 25.5" 1200). Both share the same 1920-pixel horizontal resolution. The only difference is that a 1920x1200 monitor has a little more viewing height in display lines. Otherwise, they are effectively the same resolution to the eye (same desktop icon sizes, no noticeable increase in game graphics resolution, etc.).
Just my opinion of course, but I think that's a completely different situation than moving up to the whole new eye-candy world of 2560x1440. I still love my 25.5" Samsung's extra viewing height. It is hard to beat without moving up entirely to a new screen size and resolution, which is what I did with a Dell U2713H. With HDTVs being 1080p, it was only logical that LCD manufacturers focused on 1080p screens for the mainstream market. Simply put, 1920x1200 monitors were never manufactured in high volume, hence the higher pricing. But you're probably right: QHD will never be mainstream the way 1080p is.
Hell I almost regret going 1440p, only a handful of titles have put that to good use. For everything else, it just highlights how bad the texture res is. And yes I run the vast majority of games on Ultra presets.
Suggestion: try Sniper Elite 3, BF4, Thief and WatchDogs at 4k.
The point I was trying to make had absolutely nothing to do with "noticeable increase in resolution" but everything to do with which resolutions turn into commercial success - as in widely adopted and mass-manufactured mainstream resolution that ends up becoming the de-facto standard even for cheap displays.
When 1080p became widely accepted, 1080p display prices dropped like rocks all the way down to $100, while 1200p displays remained at $300+ despite having only marginally higher resolution. Vanishing demand made production vanish, and without mass manufacturing, unit costs remain high.
With 4k displays already starting to undercut 1440p before 1440p ever had a chance to reach mainstream-friendly price points, it looks like 4K is already set to win the race for next mainstream desktop resolution - by this time next year, 4K will probably be widely available for cheaper than most similar-quality 1440p.
I'll stick with my old IPS 60Hz until 4K comes in IPS at 60Hz and becomes affordable.
Additionally, developers have to start properly supporting this resolution with appropriate textures, etc., and not introduce compromises or cheats, e.g., concentrating only on "slow-paced games" so they can force a 30 fps lock.