Gaming At 3840x2160: Is Your PC Ready For A 4K Display?

September 19, 2013 6:12:40 AM

Is it really necessary to use anti-aliasing at this resolution? If anything it would only hurt average FPS without really giving much of a visual improvement.
Score
39
September 19, 2013 6:16:04 AM

Would have enjoyed seeing the 79xx series take a crack at this.
Score
4
September 19, 2013 6:22:51 AM

Yep, now hold on while I go order my SLI Titans. Anyone got $2K I can borrow??
Score
26
September 19, 2013 6:30:22 AM

Great review! It's good to see this information available.

I know you want to leave AMD out of it since they still haven't finished fixing the runt/dropped-frame microstutter issue through the promised driver updates (actually, I thought it was all supposed to be done with the July 31 update?), but people constantly argue that AMD cards would be superior at 4K because of this or that. Maybe after they release the new flagship?

At any rate, I won't buy a 4K 60Hz screen until the price drops under the $1K mark. I really wish they could make the higher res monitors with a faster refresh rate like 120Hz or 144Hz, but that doesn't seem to be the goal. There must be more money in higher res than in higher refresh. It makes sense, but when they drop the refresh down to 30Hz, it seems like too much of a compromise.
Score
8
September 19, 2013 6:40:04 AM

Hey Chris!
So 2GB of RAM on the 770 was not enough for quite a few games... but just how much vRAM is enough? By chance did you peek at the usage on the other cards?

With next-gen consoles having access to absolutely enormous amounts of memory on dedicated hardware for 1080p screens, I am very curious to see how much memory is going to be needed for gaming PCs running these same games at 4K. I still think that 8GB of system memory will be adequate, but we are going to start to need 4+GB of vRAM just at the 1080p level soon enough, which is kinda ridiculous.
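
For a rough sense of scale, here's a back-of-the-envelope sketch in Python. It only covers raw render-target size, which is a floor rather than an estimate of total vRAM use, since textures and other buffers dominate in practice:

# Rough framebuffer math only -- actual vRAM use is dominated by textures,
# so treat these numbers as a lower bound, not total usage.
def framebuffer_mb(width, height, bytes_per_pixel=4, msaa=1):
    return width * height * bytes_per_pixel * msaa / (1024 ** 2)

print(framebuffer_mb(1920, 1080))          # ~7.9 MB per 1080p color buffer
print(framebuffer_mb(3840, 2160))          # ~31.6 MB per 4K color buffer
print(framebuffer_mb(3840, 2160, msaa=4))  # ~126.6 MB of color samples with 4x MSAA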

Anywho, great article! Can't wait for 4K gaming to go mainstream over the next 5 years!
Score
5
September 19, 2013 6:40:31 AM

So it's going to be a few years and a few graphics card generations before we see 4K gaming become the standard, something that can be done on a single mid-to-high-end video card. By that time, the price of 4K TVs/monitors should have dropped to an affordable point as well.
Score
8
September 19, 2013 6:54:23 AM

So no one figures that benching a 4K monitor at lower settings with weaker GPUs would be a good feature and reference for anyone who wants to invest in one soon but doesn't have anything stronger than a GTX 770? Geez, finding that kind of information is proving difficult.
Score
4
September 19, 2013 6:57:52 AM

Cool. Yet I can't stop thinking that I could put $5,000 toward something better than a gaming rig that can smoothly run this $3,500 screen.
Score
0
September 19, 2013 7:02:41 AM

RascallyWeasel said:
Is it really necessary to use anti-aliasing at this resolution? If anything it would only hurt average FPS without really giving much of a visual improvement.


This is something I am curious about as well. Anandtech did a neat review a few months ago in which they compared the different AA settings and found that while there was a noticeable improvement at 2x, things quickly became unnecessary beyond that... but that is on a 31" screen. I don't know about others, but I am hoping to (eventually) replace my monitor with a 4K TV in the 42-50" range, and I wonder whether, with the larger pixels, higher AA would be needed for a screen that size compared to the smaller screens (though I sit quite a bit further from my screen than most people do, so maybe it would be a wash?). See the rough pixel-density math at the end of this post.

With all of the crap math out on the internet, it would be very nice for someone at Tom's to do a real 4K review to shed some real testable facts on the matter. What can the human eye technically see? What UI scaling options are needed? Etc. 4K is very important, as it holds real promise of being a sort of end to resolution improvements for entertainment in the home. There is a chance for 6K to make an appearance down the road, but once you get up to 8K you start having physical issues getting a TV that size through the doors of a normal house, and on a computer monitor you are talking about a true IMAX experience which could be had much cheaper with a future headset. Anywho, maybe once a few 4K TVs and monitors get out on the market we can have a sort of round-up or buyer's guide to set things straight?
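
For what it's worth, a rough pixel-density comparison for the screen sizes mentioned above, sketched in Python (the 31.5" and 50" diagonals are just example sizes, not specific products):

import math

# Pixel density (PPI) for a 3840x2160 panel at a given diagonal size.
# A bigger panel at the same resolution means bigger pixels, which is why
# AA might matter more on a 50" screen viewed from the same distance.
def ppi(width_px, height_px, diagonal_in):
    return math.hypot(width_px, height_px) / diagonal_in

print(ppi(3840, 2160, 31.5))  # ~140 PPI on a 31.5" panel
print(ppi(3840, 2160, 50.0))  # ~88 PPI on a 50" panel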
Score
4
September 19, 2013 7:13:33 AM

So those of us who are married, living with a partner, or no longer living with our parents need not apply then?

I think there is a gap in the market for an enthusiast PC website that caters to those who live in the real world with real-life budgets.
Score
0
September 19, 2013 7:25:35 AM


Just curious Chris, with the CPU not OC'd, are you sure there are no CPU bottlenecks going on anywhere? Wondering whether an OC'd 4960X (as I'm sure most who'd buy that chip would do) could help in any of the test scenarios, in particular Crysis 3, though I see you do highlight Skyrim as being one test that's platform-bound.

Ian.

Score
1
September 19, 2013 7:36:26 AM

Cataclysm_ZA said:
So no one figures that benching a 4K monitor at lower settings with weaker GPUs would be a good feature and reference for anyone who wants to invest in one soon but doesn't have anything stronger than a GTX 770? Geez, finding that kind of information is proving difficult.


There are a few reasons:
1) If you can afford a $3,000 TV then you ought to be able to afford a decent GPU or two, making your argument seem kinda silly.

2) More resolution makes detail MUCH more important. If you have an image that is (pulls number from ass) 100x100 pixels, then that image will always look its best at that native 100x100 resolution. You can take that image and display it at a lower resolution (say 50x50 pixels) because you are displaying less information than is in the source material. But there is only so much that can be done to display that image at a higher resolution than the source (say 200x200 pixels). You can stretch things out and use AF on it, but at the end of the day you end up with a texture that looks flat, chunky, and out of place.
We are playing games today that are either console ports aimed at 720p or native PC games aimed at 1080p. Neither of these is anywhere near 4K resolution, so an 'ultra' setting for any game designed around those resolutions is really a 'basic' setting for what a 4K TV is actually capable of. The true 'ultra' test is simply not possible until we get some much larger texture packs designed with 4K in mind.

3) While some performance can be gained back by dropping a bit of AA and AF, the vast bulk of the performance requirement is dictated by the raw amount of vRAM required and the sheer 8MP image you are rendering 30-60 times a second, compared to the 2MP image of a 1080p display (see the quick pixel math at the end of this post).

4) Next-gen consoles are right around the corner, and they will be loaded with tons of RAM. That ridiculous amount of RAM is there because next-gen games are going to have much higher-resolution textures, and a wider variety of them. On top of that we are going to see a lot more 'clutter' in games to make environments much more unique. All of these objects are going to have their own textures and physics to calculate, which means, yet again, that today's 'ultra' settings are simply the 'basic' settings of what is coming in just one year.


So if you want to do 4K gaming, you need to be able to afford the monitor and a dual-GPU setup, and be prepared to replace that dual-GPU setup in a year or two when next-gen games simply become far too much for today's GPU capabilities. However, you do not need this raw horsepower to run a desktop or to watch 4K video, as even today's onboard GPUs can handle those tasks just fine at 4K. But if you want to be on the bleeding edge, you are simply going to have to bleed a bit, or else be like the rest of us and wait another year (or three) for the price to drop and the GPUs to catch up.
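
The pixel math referenced in point 3, as a quick Python sketch (the 60 fps figure is just illustrative):

# 4K pushes roughly 4x the pixels of 1080p, every single frame.
res_1080p = 1920 * 1080          # 2,073,600 pixels (~2.1 MP)
res_4k    = 3840 * 2160          # 8,294,400 pixels (~8.3 MP)
print(res_4k / res_1080p)        # 4.0 -- exactly four times the pixels per frame

# At 60 frames per second that is ~498 million pixels shaded every second at 4K,
# versus ~124 million at 1080p.
print(res_4k * 60)
print(res_1080p * 60)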
Score
6
September 19, 2013 7:37:50 AM

Great article, but what I really want to know is...

How AWESOME was it playing BF3 or Crysis 3 with dual Titans at 4K?? Is it better than 3x1080p in surround? How much so? It's like you just had a morning in a Ferrari Enzo at Laguna Seca and then only showed us charts of max G's and velocity-time variance. I want to know what it's like to drive that rig!

And get some hi-res packs plus ENB for running Skyrim already!!
Score
8
September 19, 2013 7:43:39 AM

Why do I have to spend money on lowering my FPS?
Score
-2
September 19, 2013 8:00:59 AM

I guess I might take the plunge, maybe in a few weeks!! :D 
As far as AA goes, yeah, I have a ZR30W and AA makes gaming more "comfortable" on the eyes. It's already 2560x1600, but with AA on, the difference can be seen.
Score
0
September 19, 2013 8:10:16 AM

Until they put an HDMI 2.0 port in both the displays and my GPU (or have the display support 4K at 60 Hz through DisplayPort), I am staying out of this 4K business.

Cost aside, I'm not going to spend top dollar on something that essentially runs as synced split-screen and requires some sort of SLI or CrossFire system to get playable rates. By the time GPU technology advances enough, we can probably get a better-quality 4K OLED for the cost of that ASUS panel.
Score
4
September 19, 2013 8:10:17 AM

Higher res won't do much for gaming. I'd rather have a 5760x1080 ultra-wide view at 120 Hz than a single 4K display.
Score
-1
September 19, 2013 8:13:09 AM

mapesdhs said:

Just curious Chris, with the CPU not OC'd, are you sure there are no CPU bottlenecks going on anywhere? Wondering whether an OC'd 4960X (as I'm sure most who'd buy that chip would do) could help in any of the test scenarios, in particular Crysis 3, though I see you do highlight Skyrim as being one test that's platform-bound.

Ian.



Hey Ian,
I was expecting this to be graphics-limited across the board. Skyrim didn't quite surprise me. I would have thought Grid 2 would have been the next-most-likely to demonstrate a processor bottleneck. Crysis 3, particularly at those higher settings, seems less likely to be platform-bound. Great idea for a follow-up, though (same for the suggestion that we evaluate quality without AA to see if it's perceived as necessary with 8.3 MP--thanks for that one).
Score
3
September 19, 2013 8:14:31 AM

vmem said:
Until they put an HDMI 2.0 port in both the displays and my GPU (or have the display support 4K at 60 Hz through DisplayPort), I am staying out of this 4K business.

Cost aside, I'm not going to spend top dollar on something that essentially runs as synced split-screen and requires some sort of SLI or CrossFire system to get playable rates. By the time GPU technology advances enough, we can probably get a better-quality 4K OLED for the cost of that ASUS panel.


You'd be fine with DisplayPort, too. The limitation isn't the interface, it's the hardware inside the monitor.
Score
0
September 19, 2013 8:28:41 AM

RascallyWeasel said:
Would have enjoyed seeing the 79xx series take a crack at this.


We'll revisit the AMD cards as soon as the company has something new to talk about. The good news is that this should be soon!
Score
-1
September 19, 2013 8:35:16 AM

4K 120 Hz plz ;)  lol, my card probably can't even run 4K @ 20 fps
Score
-1
September 19, 2013 8:38:27 AM

No. It CANNOT run Crysis (@4K).
Score
0
September 19, 2013 8:50:21 AM

Hey Chris, is there a reason a similar article hasn't been written up for QHD? I really like the format of this one, and it would probably be a lot more helpful to many of your readers, considering it's available to the average consumer with a lower budget in mind. We've seen some good monitor articles and a little bit of info here and there, but we haven't seen a broad-spectrum graphics card showdown @ 1440p, and I, for one, am incredibly interested in this.
Score
1
September 19, 2013 9:14:44 AM

Even given the fact that I will never be able to dump $6,000 into a new PC/monitor setup, I still find this article incredibly impressive for two reasons:

1) it shows how much room there still is for graphics to advance.

And (more importantly)

2) It shows undeniably how well the Nvidia GeForce GTX Titan scales in a dual-card configuration. I've always heard that, while the theoretical gain of any dual-card setup would be double the performance, history has you realistically getting a 50% improvement at best from the second card. Amazingly though, in 5 out of the 7 titles benchmarked at a level capable of stressing a GTX Titan, the actual benchmarks of the SLI configuration showed frame rates almost exactly doubling those of the single card.
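
As a quick illustration of what "scaling" means here, sketched in Python (the frame rates below are made-up placeholder numbers, not figures from the article):

# Multi-GPU scaling: how much of the second card's theoretical 100% gain
# actually shows up in the measured frame rate.
def sli_scaling(fps_single, fps_dual):
    return (fps_dual / fps_single - 1) * 100  # percent gain from card #2

print(sli_scaling(30, 60))  # 100% -- perfect doubling
print(sli_scaling(30, 45))  # 50%  -- the "realistic at best" case mentioned above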
Score
0
September 19, 2013 9:16:20 AM

I am dying to hear more news about ASUS' 39" 4k VA panel that is supposed to come out early next year...

Can't wait to get to gaming at 4k. Even if it means turning some settings down on my 780s, resolution > settings any day in my book.

Also, what's with the weird scaling issues in Crysis 3? That is probably the game that needs SLI scaling the most, and that is some of the most crap scaling I've seen in years from SLI.
Score
0
September 19, 2013 9:20:09 AM

My dream monitor is still a 5:4 display that is exactly double my current 1280x1024 monitor, so a 2560x2048 display at about 30". Sadly, most video cards max out at 2560x1600 per single panel, but that will change soon once 4K gets going. I'm really hoping 5:4 makes a big comeback.

Even a 1920x1536 display would be better than any 16:9 display.
2560x2048 displays actually do exist, but they're all medical diagnostic monitors with several drawbacks (deal-killers):
1. They only come in 19"-21" varieties.
2. They cost $3,000-$12,000.
3. Grayscale only. FML.

The human eye's field of view is almost exactly 4:3, but 5:4 is the cleaner ratio, being an even 1.25 as opposed to 1.3333.... But if 4:3 and 5:4 are the more natural shapes, how did 16:9 become so ubiquitous? Because of some arbitrary film ratio that came about by adding the soundtrack to the reel, thereby clipping the vertical space that should've been there. Curse you, outdated film! Oh, what could've been...
Score
0
September 19, 2013 9:22:56 AM

"4k" gaming with aa??? is this site serius......its barely useful for low ress monitors, get it into your head aa its not a quality setting..

to my main point against these "4k" monitors, input laag, and not to mention only 60hz,
Score
-1
September 19, 2013 9:33:03 AM

I'll wait till we get 4K 120 Hz single-panel monitors. Asus has a nice piece of kit for non-gamers on their hands, though.
Score
0
September 19, 2013 9:50:23 AM

It's really weird that a single 7970 wasn't used. I have overclocked mine to the point that it trades blows with a GTX 780 (for half the price).

Hell, an HD 7850 on medium settings would have been nice too, so we could really see the usability of a mid-range card...
Score
-3
September 19, 2013 10:00:17 AM

4K gaming on PC will become mainstream the day single GPU cards can manage playable framerates at decent settings in AAA titles.
Score
0
September 19, 2013 10:15:49 AM

I would appreciate it if someone could clarify this issue for me. Are these games actually running at native 4K, or are they just being upscaled to that resolution? Do developers really create 4K textures? How does this work?
I feel like when I play some really old game, from 2004 for example, it doesn't look any different at 1080p than it does at 720p. I can understand that you may no longer need anti-aliasing, but are you really seeing a higher level of detail? How is that possible?
Please explain.
Thanks!
Score
1
September 19, 2013 10:26:13 AM

I think the sweet spot here is going to be the Chinese 4K "TVs" once they get them taking 60 Hz input. The Seiki 39" that does 30 Hz is under $600 on sale. Asus is smoking crack if they think that $3,500 price is going to hold up on a 31" screen when 39" 60 Hz TVs are available.
Score
0
September 19, 2013 10:30:32 AM

Good article as usual. However, I'd like to know why you have AA and such enabled when it is not needed at such a resolution. Anandtech did a review and said it is not necessary to have it turned on, and that is the way I would play if I had a 4K monitor. I am sure that 2x GTX 780s running at a 1250 MHz GPU clock with 7400 MHz vRAM would be more than enough to get 60 FPS without the need for AA and such.
Score
-1
September 19, 2013 10:33:41 AM

The reason 120-144 Hz is not catching on is that we lack CPUs that can drive games that fast. Heck, no matter what resolution you run Crysis 3 and Tomb Raider at on Ultra, you still see dips to around 40 FPS here and there - on >4GHz i7s.
Score
-2
September 19, 2013 10:35:55 AM

Most people can't even buy a 2560x1440 monitor.
Score
0
September 19, 2013 1:08:05 PM

Strange you didn't let the 7970 and 7990 come out and play; it would have been fun to see. All the same, this is completely unrealistic and applies to no one except those with copious amounts of money and nothing to spend it on.
Score
-4
September 19, 2013 1:09:20 PM

Cool article in its own right tho, just a little rich for my blood lol.
Score
0
September 19, 2013 1:15:21 PM

I don't think I'll be getting 4K so soon!!
Score
0
September 19, 2013 1:21:12 PM

"But can it play Crysis at 4K?"
Score
-3
September 19, 2013 1:25:50 PM

New overused quote: But can it play Crysis in 4K?
Score
-3
September 19, 2013 2:11:48 PM

I agree that the anti-aliasing option becomes academic once you hit 4K resolutions. Even at 5760x1080, I usually turn it off or set it to a minimal 2x in most games, since I prefer a sharper image over a blended one.
Score
0
September 19, 2013 2:29:23 PM

6GB cards will need to be the new standard.
Score
0
September 19, 2013 2:49:33 PM

Where can I get the updated PQ321Q firmware? I have a PQ321Q but still have problems booting/POSTing with my monitor in 4K mode. I'm hoping a firmware update can fix it, but I can't find it.
Score
0
September 19, 2013 3:23:20 PM

cangelini said:
... Crysis 3, particularly at those higher settings, seems less likely to be platform-bound. ...


What do you reckon, then, might be the cause of the lack of scaling in Crysis 3?

Ian.



Score
0
September 19, 2013 3:24:15 PM

I can't see any pixels on my current displays. Chances are you can't either.

For 4K displays to make sense, I would have to get a display twice as large as I have now and sit just as close to it. Even if I could ever afford a 50"+ 4K monitor, I wouldn't want to put it on my desk two feet from my face. And if I actually had to lower the refresh rate to 30 Hz, games would look worse at any distance.

4K makes sense for advertising companies and display manufacturers, and that's about it.
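
A rough sanity check on the "can't see pixels" point, sketched in Python. It assumes roughly one arcminute of visual acuity and uses a 50" panel purely as an example; both are ballpark assumptions, not figures from the article:

import math

# Distance (inches) beyond which a single pixel subtends less than one
# arcminute -- a common rule of thumb for when pixels stop being resolvable.
def pixel_blend_distance(width_px, height_px, diagonal_in):
    pixel_pitch = diagonal_in / math.hypot(width_px, height_px)  # inches per pixel
    one_arcmin = math.radians(1 / 60)
    return pixel_pitch / math.tan(one_arcmin)

print(pixel_blend_distance(3840, 2160, 50))  # ~39 inches, a bit over 3 feet
print(pixel_blend_distance(1920, 1080, 50))  # ~78 inches for 1080p at the same size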
Score
-2
September 19, 2013 3:26:07 PM

BigMack70 said:
Also, what's with the weird scaling issues in Crysis 3? That is probably the game that needs SLI scaling the most, and that is some of the most crap scaling I've seen in years from SLI.


It'll be interesting to see how Crysis 3 scales with the AMD cards once the drivers are sorted out.

Ian.

Score
1
September 19, 2013 3:54:46 PM

merikafyeah said:
Even a 1920x1536 display would be better than any 16:9 ...


Lots of CRTs can do native 2048x1536 (any based on the same 22" Sony Trinitron tube), and a unit in decent condition will look nicer than most flat panels even now, especially any TN model. The downsides, of course, are that such CRTs are large, heavy, use more power, are generally no more than 22", and probably won't last that long after purchase (at best a year or two), but they are dirt cheap if you want to try one: typically less than $130 fixed price, much less via auction or whatever. Examples include the Dell P1130 and all its Dell/HP variants, and the SGI 5411 - there's at least a dozen models which all use the same tube. Those sold as "A-grade" are the only ones worth considering.

Having said that, one problem with such monitors is the stupid default Windows drivers, which prevent one from accessing the 2K mode without some .ini fiddling; very annoying. This was easy to do with XP, but I've not yet worked out an equivalent fix for Win7.

I used to play games at 2048x1536 on a 22" CRT (Dell P1130). I waited ages for flat panels to be, IMO, finally good enough to justify the switch without being annoyed at the drop in resolution or fidelity, eventually replacing the CRT with an HP 24" H-IPS 1920x1200 LP2475W, which does look very good. I do miss the extra pixel height sometimes (Oblivion looked great on the old P1130), but I certainly don't miss the hefty desktop footprint, etc.

Customer demand determines what the market goes with, but such behaviour is often self-reinforcing. It wasn't that long ago that 1920x1200 monitors were pretty much the same price and as easy to obtain as 1080p monitors, but then people just started going with 1080p more and more, and pricing followed suit. For a while it was hard to find a good 1200-line display, though the influx of cheaper IPS models solved this, e.g. the Dell U2412M is quite nice and well priced. The same happened with 2560x1440 vs. 2560x1600; the latter always cost more, but now the 1600 models cost massively more. I wanted to get a 1600 model for benchmarking, but I couldn't justify the cost; I bought a Dell 1440 instead, which does look OK. Thankfully, review articles now seem to be sticking more with 1440 testing anyway.

Personally I'd rather the PQ321Q wasn't referred to as a 4K display, because it's not. If it were 4096x2300, then that'd be fine. Perhaps using 3840 just makes the tech easier to sort out, and no doubt it makes the marketing easier, it being exactly twice the width and height of 1080p.

Ian.

PS. There are 24" CRTs such as the Sony FW900, but the weight is insane. It needs two people to move one.

Score
2