Gaming At 3840x2160: Is Your PC Ready For A 4K Display?
We got our hands on Asus' PQ321Q Ultra HD display with a resolution of 3840x2160. Anxious to game on it, we pulled out our GeForce GTX Titan, 780, and 770 cards for a high-quality romp through seven of our favorite titles. What do you need to game at 4K?
4K Gaming Is Here And Possible, But Are You Willing To Pay For It?
When I first started reading stories about Ultra HD gaming, I couldn’t wait to get my hands on a screen—even if it was one of those $700 models with one HDMI input and a 30 Hz limit. Then there was Asus’ 60 Hz monitor with its $3500 price tag. I know better than to deride the cost of cutting-edge hardware, so if the PQ321Q worked as advertised, I knew there’d be enthusiasts willing to buy it. But as with any new piece of technology, growing pains had to be overcome.
And they’re still being battled. Nvidia’s drivers have clearly come a long way, particularly with regard to DisplayID and getting Surround mode enabled automatically for easier setup. Asus is making the necessary adjustments in its firmware as well. We did run into some issues: trouble installing the latest beta driver, incorrectly set resolutions, and intermittent screen flashing. However, I suspect a lot of that was caused by the DVI splitter inserted for FCAT testing. Switching it out for a single DisplayPort cable solved two of those three problems.
I’ll leave AMD out of this, except to say that the company is targeting the end of this year for its phase-two frame pacing driver, which should introduce Eyefinity, DirectX 9, and OpenGL support. Even if it’s relatively easy to get the PQ321Q configured on a Radeon card right now, spending $3500 on Asus’ monitor, only to drop a bunch of frames in a CrossFire-based configuration, doesn’t make sense. Stay tuned, though—we’re promised more from AMD very soon, and we're counting on this situation improving.
Perhaps the company’s position isn’t really troublesome (in a practical sense) after all, though. To get an idea of who’s buying 4K monitors right now, I had a conversation with Kelt Reeves over at Falcon Northwest, who let me know that nobody is—at least not from Falcon. Naturally, Kelt wants this technology to take off. You just saw that it clearly requires potent hardware, and Falcon is in the business of selling high-end systems. He agrees with me that two GeForce GTX 780s are pretty much the entry point for gaming at 3840x2160. But he’s been testing the PQ321Q for two months (using newer firmware than I have, even), and still isn’t comfortable enough with the outstanding bugs to offer his customers Ultra HD. Although 4K might become an option in the future, support as it exists today is still being treated as beta by Falcon Northwest. Early adopters have their warning.
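Kelt's entry-point estimate lines up with simple pixel arithmetic. A back-of-the-envelope sketch (plain Python, no benchmark data, just resolution math) shows why 3840x2160 asks roughly four times the per-frame work of 1080p, and noticeably more than even a triple-screen Surround setup:

```python
# Rough pixel-count comparison, assuming per-pixel shading cost dominates.
uhd = 3840 * 2160        # Ultra HD: 8,294,400 pixels per frame
full_hd = 1920 * 1080    # 1080p:    2,073,600 pixels per frame
surround = 3 * full_hd   # triple-1080p Surround: 6,220,800 pixels

print(uhd / full_hd)               # 4.0x the pixels of a single 1080p screen
print(round(uhd / surround, 2))    # 1.33x a three-monitor Surround array
```

Real scaling is never perfectly linear with pixel count, but the 4x figure is a reasonable first approximation of why a single card that handles 1080p comfortably falls short at Ultra HD.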
In the future, we’ll see single-scaler 4K displays at 60 Hz, though it’s probable that tiled panels carry forward for some time. Monitor and graphics card companies consequently need to work out how to get this technology polished. You simply cannot have a monitor that reports itself capable of 20 different resolutions, but then crops them rather than scaling them.
These devices have only been around for a couple of months though. Give them time. The smart play is to hold off on Ultra HD for now. But if you have a friend with more money than patience who can’t help himself, definitely spend as much time as possible gaming at his place. Sitting in front of 3840x2160 will absolutely wreck 1920x1080 for you—even if you’re used to playing across three screens.
RascallyWeasel Is it really necessary to use anti-aliasing at this resolution? If anything it would only hurt average FPS without really giving much of a visual increase.
ubercake Great review! It's good to see this information available.
I know you want to leave AMD out of it since they still haven't finished fixing the runt/dropped-frame microstutter issue through the promised driver updates (actually, I thought it was all supposed to be done with the July 31 update?), but people constantly argue that AMD cards would be superior because of this or that at 4K. Maybe after they release the new flagship?
At any rate, I won't buy a 4K 60Hz screen until the price drops under the $1K mark. I really wish they could make the higher res monitors with a faster refresh rate like 120Hz or 144Hz, but that doesn't seem to be the goal. There must be more money in higher res than in higher refresh. It makes sense, but when they drop the refresh down to 30Hz, it seems like too much of a compromise.
CaedenV Hey Chris!
So 2GB of RAM on the 770 was not enough for quite a few games... but just how much vRAM is enough? By chance did you peek at the usage on the other cards?
With next gen consoles having access to absolutely enormous amounts of memory on dedicated hardware for 1080p screens I am very curious to see how much memory is going to be needed for gaming PCs running these same games at 4K. I still think that 8GB of system memory will be adequate, but we are going to start to need 4+GB of vRAM just at the 1080p level soon enough, which is kinda ridiculous.
Anywho, great article! Can't wait for 4K gaming to go mainstream over the next 5 years!
shikamaru31789 So it's going to be a few years and a few graphics card generations before we see 4K gaming become the standard, something that can be done on a single mid-to-high-end video card. By that time, the price of 4K TVs/monitors should have dropped to an affordable point as well.
Cataclysm_ZA So no one figures that benching a 4K monitor at lower settings with weaker GPUs would be a good feature and reference for anyone who wants to invest in one soon, but doesn't have anything stronger than a GTX 770? Geez, finding that kind of information is proving difficult.
cypeq Cool. Yet I can't stop thinking that I could put $5,000 toward something better than a gaming rig that can smoothly run this $3,500 screen.
CaedenV 11564184 said:Is it really necessary to use anti-aliasing at this resolution? If anything it would only hurt average FPS without really giving much of a visual increase.
This is something I am curious about as well. Anandtech did a neat review a few months ago and in it they compared the different AA settings and found that while there was a noticeable improvement at 2x, things quickly became unnecessary after that... but that is on a 31" screen. I don't know about others, but I am hoping to (eventually) replace my monitor with a 4K TV in the 42-50" range, and I wonder with the larger pixels if a higher AA would be needed or not for a screen that size compared to the smaller screens (though I sit quite a bit further from my screen than most people do, so maybe it would be a wash?).
With all of the crap math out on the internet, it would be very nice for someone at Tom's to do a real 4K review to shed some real testable facts on the matter. What can the human eye technically see? What UI scaling options are needed? etc. 4K is very important, as it holds real promise of being a sort of end point for resolution improvements for home entertainment. There is a chance for 6K to make an appearance down the road, but once you get up to 8K you start having physical dimension issues of getting the screen through the doors of a normal house on a TV, and on a computer monitor you are talking about a true IMAX experience which could be had much cheaper with a future headset. Anywho, maybe once a few 4K TVs and monitors get out on the market we can have a sort of round-up or buyer's guide to set things straight?
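CaedenV's question about larger pixels on bigger screens comes down to pixel density, which is easy to estimate. A quick sketch (Python, using only the panel's resolution and diagonal size; viewing distance is deliberately left out):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch: diagonal pixel count divided by diagonal size in inches."""
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(3840, 2160, 31.5)))  # ~140 PPI (PQ321Q-class desktop monitor)
print(round(ppi(3840, 2160, 50)))    # ~88 PPI (living-room 4K TV)
print(round(ppi(1920, 1080, 23)))    # ~96 PPI (typical 1080p desktop display)
```

So a 50" 4K TV has coarser pixels than a common 1080p desktop monitor, which supports the intuition that aliasing would be more visible there than on a 31.5" 4K panel at the same viewing distance.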
daglesj So those of us who are married, living with a partner, or not still living with our parents need not apply then?
I think there is a gap in the market for an enthusiast PC website that caters to those who live in the real world with real-life budgets.