Gaming At 3840x2160: Is Your PC Ready For A 4K Display?
1. What Does It Take To Game At 3840x2160?

Have you ever wondered whether fierce competition in the graphics hardware space could fuel such significant performance growth that we’d eventually see a bunch of high-end cards bottlenecked by games and platforms unable to fully utilize them?

Of course, taxing titles like Crysis 3, Metro: Last Light, and Arma 3 show us that developers are still very much pushing the envelope of PC gaming. Even the top thousand-dollar GPUs are quickly overwhelmed by any of those three games at their highest detail settings. Expanding out to three screens with AMD’s Eyefinity or Nvidia’s Surround technology almost necessitates a pair (or more) of potent GPUs. We’ve done plenty of lab testing at 5760x1080, and we have the hardware to push 7680x1440, should the need arise.

But there’s a growing interest in consolidating back down to a single screen using resolutions that far exceed the 2560x1440 currently available from popular QHD displays (Auria EQ276W 27" IPS Monitor Review: QHD For $400). Enter 4K UHD, sporting a native resolution of 3840x2160. Yeah, it's still 16:9. But aren't you glad to at least have an ultra-high resolution option on the PC?

Source: Wikipedia

As with any new video mode, content has to precede mass market acceptance. So you probably don’t have many friends with 4K TVs hanging in their living rooms. That isn’t a problem in the PC space, though. You can buy a UHD monitor, plop it down on your desk, and hammer away at Excel spreadsheets as if nothing changed. More relevant to today’s discussion, you’ll find yourself firing up Skyrim at 3840x2160 with your high-resolution texture pack installed, and lapping up luscious-looking visuals without the bother of bezels from a multi-screen array.

The real question then becomes: how much graphics hardware do you need to drive 8.3 million pixels at playable frame rates? And the answer, at least for most gamers, is going to be more than you currently have at your disposal.

The State Of 4K Ultra HD Gaming

You’re in the market for an Ultra HD display, but don’t know which one to buy. More pressing, you aren’t sure if your current PC is up to the task of driving its native resolution in your favorite titles.

The cheaper 4K TVs you’ve seen accept a single HDMI input and are limited to 30 Hz due to the bandwidth of that interface. They're available for as little as $700, in the case of Seiki Digital's SE39UY04. Asus’ $3500 PQ321Q is one of the only 4K screens capable of 60 Hz. But in order to achieve that refresh rate, you’re forced to run either one DisplayPort or two HDMI cables between your PC and the monitor.

What? Two HDMI cables?

The PQ321Q is a tiled display, meaning its 31.5” screen actually consists of two 1920x2160 panels stitched together. With two HDMI cables, two outputs on your video card drive the screen's halves independently. Alternatively, you can use a single DisplayPort 1.2-compatible output, which feeds a multi-stream transport (MST) demultiplexer inside the monitor that splits the signal into two streams.
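The arithmetic behind those interface limits is easy to sketch. The bandwidth figures below (8.16 Gb/s for HDMI 1.4's video payload, 17.28 Gb/s for DisplayPort 1.2 HBR2) and the 20% blanking overhead are our assumptions for illustration, not vendor specs for this particular monitor:

```python
# Rough link-bandwidth check for 3840x2160. Assumed figures: 24 bits
# per pixel and ~20% blanking overhead; real CTA/CVT timings differ.
def required_gbps(width, height, refresh_hz, bpp=24, blanking=1.20):
    """Approximate video bandwidth a mode needs, in Gb/s."""
    return width * height * refresh_hz * bpp * blanking / 1e9

HDMI_1_4_GBPS = 8.16   # HDMI 1.4 payload after 8b/10b coding (assumed)
DP_1_2_GBPS = 17.28    # DisplayPort 1.2 HBR2 x4 payload (assumed)

for hz in (30, 60):
    need = required_gbps(3840, 2160, hz)
    print(f"4K @ {hz} Hz needs ~{need:.1f} Gb/s; "
          f"HDMI 1.4: {'ok' if need <= HDMI_1_4_GBPS else 'no'}, "
          f"DP 1.2: {'ok' if need <= DP_1_2_GBPS else 'no'}")
```

Under those assumptions, 4K at 30 Hz squeezes through one HDMI link, while 60 Hz does not, which is why it takes either one DisplayPort 1.2 cable or two HDMI links, each carrying half the screen.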

According to Nvidia, the company is using a couple of different technologies to help ensure you don’t see any tearing along the seam conjoining both panels. One, fliplock, forces each GPU to flip its frame buffer in sync. The other, scanlock, forces each GPU (and display output, or head) to display scanlines in sync. This stuff comes from the professional graphics world, where synchronization is necessary across multiple heads.

When the PQ321Q first came out, Nvidia’s drivers apparently weren’t prepared, according to reports by Ryan Shrout over at PC Perspective. The company worked on its software, resulting in the GeForce 327.19 release we have today. One improvement that came about is a change to the Extended Display Identification Data (EDID) structure that lets the monitor present itself as a tiled device to Nvidia’s driver (this gets rolled up into a standard called DisplayID v.1.3). The software uses that information to automatically configure Surround in a 2x1 configuration.

That’s not to say that, even after Nvidia's targeted update and new firmware from Asus, the PQ321Q works flawlessly. As Windows boots, you always see the splash screen squished into the left panel. And after a fresh Windows 8 installation, we were unable to apply the 327.19 driver without crashing. There was also a combination of flashing on the desktop and incorrect resolution settings when certain games started up. We encountered those inconveniences using two HDMI inputs and the DVI splitter needed for our FCAT-based testing, though. With the PQ321Q set to accept an MST stream and a DisplayPort cable linking PC to the monitor, our experience was notably better (albeit still not perfect). You’re still going to see Windows’ boot process happen on one of the two panels, and one might flicker on before the other when you start a game. But those are just artifacts of a tiled monitor. It was more odd that setting resolutions lower than 3840x2160 squished the desktop down, rather than scaling it up.

If you want to sit Ultra HD out until the display technology evolves to incorporate a single scaler, expect to wait a while: the controller hardware isn't available yet, and could be close to a year away. Even then, tiled panels will likely persist. Guess we'd better figure out how to make this stuff work...

2. How Do We Benchmark Graphics At 4K Resolutions?

Benchmarking At 4K

Unfortunately, we can’t use the DisplayPort interface for our testing. The whole point of FCAT is recording video from the display output, which is then used by a combination of Perl scripts to analyze frame rates. As it stands, capturing video at 2560x1440 already maxes out the Datapath Ltd. card responsible for that task. By splitting the stream into two HDMI signals, however, creating a pair of 1920x2160 outputs, we keep the capture manageable.

The rest of what we do remains very similar to the FCAT-based analysis we adopted months ago. Even though only one panel’s output is memorialized in video, because the graphics subsystem is still rendering to two, the performance measured by our capture card still reflects the complete experience.
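To illustrate the kind of reduction that post-processing performs (this is a sketch of the idea, not Nvidia's actual FCAT Perl scripts, and the 1 ms runt threshold is our own assumption), per-frame display times boil down to the familiar metrics like so:

```python
# Reduce a list of per-frame display times (ms) to summary metrics.
# Sketch only -- not the real FCAT scripts; the runt threshold is assumed.
def summarize(frame_times_ms, runt_threshold_ms=1.0):
    fps = [1000.0 / t for t in frame_times_ms]             # instantaneous FPS
    avg_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)
    runts = sum(1 for t in frame_times_ms if t < runt_threshold_ms)
    return avg_fps, min(fps), runts

# One hitch at 33.4 ms drags the worst-case FPS well below the average.
avg, worst, runts = summarize([16.7, 16.9, 33.4, 16.5])
```

Averages alone hide that 33.4 ms outlier, which is exactly why frame-time capture beats a simple FPS counter.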

Today’s exploration only involves one GeForce GTX Titan, two GeForce GTX 770s, two GeForce GTX 780s, and two GeForce GTX Titans. This is by design.

AMD is now up to its Catalyst 13.10 beta driver, which incorporates the company’s frame pacing functionality that was so warmly received in Radeon HD 7990 Vs. GeForce GTX 690: The Crowd Picks A Winner. We also quantified the benefit of frame pacing in Dual-GPU Battle: Does Frame Pacing In Catalyst 13.8 Turn The Tide? The caveat was that AMD only supports this feature at resolutions up to 2560x1600. Eyefinity, which is needed by Asus’ tiled PQ321Q, isn’t yet enabled either. So, rather than publishing a bunch of graphs that remind everyone what a lot of dropped and runt frames look like, we’re simply omitting multi-GPU configurations in CrossFire for the time being. AMD is fully aware of the issues preventing a good experience at 3840x2160, and we’re now hoping to get our hands on its phase-two frame pacing driver before the end of the year. That’ll be the release expected to add support for higher resolutions, Eyefinity, DirectX 9, and OpenGL.

Test Hardware And Software

Test Hardware
Processors
Intel Core i7-4960X (Ivy Bridge-E) 3.6 GHz Base Clock Rate, 4 GHz Maximum Turbo Boost, LGA 2011, 15 MB Shared L3, Hyper-Threading enabled, Power-savings enabled
Motherboard
ASRock X79 Extreme5 (LGA 2011) X79 Express Chipset, BIOS 2.40
Memory
G.Skill 32 GB (8 x 4 GB) DDR3-2133, F3-17000CL9Q-16GBXM x2 @ 9-11-10-28 and 1.65 V
Hard Drive
Samsung 840 Pro SSD 256 GB SATA 6Gb/s
Graphics
Nvidia GeForce GTX Titan 6 GB

Nvidia GeForce GTX 780 3 GB

Nvidia GeForce GTX 770 2 GB
Power Supply
Corsair AX860i 860 W
System Software And Drivers
Operating System
Windows 8 Professional 64-bit
DirectX
DirectX 11
Graphics Driver
Nvidia GeForce Release 327.19
Benchmarks And Settings
Battlefield 3
Ultra Quality Preset, v-sync off, 3840x2160, DirectX 11, Going Hunting, 90-Second playback, FCAT
Arma 3
Ultra Detail Preset, DirectX 11, 2x FSAA, v-sync off, 3840x2160, Infantry Showcase, 30-Second playback, FCAT
Grid 2
Ultra Quality Preset, v-sync off, 3840x2160, Built-In Benchmark, FCAT
The Elder Scrolls V: Skyrim
Ultra Quality Preset, FXAA Disabled, 3840x2160, Custom Run-Through, 25-Second playback, FCAT
BioShock Infinite
High Quality Settings, DirectX 11, 3840x2160, Custom Built-in Benchmark Sequence, 75-Second playback, FCAT
Crysis 3
High System Spec, SMAA MGPU (2x), High Texture Resolution, 3840x2160, Custom Run-Through, 60-Second Sequence, FCAT
Tomb Raider
Ultra Quality Preset, FXAA Enabled, 16x Anisotropic Filtering, TressFX Hair, 3840x2160, Custom Run-Through, 45-Second playback, FCAT
3. Results: Arma 3

Any time we’ve ever discussed realism in first-person shooters, the Arma series comes up. After a lengthy alpha and beta period, Arma 3 finally went live earlier this month.

If you want to play this one at its highest settings using a 4K display’s native 3840x2160 resolution, you’re probably going to want two GeForce GTX Titans. We can imagine that three GeForce GTX 770s or 780s would work as well, though two tend to fall under the average frame rates we want to see.

Indeed, the 770s spend some time under 30 FPS in our simple run-through sequence, while 780s flirt with the 35 FPS mark at a number of points. It takes a couple of Titans to keep up above 40 FPS for most of the benchmark.

Our frame time variance calculation gives us the difference between the time it takes to display a frame and the average of the 20 frames before and after it. This deliberately minimizes the impact of variance as the frame rate rises or falls with game load (which is natural), and instead tries to identify problem areas.

To give you an idea of how important this calculation is, comparing each frame against the whole run's average yields a variance of 46 ms in Arma 3 on a GeForce GTX Titan. Comparing each frame only to the 20 frames before and after it drops that number to 0.65 ms.

It makes sense that we would see the lowest frame time variance (and hence, the most consistent frame delivery) from a single-GPU configuration. Indeed, GeForce GTX Titan shows up at the top of our chart. Dual-GPU setups appear in the order of their performance; slower cards would indeed be expected to perform less consistently, even between successive frames.

4. Results: Battlefield 3

Surprised to learn that a $1000 GeForce GTX Titan won’t get you consistently playable frame rates in Battlefield 3 using the Ultra quality preset? At least based on the averages, we’d be most inclined to go with a pair of 780s for slightly more. Two Titans won’t necessarily give you the return on your investment in this game.

This is but the first time you’ll see strange behavior from two GeForce GTX 770s in SLI. The GK104-based cards react to all of the same peaks and valleys that the other combinations convey—they just exaggerate them. In order to investigate, we fired up EVGA's Precision X and logged memory usage at this monster resolution and the demanding settings that go along with it. What we found is that certain titles need more than 2 GB of on-board memory, and it isn't difficult to freak the 770s out by going over.
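Some back-of-the-envelope math shows why 2 GB runs short at this resolution. The render-target counts below are illustrative assumptions, not figures measured from Battlefield 3:

```python
# Memory footprint of one 32-bit render target at a given resolution.
def target_mb(width, height, bytes_per_pixel=4):
    return width * height * bytes_per_pixel / 2**20

one = target_mb(3840, 2160)   # ~31.6 MB per RGBA8 surface at 4K
# An assumed deferred pipeline: 5 G-buffer targets, a depth buffer, and
# double-buffered back buffers books hundreds of MB before textures load.
assumed_targets = 5 + 1 + 2
print(f"{one:.1f} MB per target, "
      f"{one * assumed_targets:.0f} MB for {assumed_targets} targets")
```

Add high-resolution textures, shadow maps, and anti-aliasing buffers on top of that, and a 2 GB card overflows quickly—which matches the stutter the 770s exhibit.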

That behavior aside, we see one Titan maintain more than 30 FPS throughout our single-player run, while two GeForce GTX 780s keep their noses above 50 FPS the whole time.

Erratic behavior translates into an outlier worst-case reading from two GTX 770s, though their average variance isn’t particularly worrying. The other three configurations perform really well on paper, though the huge 3840x2160 resolution makes it very easy to see any time the scripted sequence we test hitches.

5. Results: BioShock Infinite

Again, you can almost double your average frame rate at 3840x2160 by dropping in a second GeForce GTX Titan. Particularly at the High detail preset, this game is completely graphics-bound. Fortunately, it still runs really well on one GPU, yielding more than 60 FPS.

Granted, one Titan does drop under 60 FPS on occasion (though it’s much more consistent than the two GeForce GTX 770s we’re using, which again dip way down during certain passages of the built-in benchmark). Two GeForce GTX 780s or Titans are more than fast enough to maintain playable performance in BioShock Infinite.

Big spikes again factor into the 770s’ average and worst-case variance results, owing to less on-board GDDR5 memory compared to the 3 GB GeForce GTX 780s and 6 GB Titan cards.

6. Results: Crysis 3

Our Crysis 3 benchmark is a manual run-through that’s typically pretty consistent. The first time we ran FCAT scripts on the output, though, it was clear that v-sync switched on in a couple of cases, even though it was off in the game. As a result, we ran several configurations with the option toggled off in Nvidia’s driver. The performance figures changed in response, but the average frame rates suggest something is still off in Crysis 3. Two GeForce GTX 770s should be faster than one Titan, and we were really hoping for more scaling from two $1000 cards.

Charting out frame rate over time shows the GeForce GTX 770s in SLI and the single Titan trading blows, with the 780s and Titans in SLI behaving similarly as well.

Big spikes again affect the 770s, though average frame time variance continues to show Nvidia’s multi-GPU solutions yielding a fairly consistent experience.

What I will say is that, in a game like Crysis 3, when performance starts dipping under the 30 FPS range, your ability to react quickly falls off very fast. Because this game requires a manual run-through, I was painfully aware on both the single-Titan and 770 SLI systems that low frame rates were getting me shot more often, forcing me to restart my run. Two 780s and Titans helped this issue immensely.

7. Results: Grid 2

Even though we’re testing at an obscene resolution using the game’s Ultra quality preset, racing sims typically don’t push graphics hardware as far as shooters. One GeForce GTX Titan appears sufficient for smooth performance, while a pair of GeForce GTX 770s get you up above 56 FPS on average. Spending more on 780s or Titans doesn’t give you a great return on your investment in Grid 2.

We’d expect to see these spiky frame rates translate to higher variance readings on our next chart. For the time being, though, we can be relatively certain that all four configurations are playable.

And there are the larger variance numbers we were anticipating. The 780s and Titans in SLI demonstrate worst-case results in the 5+ ms range, though those are still really good compared to some of what we were seeing before FCAT became a popular tool.

8. Results: The Elder Scrolls V: Skyrim

Looks like it’s time to start installing add-ons, huh? Even at 3840x2160 using the Ultra quality preset, Skyrim is completely limited by the speed of our Ivy Bridge-E platform, even using two GeForce GTX 770s in SLI. One Titan, on the other hand, trails quite a ways back (though at nearly 70 FPS, it’s still plenty of graphics card for Ultra HD).

All three SLI setups track each other's performance during our Skyrim benchmark.

The elegance of a single-GPU configuration shines through as very low frame time variance. Teaming a couple of cards together adds a little to this, but we’re still talking about sub-millisecond latencies on average.

9. Results: Tomb Raider

The impressive scaling returns in Tomb Raider, where two Titans nearly double the performance of one. Incidentally, we wouldn’t consider that single thousand-dollar card completely smooth in our benchmark sequence. A couple of GeForce GTX 780s, on the other hand, fare much better.

The data in these graphs reflects the average frame rates well, particularly since all four platforms manage fairly consistent performance across our test.

Those nice straight lines, devoid of jarring spikes, lend themselves to consistent frame rate delivery, and the variance we see comparing the frame time of any given frame to the 20 before and after it is tiny.

10. 4K Gaming Is Here And Possible, But Are You Willing To Pay For It?

When I first started reading stories about Ultra HD gaming, I couldn’t wait to get my hands on a screen—even if it was one of those $700 models with one HDMI input and a 30 Hz limit. Then there was Asus’ 60 Hz monitor with its $3500 price tag. I know better than to deride the cost of cutting-edge hardware, so if the PQ321Q worked as advertised, I knew there’d be enthusiasts willing to buy it. But as with any new piece of technology, growing pains had to be overcome.

And they’re still being battled. Nvidia’s drivers have clearly come a long way, particularly with regard to DisplayID and getting Surround mode enabled automatically for easier setup. Asus is making the necessary adjustments in its firmware as well. We did run into some issues getting the latest beta driver installed, incorrectly-set resolutions, and intermittent screen flashing. However, I suspect a lot of that was caused by the DVI splitter inserted for FCAT testing. Switching it out for a single DisplayPort cable solved two of those three problems.

This should give you an idea of resolution at 1920x1080

I’ll leave AMD out of this, except to say that the company is targeting the end of this year for its phase-two frame pacing driver, which should introduce Eyefinity, DirectX 9, and OpenGL support. Even if it’s relatively easy to get the PQ321Q configured on a Radeon card right now, spending $3500 on Asus’ monitor, only to drop a bunch of frames in a CrossFire-based configuration, doesn’t make sense. Stay tuned, though—we’re promised more from AMD very soon, and we're counting on this situation improving.

Perhaps the company’s position isn’t really troublesome (in a practical sense) after all, though. To get an idea of who’s buying 4K monitors right now, I had a conversation with Kelt Reeves over at Falcon Northwest, who let me know that nobody is—at least not from Falcon. Naturally, Kelt wants this technology to take off. You just saw that it clearly requires potent hardware, and Falcon is in the business of selling high-end systems. He agrees with me that two GeForce GTX 780s are pretty much the entry point for gaming at 3840x2160. But he’s been testing the PQ321Q for two months (using newer firmware than I have, even), and still isn’t comfortable enough with the outstanding bugs to offer his customers Ultra HD. Although 4K might become an option in the future, support as it exists today is still being treated as beta by Falcon Northwest. Early adopters have their warning.

And this is 3840x2160

In the future, we’ll see single-scaler 4K displays at 60 Hz, though it’s probable that tiled panels will carry forward for some time. Monitor and graphics card companies consequently need to work out how to get this technology polished. You simply cannot have a monitor that reports itself capable of 20 different resolutions, but then crops them down rather than scaling.

These devices have only been around for a couple of months though. Give them time. The smart play is to hold off on Ultra HD for now. But if you have a friend with more money than patience who can’t help himself, definitely spend as much time as possible gaming at his place. Sitting in front of 3840x2160 will absolutely wreck 1920x1080 for you—even if you’re used to playing across three screens.