
Gaming At 3840x2160: Is Your PC Ready For A 4K Display?


We got our hands on Asus' PQ321Q Ultra HD display with a resolution of 3840x2160. Anxious to game on it, we pulled out our GeForce GTX Titan, 780, and 770 cards for a high-quality romp through seven of our favorite titles. What do you need to game at 4K?

Have you ever wondered whether fierce competition in the graphics hardware space could fuel such significant performance growth that we’d eventually see a bunch of high-end cards bottlenecked by games and platforms unable to fully utilize them?

Of course, taxing titles like Crysis 3, Metro: Last Light, and Arma 3 show us that developers are still very much pushing the envelope of PC gaming. Even the top thousand-dollar GPUs are quickly overwhelmed by any of those three games at their highest detail settings. Expanding out to three screens with AMD’s Eyefinity or Nvidia’s Surround technology almost necessitates a pair (or more) of potent GPUs. We’ve done plenty of lab testing at 5760x1080, and we have the hardware to push 7680x1440, should the need arise.

But there’s a growing interest in consolidating back down to a single screen using resolutions that far exceed the 2560x1440 currently available from popular QHD displays (Auria EQ276W 27" IPS Monitor Review: QHD For $400). Enter 4K UHD, sporting a native resolution of 3840x2160. Yeah, it's still 16:9. But aren't you glad to at least have an ultra-high resolution option on the PC?

Source: Wikipedia

As with any new video mode, content has to precede mass market acceptance. So you probably don’t have many friends with 4K TVs hanging in their living rooms. That isn’t a problem in the PC space, though. You can buy a UHD monitor, plop it down on your desk, and hammer away at Excel spreadsheets as if nothing changed. More relevant to today’s discussion, you’ll find yourself firing up Skyrim at 3840x2160 with your high-resolution texture pack installed, and lapping up luscious-looking visuals without the bother of bezels from a multi-screen array.

The real question then becomes: how much graphics hardware do you need to make 8.3 megapixels of surface area playable? And the answer, at least for most gamers, is going to be more than you currently have at your disposal.
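To put a number on that, here's a quick back-of-the-envelope comparison (a rough sketch; render load only scales approximately with pixel count) of the resolutions discussed here:

```
# Rough pixel-count comparison; GPU load scales roughly with pixels per frame,
# so these ratios are only a first-order estimate of relative demand.
resolutions = {
    "1920x1080 (Full HD)":  (1920, 1080),
    "2560x1440 (QHD)":      (2560, 1440),
    "5760x1080 (Surround)": (5760, 1080),
    "3840x2160 (4K UHD)":   (3840, 2160),
}

base = 1920 * 1080
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels / 1e6:.1f} MP, {pixels / base:.2f}x the pixels of 1080p")
```

At 3840x2160, a GPU has to fill roughly 8.3 million pixels per frame: four times the work of 1920x1080, and about a third more than a 5760x1080 Surround array.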

The State Of 4K Ultra HD Gaming

You’re in the market for an Ultra HD display, but don’t know which one to buy. More pressing, you aren’t sure if your current PC is up to the task of driving its native resolution in your favorite titles.

The cheaper 4K TVs you’ve seen accept a single HDMI input and are limited to 30 Hz due to the bandwidth of that interface. They're available for as little as $700, in the case of Seiki Digital's SE39UY04. Asus’ $3500 PQ321Q is one of the only 4K screens capable of 60 Hz. But in order to achieve that refresh rate, you’re forced to run either one DisplayPort or two HDMI cables between your PC and the monitor.
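Why does the interface dictate the refresh rate? A rough bandwidth estimate (a sketch that ignores blanking overhead, which adds another 10-20%) makes the limit clear:

```
# Quick sanity check on why single-link HDMI 1.4 tops out at 30 Hz for 4K.
def video_gbps(width, height, refresh_hz, bits_per_pixel=24):
    """Uncompressed video payload in Gbit/s, ignoring blanking intervals."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

HDMI_14_GBPS = 8.16    # effective payload after 8b/10b encoding (10.2 Gbit/s raw)
DP_12_GBPS   = 17.28   # DisplayPort 1.2 HBR2, four lanes, after 8b/10b

for hz in (30, 60):
    need = video_gbps(3840, 2160, hz)
    print(f"3840x2160 @ {hz} Hz needs ~{need:.1f} Gbit/s "
          f"(HDMI 1.4: {HDMI_14_GBPS}, DP 1.2: {DP_12_GBPS})")
```

An uncompressed 24-bit 3840x2160 stream at 60 Hz needs roughly 12 Gbit/s before blanking: more than HDMI 1.4 can carry, but well within DisplayPort 1.2's budget. Split the panel into two 1920x2160 halves, though, and each half fits comfortably over its own HDMI link, which is exactly what the two-cable option does.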

What? Two HDMI cables?

The PQ321Q is a tiled display, meaning its 31.5” screen actually consists of two 1920x2160 panels, stitched together. So, utilizing two display outputs on your video card drives each HDMI port independently. Alternatively, you can use one DisplayPort 1.2-compatible output, which feeds into a multi-stream transport demultiplexer that spits out two data streams.
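If you want to picture what "tiled" means in practice, here's a minimal sketch (illustrative only; this is not Asus' or Nvidia's actual code) of how a single 3840x2160 desktop coordinate lands on one of the two 1920x2160 halves:

```
# Minimal illustration of a 2x1 tiled display: the left half of the desktop
# lands on tile 0, the right half on tile 1, each a 1920x2160 panel.
TILE_WIDTH, TILE_HEIGHT = 1920, 2160

def desktop_to_tile(x, y):
    """Map a desktop pixel (x, y) on a 3840x2160 surface to (tile_index, local_x, local_y)."""
    assert 0 <= x < 2 * TILE_WIDTH and 0 <= y < TILE_HEIGHT
    tile = x // TILE_WIDTH          # 0 = left panel, 1 = right panel
    return tile, x % TILE_WIDTH, y

print(desktop_to_tile(100, 500))    # (0, 100, 500)  -> left panel
print(desktop_to_tile(3800, 500))   # (1, 1880, 500) -> right panel
```

Because each half is addressed as its own display head, the driver has to treat the monitor like a small two-screen array, which is why Nvidia configures it as a 2x1 Surround group.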

According to Nvidia, the company is using a couple of different technologies to help ensure you don’t see any tearing along the seam conjoining both panels. One, fliplock, forces each GPU to flip its frame buffer in sync. The other, scanlock, forces each GPU (and display output, or head) to display scanlines in sync. This stuff comes from the professional graphics world, where synchronization is necessary across multiple heads.

When the PQ321Q first came out, Nvidia’s drivers apparently weren’t prepared, according to reports by Ryan Shrout over at PC Perspective. The company worked on its software, resulting in the GeForce 327.19 release we have today. One improvement that came about is a change to the Extended Display Identification Data (EDID) structure that lets the monitor present itself as a tiled device to Nvidia’s driver (this gets rolled up into a standard called DisplayID v.1.3). The software uses that information to automatically configure Surround in a 2x1 configuration.

That’s not to say that, even after Nvidia's targeted update and new firmware from Asus, the PQ321Q works flawlessly. As Windows boots, you always see the splash screen squished into the left panel. And after a fresh Windows 8 installation, we were unable to apply the 327.19 driver without crashing. There was also a combination of flashing on the desktop and incorrect resolution settings when certain games started up. We encountered those inconveniences using two HDMI inputs and the DVI splitter needed for our FCAT-based testing, though. With the PQ321Q set to accept an MST stream and a DisplayPort cable linking PC to the monitor, our experience was notably better (albeit still not perfect). You’re still going to see Windows’ boot process happen on one of the two panels, and one might flicker on before the other when you start a game. But those are just artifacts of a tiled monitor. It was more odd that setting resolutions lower than 3840x2160 squished the desktop down, rather than scaling it up.

If you want to sit Ultra HD out until the display technology evolves to incorporate a single scaler, expect to wait a while before that controller hardware becomes available; it isn't here yet, and could be close to a year off. Even then, tiled panels will likely persist. Guess we'd better figure out how to make this stuff work...

Comments
  • 39
    RascallyWeasel , September 19, 2013 6:12 AM
    Is it really necessary to use anti-aliasing at this resolution? If anything, it would only hurt average FPS without really giving much of a visual increase.
  • 4
    RascallyWeasel , September 19, 2013 6:16 AM
    Would have enjoyed seeing the 79xx series take a crack at this.
  • 25
    expl0itfinder , September 19, 2013 6:22 AM
    Yep, now hold on while I go order my SLI Titans. Anyone got $2K I can borrow??
  • 8
    ubercake , September 19, 2013 6:30 AM
    Great review! It's good to see this information available.

    I know you want to leave AMD out of it since they still haven't finished fixing the runt/dropped-frame microstutter issue through the promised driver updates (actually, I thought it was all supposed to be done with the July 31 update?), but people constantly argue that AMD cards would be superior at 4K because of this or that. Maybe after they release the new flagship?

    At any rate, I won't buy a 4K 60Hz screen until the price drops under the $1K mark. I really wish they could make the higher res monitors with a faster refresh rate like 120Hz or 144Hz, but that doesn't seem to be the goal. There must be more money in higher res than in higher refresh. It makes sense, but when they drop the refresh down to 30Hz, it seems like too much of a compromise.
  • 5
    CaedenV , September 19, 2013 6:40 AM
    Hey Chris!
    So 2 GB of RAM on the 770 was not enough for quite a few games... but just how much vRAM is enough? By chance, did you peek at the usage on the other cards?

    With next-gen consoles having access to absolutely enormous amounts of memory on dedicated hardware for 1080p screens, I am very curious to see how much memory is going to be needed for gaming PCs running these same games at 4K. I still think that 8GB of system memory will be adequate, but we are going to start to need 4+GB of vRAM just at the 1080p level soon enough, which is kinda ridiculous.

    Anywho, great article! Can't wait for 4K gaming to go mainstream over the next 5 years!
  • 8
    shikamaru31789 , September 19, 2013 6:40 AM
    So it's going to be a few years and a few graphics card generations before we see 4K gaming become the standard, something that can be done on a single mid-to-high-end video card. By that time, the price of 4K TVs/monitors should have dropped to an affordable point as well.
  • 4
    Cataclysm_ZA , September 19, 2013 6:54 AM
    So no one figures that benching a 4K monitor at lower settings with weaker GPUs would be a good feature and reference for anyone who wants to invest in one soon but doesn't have anything stronger than a GTX 770? Geez, finding that kind of information is proving difficult.
  • 0
    cypeq , September 19, 2013 6:57 AM
    Cool. Yet I can't stop thinking that I could put $5,000 toward something better than a gaming rig that can smoothly run this $3,500 screen.
  • 4
    CaedenV , September 19, 2013 7:02 AM
    Quote:
    Is it really necessary to use anti-aliasing at this resolution? If anything, it would only hurt average FPS without really giving much of a visual increase.


    This is something I am curious about as well. Anandtech did a neat review a few months ago and in it they compared the different AA settings and found that while there was a noticeable improvement at 2x, things quickly became unnecessary after that... but that is on a 31" screen. I don't know about others, but I am hoping to (eventually) replace my monitor with a 4K TV in the 42-50" range, and I wonder with the larger pixels if a higher AA would be needed or not for a screen that size compared to the smaller screens (though I sit quite a bit further from my screen than most people do, so maybe it would be a wash?).

    With all of the crap math out on the internet, it would be very nice for someone at Tom's to do a real 4K review to shed some real, testable facts on the matter. What can the human eye technically see? What UI scaling options are needed? Etc. 4K is very important, as it holds real promise of being a sort of end point for resolution improvements for entertainment in the home. There is a chance for 6K to make an appearance down the road, but once you get up to 8K you start having physical-dimension issues getting a TV through the doors of a normal house, and on a computer monitor you are talking about a true IMAX experience, which could be had much cheaper with a future headset. Anywho, maybe once a few 4K TVs and monitors get out on the market we can have a sort of round-up or buyer's guide to set things straight?
  • -1
    daglesj , September 19, 2013 7:13 AM
    So those of us married, living with a partner or not still living with our parents need not apply then?

    I think there is a gap in the market for an enthusiast PC website that caters to those who live in the real world with real-life budgets.
  • 1
    mapesdhs , September 19, 2013 7:25 AM

    Just curious Chris, with the CPU not OC'd, are you sure there are no CPU bottlenecks going on anywhere? Wondering whether an OC'd 4960X (as I'm sure most who'd buy that chip would do) could help in any of the test scenarios, in particular Crysis 3, though I see you do highlight Skyrim as being one test that's platform-bound.

    Ian.

  • 6
    CaedenV , September 19, 2013 7:36 AM
    Quote:
    So no one figures that benching a 4K monitor at lower settings with weaker GPUs would be a good feature and reference for anyone who wants to invest in one soon but doesn't have anything stronger than a GTX 770? Geez, finding that kind of information is proving difficult.


    There are a few reasons:
    1) If you can afford a $3,000 TV then you ought to be able to afford a decent GPU or two, making your argument seem kinda silly.

    2) More resolution makes detail MUCH more important. If you have an image that is (pulls number from ass) 100x100 pixels then that image will always look its best at that native 100x100 resolution. You can take that image and display it at a lower resolution (say 50x50 pixels) because you are displaying less information than is in the source material. But there is only so much that can be done to display that image at a higher resolution than the source (say 200x200 pixels). You can stretch things out and use AF on it, but at the end of the day you end up with a texture that looks flat, chunky, and out of place.
    We are playing games today that are either console ports aimed at 720p, or native PC games aimed at 1080p. Neither of these is anywhere near 4K resolution, and so an 'ultra' setting for any game out today designed around these resolutions is really a 'basic' setting for what a 4K TV is really capable of. The true 'ultra' test is simply not possible until we get some much larger texture packs designed with 4K in mind.

    3) While some performance can be gained back by dropping a bit of AA and AF, the vast bulk of the performance requirement is dictated by the raw amount of vRAM required, and the sheer 8MP image you are making 30-60 times a second (compared to the 2MP image of a 1080p display).

    4) Next-gen consoles are right around the corner and will be loaded with tons of RAM. This ridiculous amount of RAM is available because next-gen games are going to have much higher resolution textures, and a wider variety of them. On top of that we are going to see a lot more 'clutter' in games to make environments much more unique. All of these objects are going to have their own textures and physics to calculate, which means that, yet again, today's 'ultra' settings are simply the 'basic' setting of what is coming in just one year.


    So if you want to do 4K gaming then you need to be able to afford the monitor and a dual-GPU setup, and be prepared to replace that dual-GPU setup in a year or two when next-gen games simply become far too much for today's GPU capabilities. However, you do not need this raw horsepower to run a desktop, or to watch 4K video, as even today's onboard GPUs can handle those tasks just fine at 4K. But if you want to be on the bleeding edge, you are simply going to have to bleed a bit, or else be like the rest of us and wait another year (or three) until the price drops and the GPUs catch up.
  • 8
    vertexx , September 19, 2013 7:37 AM
    Great article, but what I really want to know is...

    How AWESOME was it playing BF3 or Crysis 3 with dual Titans at 4K?? Is it better than 3x 1080p in Surround? How much so? It's like you just had a morning in a Ferrari Enzo at Laguna Seca and just showed us charts of max G's and velocity time variance. I want to know what it's like to drive that rig!

    And get some hi-res packs plus ENB for running Skyrim already!!
  • -3
    DBGT_87 , September 19, 2013 7:43 AM
    Why do I have to spend money to lower my FPS?
  • 2
    cheesyboy , September 19, 2013 7:52 AM
    It's kicking off between AMD and Nvidia on this 4K business, with PC Perspective basically being accused of being an Nvidia shill:
    http://www.brightsideofnews.com/news/2013/9/18/nvidia-launches-amd-has-issues-marketing-offensive-ahead-of-hawaii-launch.aspx

    and the offending article;
    http://www.pcper.com/reviews/Graphics-Cards/Frame-Rating-Eyefinity-vs-Surround-Single-and-Multi-GPU-Configurations/AMD-Ey

    And the twitter note from Roy Taylor of AMD;
    https://twitter.com/amd_roy
  • 0
    Roger Wilco , September 19, 2013 8:00 AM
    I guess I might take the plunge, maybe in a few weeks!! :D 
    As far as AA goes, ya, I have a ZR30W and AA makes gaming more "comfortable" on the eyes. It's already 2560x1600, but with AA on, the difference can be seen.
  • 4
    vmem , September 19, 2013 8:10 AM
    Until they put an HDMI 2.0 port in both the displays and my GPU (or have the display support 4K at 60 Hz through DisplayPort), I am staying out of this 4K business.

    Cost aside, I'm not going to spend top dollar on something that essentially runs as synced split-screen and requires some sort of SLI or CrossFire system to get playable rates. By the time GPU technology advances enough, we can probably get a better-quality 4K OLED for the cost of that Asus panel.
  • -1
    master9716 , September 19, 2013 8:10 AM
    Higher res won't do much for gaming. I'd rather have 5760x1080 ultra-wide viewing angles at 120 Hz than a single 4K display.
  • 3
    cangelini , September 19, 2013 8:13 AM
    Quote:

    Just curious Chris, with the CPU not OC'd, are you sure there are no CPU bottlenecks going on anywhere? Wondering whether an OC'd 4960X (as I'm sure most who'd buy that chip would do) could help in any of the test scenarios, in particular Crysis 3, though I see you do highlight Skyrim as being one test that's platform-bound.

    Ian.



    Hey Ian,
    I was expecting this to be graphics-limited across the board. Skyrim didn't quite surprise me. I would have thought Grid 2 would have been the next-most-likely to demonstrate a processor bottleneck. Crysis 3, particularly at those higher settings, seems less likely to be platform-bound. Great idea for a follow-up, though (same for the suggestion that we evaluate quality without AA to see if it's perceived as necessary with 8.3 MP--thanks for that one).
  • 0
    cangelini , September 19, 2013 8:14 AM
    Quote:
    Until they put an HDMI 2.0 port in both the displays and my GPU (or have the display support 4K at 60 Hz through DisplayPort), I am staying out of this 4K business.

    Cost aside, I'm not going to spend top dollar on something that essentially runs as synced split-screen and requires some sort of SLI or CrossFire system to get playable rates. By the time GPU technology advances enough, we can probably get a better-quality 4K OLED for the cost of that Asus panel.


    You'd be fine with DisplayPort, too. The limitation isn't the interface, it's the hardware inside the monitor.