How Realistic Is Multi-Monitor Gaming On A Budget?

Maximum Screen Real Estate With A Minimum Of Cash

If you're not familiar with multi-monitor gaming, the concept is simple: three displays connected side-by-side are used as one large screen by the graphics subsystem, giving you a wide view of the environment and pulling you deeper into the game.

Let's consider the advantages of three 1080p monitors over a single 4K panel. First, a triple-screen setup offers greatly improved peripheral vision, which aligns with the way human beings process visual information. Second, three 1920x1080 monitors cumulatively have one-quarter fewer pixels than an Ultra HD monitor at 3840x2160, translating to a lighter graphics load and, ultimately, higher frame rates. Finally, and this is where budget comes in, you can pick up three new 20-23” LCDs for less than $400, while a 4K display starts in the $500 range. And that's for a 30Hz screen; if you want 60Hz, the price tag climbs even higher. As a bonus, a multi-monitor setup also shines when it comes to productivity.
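The pixel math behind that "one-quarter fewer" claim is easy to sanity-check. A quick sketch in Python:

```python
# Total pixels rendered: three 1080p panels vs. one Ultra HD (4K) panel.
triple_1080p = 3 * 1920 * 1080   # 6,220,800 pixels across three screens
uhd_4k = 3840 * 2160             # 8,294,400 pixels on one screen

savings = 1 - triple_1080p / uhd_4k
print(f"Triple 1080p: {triple_1080p:,} pixels")
print(f"4K UHD:       {uhd_4k:,} pixels")
print(f"Triple 1080p renders {savings:.0%} fewer pixels")  # → 25% fewer
```

Fewer pixels per frame means the GPU has proportionally less rasterization and shading work to do, which is why the same budget card can push higher frame rates across three 1080p screens than on a single 4K panel.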

What about the negatives? There is more equipment involved in a triple-monitor setup, obviously. The panels not only take up more space, but are also harder to arrange than a single screen on your desk. Not all games are designed with multi-monitor compatibility in mind. Some detractors also argue that the extra screen real estate is distracting enough to make gamers less effective, though I don't agree. In my opinion, the positives greatly outweigh the negatives. More screens equal more fun!

Since we know that three 1080p displays are usually cheaper than a 4K monitor, we're pursuing the budget-oriented approach to multi-monitor gaming. Today, we benchmark two sub-$150 graphics cards to see if they can suitably drive a trio of monitors. 

Budget Multi-Monitor Graphics Cards

Gigabyte supplied both of the budget graphics cards for this story: one Radeon R7 260X and one GeForce GTX 750 Ti. Physically, they look remarkably similar. The easiest way to tell them apart is that the Radeon has a CrossFire connector on top of the card, while the GeForce lacks an SLI connector entirely. They are roughly the same size, built on blue PCBs, and topped with similar cooling solutions. Even the rear I/O brackets come close to matching. One difference that surprised me: the GeForce card exposes two HDMI connectors, whereas the Radeon pairs one HDMI output with a DisplayPort output.

Nvidia GeForce GTX 750 Ti

The GeForce GTX 750 Ti sports 640 CUDA cores and 2GB of GDDR5 on a 128-bit bus. Expect to find the card selling for just under $150, though rebates can bring the price down. An efficient architecture is perhaps the GPU's most notable advantage; in fact, Nvidia's reference card doesn't even need an auxiliary power input. Gigabyte's version does include a six-pin connector, though, which could help overclocking headroom.

AMD Radeon R7 260X

In the other corner, we have AMD's Radeon R7 260X with 896 Stream processors and 2GB of GDDR5, also on a 128-bit bus. You'll find it around $130, and you can get a better deal by hunting down rebates and bundled games. Whereas Nvidia's multi-monitor support is branded Surround, AMD's is called Eyefinity. Both work well after years of refinement; Eyefinity is perhaps not as easy to configure, but it offers more flexibility when mixing monitors of different resolutions.

  • AndrewJacksonZA
    Thank you very, very much for this article Jason. I also thought that Eyefinity/Surround was only for the rich. This might be affecting my upgrade decision. :-)
  • leeb2013
Great article. Triple-monitor gaming was one of the best things I did; it nicely fills your horizontal FOV. Shame some recent AAA games still don't support it! Three second-hand Samsung monitors at 70 bucks each, plus an R9 290, which is even cheaper now. I get a fairly solid 60fps with low AA and occasionally have to drop the quality settings for these recent poor console ports we're getting.
  • CaedenV
It's neat to see this working so well on a more budget-conscious system. I remember back when this tech was first getting started; even at the high end of the market it seemed like much more work than it was worth.

For my work computer I will never again go back to having fewer than 2 displays (though the 3rd tends to get significantly less use, it is handy to have at times). When working with lots of office apps and web browsers it is extremely nice to have everything up at once where you can see it. Even when doing work at home I tend to use my desktop display as a 2nd screen for my gutless laptop rather than using the workhorse gaming/editing rig (plus, the laptop can't game... so it's less distracting).
    But for gaming, I absolutely prefer a single large high-quality monitor to having 3 'normal' sized ones. I mean, if I could afford 3 high-end displays (and the GPU horsepower to drive them) then I would absolutely go for that. But as a general rule of thumb, at a given budget I find that having a single display that is as large and as nice as possible is much more enjoyable than having 3 mediocre displays.
  • Grognak
    Nearly $900 before buying the screens and GPU. I guess we have different definitions of "budget". Don't know why you would bother getting multi-monitor with a cheap GPU, too; as your benchmarks show, most games get barely playable FPS, and that's with low settings. It was interesting but it really doesn't convince me that this kind of rig is affordable.
  • simon4ok
    I am really interested in knowing how did you manage to connect the monitors to the graphics card? Given that you only had 2 DVIs, HDMI and DisplayPort, which ports did you use for the 3 monitors? And did you convert DVI to HDMI or the other way around? And does that make any difference on these resolutions (1920x1080 per monitor)?
  • eldragon0
wh3resmycar, I'm assuming you mean for a budget system. That is my build, and I'd never drop below 4 monitors again. Ultra-wide is OK for gaming, but for productivity, multiple screens are needed.
  • damianrobertjones
    @wh3resmycar: Why do you feel the need to insult people? I have an ultrawide Dell U2913wm and, in all honesty, I'd gladly move back to a 3xscreen setup. Work can be done, left alone on its own screen and then... gaming.

    Just because YOU do not see the need for something doesn't make the people that do idiots.
  • marciocattini
Man, I'm left wondering whether this article should have included dual and triple SLI/CrossFire with the same cards to see if you could achieve a high level of detail...
I mean, three Radeon 260Xs or GeForce 750 Tis is still pretty cheap compared to a high-end card!!
  • Sakkura
    You might want to explain why there's no FCAT data, only the highly unreliable benchmark data from Fraps.
  • Evolution2001
I'm not entirely sure why you even need to bother with AMD's and Nvidia's config software. Both Win7 and 8 natively support three-monitor setups, even with monitors of different resolutions/sizes.
    I'm at work in my cubicle and I'm currently writing this on a basic HP Probook 6575b which uses the AMD A6-4400M APU (Radeon HD7520G video chipset) connected to HP's docking station. I have (2) Samsung B2230's side by side and an HP P221 hanging on the wall in a portrait orientation.
    Using basic Windows configuration, I have the P221 monitor offset so bottom portion is about 2" above the bottom of the Samsungs, and extends about 10" above. Even though the physical screen of the P221 does not align with the Samsungs, when I drag windows across the monitors, they stay exactly lined up; they don't get that disconnected offset. (I realize it may be hard to visualize).

The point is that Windows' native screen resolution settings let you do the basics needed to configure a three-monitor setup, including offset and orientation. I think what throws off most people is that they don't realize that, in the screen resolution settings, you can drag your monitors around and reorder and align them as needed.

One other tip... rather than have your monitors' bezels next to each other, find your normal seating position, then move each outer monitor so its bezel sits directly behind the middle monitor's bezel. Effectively, you should be able to achieve the visual perception of a single bezel divide instead of a double-wide one.