Maximum Screen Real Estate With A Minimum Of Cash
If you're not familiar with multi-monitor gaming, the concept is simple: three displays connected side-by-side are used as one large screen by the graphics subsystem, giving you a wide view of the environment and pulling you deeper into the game.
Let's consider the advantages of three 1080p monitors over a single 4K panel. First, a triple-screen setup offers greatly improved peripheral vision, which aligns with the way human beings process visual information. Second, three 1920x1080 monitors cumulatively have one-quarter fewer pixels than an Ultra HD monitor at 3840x2160, translating to a lighter graphics load and, ultimately, higher frame rates. Finally, and this is where your budget benefits, you can pick up three new 20-23” LCDs for less than $400. Meanwhile, a 4K display starts in the $500 range, and that's for a 30Hz screen; if you want 60Hz, the price tag climbs even higher. As a bonus, a multi-monitor setup also shines when it comes to productivity.
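The pixel math behind that frame-rate claim is easy to verify yourself; here's a quick sketch in Python:

```python
# Compare the rendering load of three 1080p panels against one 4K panel.
triple_1080p = 3 * 1920 * 1080   # three Full HD monitors side-by-side
uhd_4k = 3840 * 2160             # a single Ultra HD display

print(f"Triple 1080p: {triple_1080p:,} pixels")   # 6,220,800
print(f"4K UHD:       {uhd_4k:,} pixels")         # 8,294,400
print(f"Triple 1080p renders {1 - triple_1080p / uhd_4k:.0%} fewer pixels")
```

Fewer pixels per frame means less work for the GPU, which is why a sub-$150 card has a fighting chance at driving three screens.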
What about the negatives? There is more equipment involved in a triple-monitor setup, obviously. The panels not only take up more space, but are also more difficult to arrange than a single screen on your desk. Not all games are designed with multi-monitor compatibility in mind. There are also some detractors who think gamers become less effective because the extra screen real estate is distracting, though I don't agree. In my opinion, the positives greatly outweigh the negatives. More screens equal more fun!
Since we know that three 1080p displays are usually cheaper than a 4K monitor, we're pursuing the budget-oriented approach to multi-monitor gaming. Today, we benchmark two sub-$150 graphics cards to see if they can suitably drive a trio of monitors.
Budget Multi-Monitor Graphics Cards
Gigabyte supplied both of the budget graphics cards for this story, one Radeon R7 260X and one GeForce GTX 750 Ti. Physically, they appear remarkably similar. The easiest way to tell them apart is that the Radeon has a CrossFire connector on top of the card, while the GeForce lacks an SLI connector. They are roughly the same size, built on blue PCBs and topped with similar cooling solutions. The rear I/O brackets even come close to matching. One difference that surprised me was the GeForce card's two HDMI connectors, compared to the Radeon's HDMI and DisplayPort outputs.
The GeForce GTX 750 Ti sports 640 CUDA cores and 2GB of GDDR5 on a 128-bit bus. Expect to find the card selling for just under $150, though you can find rebates to bring the price down. An efficient architecture is perhaps the GPU's best-known advantage. In fact, Nvidia's reference card doesn't even need an auxiliary power input. Gigabyte's version does have a six-pin input, though, which could add overclocking headroom.
In the other corner, we have AMD's Radeon R7 260X with 896 Stream processors and 2GB of GDDR5, also on a 128-bit bus. You'll find it around $130, and can get a better deal by searching out rebates and bundled games. Whereas Nvidia's multi-monitor support is branded as Surround, AMD's is called Eyefinity. Both work well after years of improvements, though Eyefinity is not quite as easy to configure (it does, however, offer more flexibility when it comes to monitors with different resolutions).