
Eyefinity: A Tangible Benefit, Today

ATI Radeon HD 5870: DirectX 11, Eyefinity, And Serious Speed
By Fedy Abi-Chahla

When you run down the spec sheet of the fastest single graphics card in the world, display outputs just don’t jump out at you as the reason to sink nearly $400 into a high-end GPU. If you only use one monitor, that’s understandable. But I'll tell you right now that you’re missing out. For several years I worked across four LCDs, and always had at least one window up on each. I’ve since stepped back to three LCDs, but even as I type, I have 10 Firefox windows open, this story in Word, Outlook, two emails, Excel, Adobe Acrobat, and a Trillian conversation—more content than will even fit on three displays. In order to achieve this, I have an embarrassing combination of Radeon HD 4850 and Radeon X1650 cards plugged in (hey, I need everything else for benchmarking, and my Quadro NVS 440 doesn’t game).

H.A.W.X., 5760x1200 across three 24” displays.

ATI’s Eyefinity technology is the answer. The days of two DACs and a pair of TMDS transmitters constituting the entire display pipeline are gone. The Radeon HD 5870 features three independent display outputs, divided out across two dual-link DVI ports, an HDMI output, and a DisplayPort connector. In a standard desktop environment, you’d use the two DVI and single DisplayPort outputs.

DisplayPort is actually the enabler behind all of this. Unlike DVI, DisplayPort doesn’t require a dedicated clocking source for each output. So, all of the chips in ATI’s Evergreen family (with the exception of the lowest-end models) have the built-in capability to drive as many as six DisplayPort outputs, thanks to six on-die display controllers. The Radeon HD 5870, specifically, taps Cypress’ two internal clocking sources, plus one of the DisplayPort pipelines, to yield its three independent display outputs. The above diagram shows the output configurations Eyefinity is capable of supporting.

That’s cool and all, but it’s nothing compared to the Radeon HD 5870 Eyefinity⁶ Edition, which takes full advantage of the GPU’s DisplayPort pipelines. Using six mini-DisplayPort outputs, the card lets you drive as many monitors—right up to the largest 30” LCDs. Theoretically, Eyefinity supports up to an 8192x8192 maximum aggregate screen resolution. But a sextet of 30 inchers running 2560x1600 really only totals 7680x3200. You can expect the Eyefinity⁶ card to launch sometime between now and the end of the year. By the time that happens, you can expect to see Samsung ultra-thin bezel LCDs that make the break between monitors in a multi-display configuration less jarring.
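The resolution arithmetic above is easy to sanity-check. Here’s a minimal Python sketch (the grid layouts and panel resolutions come from this article; the function names are my own illustration, not anything from ATI’s software):

```python
# Aggregate resolution of an Eyefinity single large surface, assuming
# an ideal, bezel-free tiling of identical panels.

MAX_SURFACE = (8192, 8192)  # Eyefinity's stated maximum aggregate surface

def aggregate(cols, rows, width, height):
    """Total surface resolution for a cols x rows grid of panels."""
    return (cols * width, rows * height)

def fits(surface):
    """Does a surface stay within the 8192x8192 theoretical ceiling?"""
    return surface[0] <= MAX_SURFACE[0] and surface[1] <= MAX_SURFACE[1]

# Three 24" 1920x1200 panels side by side, as tested here:
print(aggregate(3, 1, 1920, 1200))  # (5760, 1200)

# Six 30" 2560x1600 panels in a 3x2 grid (the Eyefinity6 scenario):
six = aggregate(3, 2, 2560, 1600)
print(six, fits(six))  # (7680, 3200) True
```

As the output shows, even six 30” panels at 2560x1600 stay comfortably under the 8192x8192 ceiling.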

Using Eyefinity

Alright, I feel silly typing Eyefinity over and over, so it’s Radeon HD 5870 from here on out—hopefully by now you know the card gives you three display outputs.

ATI was generous enough to let me borrow a trio of gorgeous Dell U2410s, which I set up in the one place certain to get me hit in the head by my wife: the dining room table. It was so worth it, though, and the three-output thing isn’t even new to me. I saw ATI’s display technology at the press briefing for the 5870, but it’s entirely different to sit down in front of three 24” LCDs in your own home and play games, run benchmarks, and mess with settings.

Configuring three displays to run in single large surface mode is intuitive under Windows 7 (it’ll also work under Windows Vista and Linux). I set my desktop to 5760x1200 and tiled the background to keep things familiar. This is the way you’d want to game, but it’s not the way you would want to use a computer. In SLS mode, you literally have the equivalent of one huge desktop. Maximize Outlook and you span the app across the whole surface. ATI says you can set hotkeys to toggle between SLS and independent display modes, but this must not be as intuitive, because I couldn’t make it happen. Criticism number one: there needs to be a smoother way to swap between an independent display “productivity” mode and gaming across the single large surface.

My first order of business after getting the desktop configured was firing up H.A.W.X. The relatively fluid flight sim lends itself to the lower frame rates experienced at 5760x1200. With High detail settings and anti-aliasing turned off, I was still getting 44 frames per second. If you’ve had a hard time getting into the game up until now, this is the way to do it. The visual experience is truly stunning. I immediately started running through the other games in our benchmark suite:

Game Benchmarks, Single Radeon HD 5870 @ 5760x1200 (No AA / No AF), in FPS

H.A.W.X.                      44
Far Cry 2                     54.12
Left 4 Dead                   81.04
S.T.A.L.K.E.R.: Clear Sky     28.8
World in Conflict             Crash
Resident Evil 5               Not Compatible
Grand Theft Auto IV           29.61
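To put those numbers in perspective, here’s a quick sketch that sorts the results from the table above against a 30 FPS playability bar (the threshold is my own rule of thumb for an illustration, not anything ATI specifies):

```python
# Benchmark results from the table above, single Radeon HD 5870 at
# 5760x1200 with no AA/AF; titles that crashed or weren't compatible
# are omitted.
results = {
    "H.A.W.X.": 44.0,
    "Far Cry 2": 54.12,
    "Left 4 Dead": 81.04,
    "S.T.A.L.K.E.R.: Clear Sky": 28.8,
    "Grand Theft Auto IV": 29.61,
}

THRESHOLD = 30.0  # an assumed playability bar, not an official figure

playable = sorted(name for name, fps in results.items() if fps >= THRESHOLD)
print(playable)  # ['Far Cry 2', 'H.A.W.X.', 'Left 4 Dead']
```

S.T.A.L.K.E.R. and Grand Theft Auto IV land just under that bar, which is exactly why a second card in CrossFire would be so welcome here.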


Originally, I had planned for that to be a three-column chart; when I saw S.T.A.L.K.E.R. at 28 frames and Grand Theft Auto IV at 29 frames, my first thought was: now here’s a reason to buy two Radeon HD 5870s. But CrossFire doesn’t yet work. ATI says its driver team is looking at the issue, but there is no ETA on when a pair of 5870s might be used to bolster performance further. Criticism number two: no CrossFire? Really?

For the time being, games like H.A.W.X., Far Cry 2, and Left 4 Dead are at least playable on a single Radeon HD 5870, which is still amazing when you consider that the last triple-head graphics card to cross our test bench was Matrox’s Parhelia, a card that ambitiously paired triple-output gaming with an underpowered graphics processor. But you do have to turn off extras like anti-aliasing in order to maintain reasonable frame rates with the 5870.

Here’s my last nit-pick: the 1” of bezel between each pair of displays isn’t really an issue when you’re working on the Windows desktop, but it’s certainly more distracting in gaming environments. Samsung’s ultra-thin bezels can’t come soon enough. When they do (in their single, 3x1, and 3x2 configurations), Eyefinity will become that much sexier.
