ATI Radeon HD 5870: DirectX 11, Eyefinity, And Serious Speed

Eyefinity: A Tangible Benefit, Today

When you go down the spec sheet of the world’s fastest single graphics card, display outputs just don’t jump out at you as the reason to sink nearly $400 into a high-end GPU. If you only use one monitor, that’s understandable. But I'll tell you right now that you’re missing out. For several years I worked across four LCDs, and always had at least one window up on each. I’ve since stepped back to three, but even as I type, I have 10 Firefox windows open, this story in Word, Outlook, two emails, Excel, Adobe Acrobat, and a Trillian conversation: more content than will even fit on three displays. To achieve this, I have an embarrassing combination of Radeon HD 4850 and Radeon X1650 cards plugged in (hey, I need everything else for benchmarking, and my Quadro NVS 440 doesn’t game).

H.A.W.X., 5760x1200 across three 24-inch displays.

ATI’s Eyefinity technology is the answer. The days of two DACs and a pair of TMDS transmitters constituting the entire display pipeline are gone. The Radeon HD 5870 features three independent display outputs, divided out across two dual-link DVI ports, an HDMI output, and a DisplayPort connector. In a standard desktop environment, you’d use the two DVI and single DisplayPort outputs.

DisplayPort is actually the enabler behind all of this. Unlike DVI, DisplayPort doesn’t require a dedicated clocking source for each output. So, all of the chips in ATI’s Evergreen family (with the exception of the lowest-end models) have the capability built-in to drive as many as six DisplayPorts given six on-die display controllers. The Radeon HD 5870, specifically, taps Cypress’ two internal clocking sources, plus one of the DisplayPort pipelines, to yield its three independent display outputs. The above diagram shows the output configurations Eyefinity is capable of supporting.

That’s cool and all, but it’s nothing compared to the Radeon HD 5870 Eyefinity⁶ Edition, which takes full advantage of the GPU’s DisplayPort pipelines. Using six mini-DisplayPort outputs, the card lets you drive that many monitors, right up to the largest 30-inch LCDs. Theoretically, Eyefinity supports an aggregate screen resolution of up to 8192x8192, but a sextet of 30-inchers running 2560x1600 in a 3x2 array only totals 7680x3200. You can expect the Eyefinity⁶ card to launch sometime between now and the end of the year. By the time it does, you can also expect to see Samsung ultra-thin-bezel LCDs that make the break between monitors in a multi-display configuration less jarring.
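The aggregate-resolution math is easy to sanity-check. Here’s a minimal sketch (the grid layouts and the 8192x8192 cap come from the text above; the helper name is my own, purely for illustration):

```python
def aggregate_resolution(cols, rows, width, height, cap=8192):
    """Total single-large-surface resolution for a cols x rows grid of
    identical monitors, checked against Eyefinity's stated 8192x8192 cap."""
    total_w, total_h = cols * width, rows * height
    if total_w > cap or total_h > cap:
        raise ValueError(f"{total_w}x{total_h} exceeds the {cap}x{cap} limit")
    return total_w, total_h

# Three 24" 1920x1200 panels side by side, as tested in this article:
print(aggregate_resolution(3, 1, 1920, 1200))   # (5760, 1200)

# Six 30" 2560x1600 panels in a 3x2 array (Eyefinity6):
print(aggregate_resolution(3, 2, 2560, 1600))   # (7680, 3200)
```

Note that even the six-panel worst case stays comfortably inside the cap; width, not height, is the dimension most likely to hit the limit first.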

Using Eyefinity

Alright, I feel silly typing Eyefinity over and over, so it’s Radeon HD 5870 from here on out—hopefully by now you know the card gives you three display outputs.

ATI was generous enough to let me borrow a trio of gorgeous Dell U2410s, which I set up in the one place certain to get me hit in the head by my wife: the dining room table. It was so worth it, though, and the three-output thing isn’t even new to me. I saw ATI’s display technology at the press briefing for the 5870, but it’s entirely different to sit down in front of three 24” LCDs in your own home and play games, run benchmarks, and mess with settings.

Configuring three displays to run in single large surface mode is intuitive under Windows 7 (it’ll also work under Windows Vista and Linux). I set my desktop to 5760x1200 and tiled the background to keep things familiar. This is the way you’d want to game, but it’s not the way you would want to use a computer. In SLS mode, you literally have the equivalent of one huge desktop. Maximize Outlook and you span the app across the whole surface. ATI says you can set hotkeys to toggle between SLS and independent display modes, but this must not be as intuitive, because I couldn’t make it happen. Criticism number one: there needs to be a smoother way to swap between an independent display “productivity” mode and gaming across the single large surface.

My first order of business after getting the desktop configured was firing up H.A.W.X. The relatively fluid flight sim lends itself to the lower frame rates you’ll see at 5760x1200. With High detail settings and anti-aliasing turned off, I was still getting 44 frames per second. If you’ve had a hard time getting into the game up until now, this is the way to do it. The visual experience is truly stunning. I immediately started running through the other games in our benchmark suite:

Game Benchmarks, Single Radeon HD 5870 @ 5760x1200 (No AA / No AF), in FPS
- Far Cry 2
- Left 4 Dead
- S.T.A.L.K.E.R.: Clear Sky
- World in Conflict
- Resident Evil 5: Not Compatible
- Grand Theft Auto IV
Originally, I had planned for that to be a three-column chart. When I saw S.T.A.L.K.E.R. at 28 frames and Grand Theft Auto IV at 29, my first thought was: now here’s a reason to buy two Radeon HD 5870s. But CrossFire doesn’t yet work. ATI says its driver team is looking at the issue, but there is no ETA on when a pair of 5870s might be used to bolster performance further. Criticism number two: no CrossFire? Really?

For the time being, games like H.A.W.X., Far Cry 2, and Left 4 Dead are at least playable on a single Radeon HD 5870, which is still amazing when you consider that the last triple-head graphics card to cross our test bench was Matrox’s Parhelia, a card that ambitiously paired triple-output gaming with an underpowered graphics processor. But you do have to turn off extras like anti-aliasing in order to maintain reasonable frame rates with the 5870.

Here’s my last nit-pick: the 1” of bezel between each pair of displays isn’t really an issue when you’re working on the Windows desktop, but it’s certainly more distracting in gaming environments. Samsung’s ultra-thin bezels can’t come soon enough. When they do (in their single, 3x1, and 3x2 configurations), Eyefinity will become that much sexier.

  • hispeed120
    I'm. So. Excited.
    Can't wait
  • crosko42
    So it looks like 1 is enough for me.. Dont plan on getting a 30 inch monitor any time soon.
  • jezza333
    Looks like the NDA lifted at 11:00PM, as there's a load of reviews now just out. Once again it shows that AMD can produce a seriously killer card...

    Crysis 2 on an x2 of this is exactly what I'm waiting for.
  • woostar88
    This is incredible at the price point.
    Err... I thought I was going to see more for the price. Regardless, I think ATI missed the mark here. I am interested in playing games on my HDTV, since me and my monitor don't care about these higher resolutions. Fail cakes... Nvidia is undoubtedly going to trounce ATI in performance with the 300 series. This is good news for mainstream prices, however... you can probably upgrade to a current DX10 board soon for a very good price, and then buy a 5850 for $100 a year from now. Result? Don't buy a 5000 series card until the price comes down. Heh, I bet the cards will be $100 less in December if the 300 series launches.

    This is not to say I am an Nvidia fan, just undoubtedly you would do well for yourself to hold off for a bit if you want to buy a 5000 series... as the price will come down for a good price/performance ratio soon enough.
  • tipmen
    wait, wait, before I look can it play cry... HOLY SHIT?!
  • viper666
    why didn't they test it against a GTX 295 rather than a 280??? its far superior...
  • cangelini
    viper666: why didn't they test it against a GTX 295 rather than a 280??? its far superior...

    Ran it against a GTX 295 and a 285 and 285s in SLI :)
  • Annisman
    I refuse to buy until the 2GB versions come out, not to mention newegg letting you buy more than 1 at a time, paper launch ftl.
  • jasperjones
    Thanks for the timely review. I have to say though, some of the technical details are beyond me. It'd be useful if you explained terms such as "VLIW architecture" or "tessellation engine"
  • viper666
    oh my bad... didn't see the rest of the pages :)
  • megamanx00
    O M F G!!!!!!!!!!!!!!!!!!!!!!!!!!!!!

    Just wish the darn thing wasn't so big, but man, what a card! Now I'm thinking about a bigger case :D
  • Annisman
    Oops, who am I kidding ? I just ordered 2 5870's. One Sapphire, and one HIS, seeing as how they limit you to one per customer.
  • falchard
    I think most of this review has to do with how many games are optimized for nVidia. The Crytek Engine 2.0 and Source Engine are well known for heavily favoring nVidia architecture, yet they compose the bulk of the benchmarks. I think the fact that ATI can do best in these engines, which practically detect ATI and instantly nerf its performance, speaks volumes about the actual card.
  • tipmen
    Another thing is that the 5800x2 isn't out yet, now think of two of those bad boys in Crossfire.
  • blackbyron
    Not bad for Crysis benchmark. I really want 5870 for my christmas present, but damn I also need to buy a new PSU.
  • blackbyron
    In addition, I am impressed that the 5870 has better power consumption and better gaming performance compared to DX10 cards. If the card is affordable, I'd definitely buy one.
  • cangelini
    jasperjones: Thanks for the timely review. I have to say though, some of the technical details are beyond me. It'd be useful if you explained terms such as "VLIW architecture" or "tessellation engine"

    TBH, the architectural details are secondary to how the card performs. However, if you'd like a better idea of what tessellation can do for you, check out the picture of the Alien on page six!
  • megamanx00
    Now I wanna see a review of these cards in 4-way crossfire against say triple and 4-way SLI. Of course the power draw and heat would probably be insane :D.
  • bk420
    It looks good so far, but the 5870X2 will be my money's worth :D
  • Proximon
    Thanks Chris,

    I thought your conclusion was well balanced and stated more clearly than the other guys'... who got their reviews out first. Everyone does seem to agree more or less on performance.
  • Card is ffs huge, 8800 Ultra huge, and the Ultra had extra length so you could strap it to a HDD cage or some shit haha.

    Card looks good; its price seems reasonable. Too bad these are only DX10 benchmarks, and we all assume it runs DX9 no problem. But I'll hold out till more DX11 cards arrive and Nvidia gives me something to choose between. Last time I bought in haste of DX10, an 8800 GTS 640MB, and got screwed a month or two later when they released the 8800 GT, which performed better and was cheaper. Also, back then ATI's 2000/3000 enthusiast series was nearly a joke in benchmarks compared to Nvidia. So I've learned to hold out a bit; this shit ain't cheap enough for me to buy, sell, buy, sell.