Sapphire Toxic HD 7970 GHz Edition Review: Gaming On 6 GB Of GDDR5

Building An Eyefinity-Capable System

Eyefinity and Six Monitors

Driving several displays from one Radeon normally means leaning on DisplayPort, but most inexpensive monitors only have DVI and HDMI inputs. AMD did sell a six-output Radeon HD 5870 Eyefinity 6 Edition, but we haven't seen a card with six outputs since then.

Because we're working with a Radeon HD 7970 from Sapphire, we reached further into our bag of tricks and produced three Sapphire Vid-2X splitters (we also chose not to think about what a setup like this costs).

We used one Vid-2X (PSE-DV2185) to split the 7970's dual-link DVI output across two monitors and two Vid-2Xes (PSE-DP4196) to do the same for its mini-DisplayPort connectors. Both models offer twin DVI outputs, letting us save some money on the monitors.

The Vid-2X video splitters are based on VESA's Plug & Play standard. They show up as one large display to the graphics card and then divide its output into two signals, one per monitor. One advantage they confer is that they're not bound by a panel's native resolution: in certain games it might make sense to drop to a lower resolution to help performance, even though on the Windows desktop the picture quality degrades too much to make that worthwhile.
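
To put the splitter arithmetic in concrete terms, here is a minimal sketch, not from the article, of the resolutions involved. It assumes six 1920x1080 panels and that each Vid-2X stacks its two panels vertically into one logical display; the panel size, grid layout, and combine direction are illustrative assumptions, not a statement of how the test rig was actually wired.

```python
# Minimal sketch (not from the article): rough resolution math for the
# three-splitter, six-monitor Eyefinity setup described above. Panel size,
# grid layout, and the splitters' combine direction are assumptions.

PANEL_W, PANEL_H = 1920, 1080        # assumed 1080p panels

def splitter_display(stacked=True):
    """Resolution one Vid-2X presents to the GPU for its two panels."""
    if stacked:                       # two panels combined vertically
        return PANEL_W, PANEL_H * 2
    return PANEL_W * 2, PANEL_H       # or combined side by side

def eyefinity_surface(outputs=3, stacked=True):
    """Total render surface when the three logical displays are grouped."""
    w, h = splitter_display(stacked)
    return w * outputs, h             # Eyefinity spans the outputs horizontally

if __name__ == "__main__":
    w, h = eyefinity_surface()
    print("Per-splitter logical display:", splitter_display())       # (1920, 2160)
    print(f"Six-screen surface: {w} x {h} ({w * h / 1e6:.1f} MPix)")  # 5760 x 2160, ~12.4 MPix
    print(f"Single 1080p panel: {PANEL_W * PANEL_H / 1e6:.1f} MPix")  # ~2.1 MPix
```

However the panels are arranged, the point is the same: the card ends up rendering roughly six times the pixels of a single 1080p display.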

This is the guy who helped us at the computer store.

We didn’t have six of the same monitor in our German lab, so we performed four- and six-screen tests at a local computer hardware store. This also gave us a nice audience to perform in front of. I was able to talk about Tom's Hardware and answer a lot of questions while setting up the benchmarks. In the end, we came away with some great feedback on ways to make our 2013 Graphics Card Performance Charts even more interesting.

Of course, the focus of our conversation was the Eyefinity setup, which wouldn’t have been possible without the three splitters or a nice array of monitors.

  • Youngmind
    The 6 GB of memory might not have much of an effect with only a single card, but I wonder if it will have a larger impact in configurations with more graphics cards, such as tri-CrossFire and quad-CrossFire? If people are willing to spend so much money on monitors, I think they'd be willing to spend a lot of money on tri/quad graphics card configurations.
  • robthatguyx
    Youngmind: The 6 GB of memory might not have much of an effect with only a single card, but I wonder if it will have a larger impact in configurations with more graphics cards, such as tri-CrossFire and quad-CrossFire? If people are willing to spend so much money on monitors, I think they'd be willing to spend a lot of money on tri/quad graphics card configurations.
    I think this would perform much better with TriFire. If one reference 7970 can handle three screens, then three of these could easily eat six screens, in my opinion.
  • palladin9479
    Youngmind: The 6 GB of memory might not have much of an effect with only a single card, but I wonder if it will have a larger impact in configurations with more graphics cards, such as tri-CrossFire and quad-CrossFire? If people are willing to spend so much money on monitors, I think they'd be willing to spend a lot of money on tri/quad graphics card configurations.
    Seeing as memory contents are copied to each card in both SLI and CrossFire, you would practically need that much for ridiculously large-screen gaming. One card can't handle as many screens as this was designed for; you need at least two for a four-screen setup and three for a six-screen setup. The golden rule seems to be two screens per high-end card.
  • tpi2007
    Youngmind: The 6 GB of memory might not have much of an effect with only a single card, but I wonder if it will have a larger impact in configurations with more graphics cards, such as tri-CrossFire and quad-CrossFire? If people are willing to spend so much money on monitors, I think they'd be willing to spend a lot of money on tri/quad graphics card configurations.
    This.

    BigMack70: Would be very interested in seeing this in CrossFire at crazy resolutions, compared to a pair of 3 GB cards in CrossFire, to see if the VRAM helps in that case.
    And this.

    Tom's Hardware, if you are going to be reviewing a graphics card with 6 GB of VRAM, you have to review at least two of them in CrossFire. VRAM is not cumulative, so using two regular HD 7970 3 GB cards in CrossFire still means that you only have a 3 GB frame buffer, so at high resolutions with multiple monitors, 6 GB might make the difference (a rough sketch of that frame buffer math follows the comments).

    So, are we going to get an update to this review? As it is, it's useless. Do a review with at least two of these cards and three 30" 1600p monitors. That's the kind of setup someone considering one of these cards will have, and that person won't buy just one card. Cards with 6 GB of VRAM were made to be used at least in pairs. I'm surprised Sapphire didn't tell you guys that in the first place. In any case, you should have figured it out.
  • FormatC
    tpi2007: Tom's Hardware, if you are going to be reviewing a graphics card with 6 GB of VRAM, you have to review at least two of them in CrossFire.
    Sapphire was unfortunately not able to send two cards. That's annoying, but it's not our problem. And: two of these cards are deadly for my ears ;)
  • (quoting tpi2007's comment above in full)
    Why not go to the uber-extreme and run CrossFireX (four GPUs) with six 2560x1600 monitors and the AA cranked up to 4x supersampling, to prove it once and for all?
  • esrever
    The normal 7970s seem much better than the GHz Edition.
  • freggo
    FormatC: Sapphire was unfortunately not able to send two cards. That's annoying, but it's not our problem. And: two of these cards are deadly for my ears
    Thanks for the review. The noise demo alone helps in making a purchase decision.
    No sale!

    Anyone know why no card has been designed to be turned OFF (0 Watts!) when idle, with the system switching to integrated graphics for desktop work or simple tasks?
    Then applications like Photoshop, Premiere, or the ever-popular Crysis could 'wake up' the card and have the system switch over.

    Or are there cards like that?


  • FormatC
    For a noise comparison between overclocked Radeon HD 7970 cards, take a look at this:
    http://www.tomshardware.de/Tahiti-XT2-HD-7970-X-X-Edition,testberichte-241091-6.html
  • dudewitbow
    freggo: Thanks for the review. The noise demo alone helps in making a purchase decision. No sale! Anyone know why no card has been designed to be turned OFF (0 Watts!) when idle, with the system switching to integrated graphics for desktop work or simple tasks? Then applications like Photoshop, Premiere, or the ever-popular Crysis could 'wake up' the card and have the system switch over. Or are there cards like that?
    I think that has been applied to laptops, but not on the desktop scene. One reason I think it's not as useful on the desktop is that even if parts of your build are switched off, the PSU is at its least efficient near 0% load, so no matter what, you're still going to burn electricity just by having the computer on. All GPUs nowadays have downclocking features when they're not under load (my 7850 downclocks to 300 MHz at idle), but I wouldn't expect cards to go all the way to 0.
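
As a follow-up to the VRAM discussion in the comments, here is a rough back-of-the-envelope sketch, not from the article or the commenters, of the render-target memory a six-screen surface can consume. The buffer counts, resolution, and anti-aliasing levels are illustrative assumptions, and textures, which usually dominate VRAM use, are deliberately left out.

```python
# Minimal sketch (not from the comments): back-of-the-envelope buffer math
# behind the "6 GB vs. two 3 GB cards in CrossFire" argument. Buffer counts
# and AA levels are illustrative assumptions; textures, which usually
# dominate VRAM use, are not counted here.

def buffer_bytes(width, height, samples=1, bytes_per_pixel=4):
    """Approximate size of one render target at a given MSAA sample count."""
    return width * height * samples * bytes_per_pixel

def rough_render_targets_gb(width, height, samples):
    color = buffer_bytes(width, height, samples)
    depth = buffer_bytes(width, height, samples)    # depth/stencil assumed same size
    swap_chain = 3 * buffer_bytes(width, height)    # assumed resolve + back buffers
    return (color + depth + swap_chain) / 2**30

if __name__ == "__main__":
    w, h = 5760, 2160                               # assumed six-screen Eyefinity surface
    for samples in (1, 4, 8):
        gb = rough_render_targets_gb(w, h, samples)
        print(f"{samples}x MSAA: ~{gb:.2f} GB in render targets alone")
    # SLI/CrossFire mirror the working set on every card, so two 3 GB boards
    # still leave roughly 3 GB usable per GPU; a 6 GB card raises that ceiling.
```

The render targets alone stay well under 3 GB in this toy model; the argument in the comments is that once high-resolution textures and multi-GPU mirroring are added on top, the extra headroom of a 6 GB card is what could matter.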