Sapphire Toxic HD 7970 GHz Edition Review: Gaming On 6 GB Of GDDR5

Noise Level Comparison Videos

The following videos demonstrate what the acoustic measurements on the previous page sound like in the real world. We’ve ordered them by fan speed and load.

Idle: 20 Percent Fan Speed at 30.9 dB(A)

At idle and with a room temperature of 22 degrees Celsius (72 degrees Fahrenheit), the fan runs at 20 percent of its maximum speed, resulting in a pleasant 30.9 dB(A). That sounds fine, but you probably won't ever see it in practice, since it can only be achieved without Lethal Boost and in a fairly cool room.

Idle: 30 Percent Fan Speed at 32.8 dB(A)

Thirty percent is more realistic. This is achievable with Lethal Boost enabled, so long as you don't overclock manually. If you do overclock, the Toxic HD 7970 GHz Edition goes all the way up to 40 percent or so (and about 40.6 dB[A]) after 30 minutes.

Moderate Load: 50 Percent Fan Speed at 46.8 dB(A)

The fans run at a full 50 percent duty cycle under simple 3D applications like Google Earth with Lethal Boost enabled. With the card down-clocked to 1100 MHz, the fans only slow to 45 percent or so; they also spin above 50 percent (at 46.8–49.2 dB[A]) during our Metro 2033 benchmarking.

Gaming Load with Lethal Boost: 65 – 67 Percent Fan Speed at 54.2 dB(A)

At a fixed 1201 MHz GPU clock rate and 65- to 67-percent fan speeds, we've finally caught up with the noise generated by AMD's reference design. Acoustics are now definitely a problem, more so because the aggressive fan profile isn't even really needed.

Full Load with Factory Settings: 73 Percent Fan Speed at 57.4 dB(A)

Bitcoin mining takes the Toxic HD 7970 GHz Edition 6 GB’s noise from obnoxious to flagrant: a 73-percent duty cycle generates about 57.4 dB(A).

Full Load with Lethal Boost: 80 Percent Fan Speed at 60.5 dB(A)

Just when we thought that it couldn’t get any worse, Sapphire's Toxic HD 7970 GHz Edition passes the 60 dB(A) barrier under full load. On the bright side, the card's cooling performance is exceptional. Still, we're not sure a graphics card needs to run cooler overclocked than at its factory clock rates.

Bottom Line

Lethal Boost mode's fan profile is too aggressive and should be revised. The clock rate's constant spiking up and down is also counterproductive.

  • Youngmind
    The 6 GB of memory might not have much of an effect with only a single card, but I wonder if it will have a larger impact if you use it in configurations with more graphics cards, such as tri-CrossFire and quad-CrossFire? If people are willing to spend so much money on monitors, I think they'd be willing to spend a lot of money on tri/quad graphics card configurations.
    Reply
  • robthatguyx
    I think this would perform much better in TriFire. If one reference 7970 can handle three screens, then three of these could easily drive six screens, in my opinion.
    Reply
  • palladin9479
    Youngmind: The 6gb of memory might not have much of an effect with only a single card, but I wonder if it will have a larger impact if you use in configurations with more graphics cards such as tri-crossfire and quad-crossfire? [...]
    Seeing as in both SLI and CrossFire the memory contents are mirrored to each card, you would practically need that much for ridiculously large multi-screen gaming. One card cannot handle as many screens as this card was designed for; you need at least two for a four-screen setup and three for six screens. The golden rule seems to be two screens per high-end card.
    Reply
  • tpi2007
    Youngmind: The 6gb of memory might not have much of an effect with only a single card, but I wonder if it will have a larger impact if you use in configurations with more graphics cards such as tri-crossfire and quad-crossfire? [...]
    This.

    BigMack70: Would be very interested in seeing this in crossfire at crazy resolutions compared to a pair of 3GB cards in crossfire to see if the vram helps in that case
    And this.

    Tom's Hardware, if you are going to review a graphics card with 6 GB of VRAM, you have to review at least two of them in Crossfire. VRAM is not cumulative, so using two regular HD 7970 3 GB cards in Crossfire still means you only have a 3 GB framebuffer; at high resolutions with multiple monitors, 6 GB might make the difference.

    So, are we going to get an update to this review? As it is, it's useless. Do a follow-up with at least two of these cards and three 30" 1600p monitors. That is the kind of setup someone considering one of these cards will have, and that person won't buy just one card. Cards with 6 GB of VRAM were made to be used at least in pairs. I'm surprised Sapphire didn't tell you guys that in the first place. In any case, you should have figured it out.
    Reply
  • FormatC
    Tom's Hardware, if you are going to be reviewing a graphics card with 6 GB of VRAM you have to review at least two of them in Crossfire.
    Sapphire was unfortunately not able to send two cards. That's annoying, but not our problem. And: two of these cards are deadly for my ears ;)
    Reply
  • tpi2007: Tom's Hardware, if you are going to be reviewing a graphics card with 6 GB of VRAM you have to review at least two of them in Crossfire. [...]
    Why not go to the uber-extreme and have CrossFireX (four GPUs) with six 2560x1600 monitors and crank the AA up to 4x supersampling, to prove it once and for all?
    Reply
  • esrever
    The normal 7970s seem much better than the GHz Edition.
    Reply
  • freggo
    FormatC: Sapphire was unfortunately not able to send two cards. That's annoying, but not our problem. And: two of these cards are deadly for my ears
    Thanks for the review. The noise demo alone helps in making a purchase decision.
    No sale!

    Anyone know why no card has been designed to be turned OFF (0 watts!) when idle, with the system switching to internal graphics for just desktop stuff or simple tasks?
    Then applications like Photoshop, Premiere, or the ever-popular Crysis could 'wake up' the card and have the system switch over.

    Or are there cards like that?
    Reply
  • FormatC
    For a noise comparison between overclocked Radeon HD 7970 cards, take a look at this:
    http://www.tomshardware.de/Tahiti-XT2-HD-7970-X-X-Edition,testberichte-241091-6.html
    Reply
  • dudewitbow
    freggo: Anyone know why no card has been designed to be turned OFF (0 watts!) when idle, and the system switching to internal graphics for just desktop stuff or simple tasks? [...]
    I think that has been applied to laptops, but not on the desktop scene. One reason I'd think it's not as useful on a desktop is that even with stuff turned off, the PSU is least efficient near 0% load, so no matter what, you're still going to burn electricity just by having the computer on. All GPUs nowadays have down-clocking features for when they're not under load (my 7850 down-clocks to 300 MHz at idle), but I wouldn't expect cards to go all the way to zero.
    Reply