
Clock Rates And Lethal Boost Mode

Sapphire Toxic HD 7970 GHz Edition Review: Gaming On 6 GB Of GDDR5

Overclocking with the Push of a Button

Engaging Lethal Boost mode is simple: press the Lethal Boost button while the computer is running, then reboot. The Toxic HD 7970 GHz Edition then loads its settings from a second firmware containing higher clock frequencies and a more aggressive fan profile.

First, we have the original factory settings:

Then we push the Lethal Boost button. In the picture below, the button is to the left of the CrossFire connectors.

Lethal Boost mode now gives us higher clock rates and a PowerTune limit that extends to 50 percent (up from 20).

When the Radeon HD 7970 GHz Edition launched, we discovered that overclocking (or underclocking) the card by just 1 MHz makes it hold that frequency, behaving like an ordinary Radeon HD 7970. The trade-off is that the card no longer throttles down at idle: it draws roughly 50 W, and ZeroCore Power stops working. We use this fixed clock rate to compare Sapphire's Toxic HD 7970 GHz Edition 6 GB to older cards.
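
For anyone who wants to reproduce that fixed-clock trick, here is a minimal sketch of how the 1 MHz offset could be applied from a script. It assumes AMD's Linux Catalyst driver, which exposes OverDrive through the aticonfig command-line tool; the clock values are only illustrative, and our own testing was done under Windows 7 with Catalyst 12.6, where the same offset can be applied through the OverDrive tab in Catalyst Control Center instead.

    # Minimal sketch (not the exact setup used in this review): hold a Radeon
    # HD 7970 GHz Edition at a fixed frequency by setting the core clock 1 MHz
    # above its rated value. Assumes AMD's Linux Catalyst driver, which exposes
    # OverDrive via the aticonfig CLI; the clock values below are illustrative.
    import subprocess

    RATED_CORE_MHZ = 1200  # e.g., the Toxic card's Lethal Boost core clock
    RATED_MEM_MHZ = 1600   # illustrative memory clock; use your card's rating

    def lock_core_clock(core_mhz: int, mem_mhz: int) -> None:
        """Apply a +1 MHz core offset so the card holds that frequency."""
        subprocess.run(["aticonfig", "--od-enable"], check=True)
        subprocess.run(
            ["aticonfig", f"--od-setclocks={core_mhz + 1},{mem_mhz}"],
            check=True,
        )
        # Read the clocks back to confirm the new settings took effect.
        subprocess.run(["aticonfig", "--od-getclocks"], check=True)

    if __name__ == "__main__":
        lock_core_clock(RATED_CORE_MHZ, RATED_MEM_MHZ)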

Our Benchmark System


Benchmark System (Open Case)
CPU: Intel Core i5-2500K (Sandy Bridge), overclocked to 4.5 GHz
Cooler: Prolimatech Super Mega + Noiseblocker Multiframe
Memory: 4 x 4 GB Kingston HyperX DDR3-1600
Motherboard: Gigabyte Z68X-UD7-B3 (Z68 Express)
Operating System and Driver: Windows 7 Ultimate x64, Catalyst 12.6 WHQL
Case: Lian Li PC-T60A ATX Test Bench


First Benchmarks At Stock Settings

We started by taking a quick look at how the Sapphire Toxic HD 7970 GHz Edition 6 GB fares at its stock settings. This gives us a reference point for our more in-depth benchmarks.

Our second test, Metro 2033, was also run at 1201 MHz, using the 1 MHz offset described earlier to lock the clock rate in place.

At a resolution of 2560x1440 pixels, Sapphire's Toxic HD 7970 GHz Edition 6 GB and Gigabyte's 3 GB Radeon HD 7970 Super Overclock perform about the same. Interestingly, the Gigabyte card draws less power than Sapphire's board at this fixed clock rate.

Comments
  • 32 Hide
    Youngmind , September 4, 2012 4:55 AM
    The 6 GB of memory might not have much of an effect with only a single card, but I wonder if it will have a larger impact if you use it in configurations with more graphics cards, such as tri-CrossFire and quad-CrossFire? If people are willing to spend so much money on monitors, I think they'd be willing to spend a lot of money on tri/quad graphics card configurations.
  • 6 Hide
    robthatguyx , September 4, 2012 5:10 AM
    Quote (Youngmind):
    The 6gb of memory might not have much of an effect with only a single card, but I wonder if it will have a larger impact if you use in configurations with more graphics cards such as tri-crossfire and quad-crossfire? If people are willing to spend so much money on monitors, I think they'd be willing to spend a lot of money on tri/quad graphics card configurations.

    I think this would perform much better in a TriFire setup. If one reference 7970 can handle three screens, then three of these could easily handle six screens, in my opinion.

  • 8 Hide
    palladin9479 , September 4, 2012 5:20 AM
    Quote (Youngmind):
    The 6gb of memory might not have much of an effect with only a single card, but I wonder if it will have a larger impact if you use in configurations with more graphics cards such as tri-crossfire and quad-crossfire? If people are willing to spend so much money on monitors, I think they'd be willing to spend a lot of money on tri/quad graphics card configurations.


    Seeing as in both SLI and CrossFireX the memory contents are copied to each card, you would practically need that much for playing on ridiculously large screen setups. One card cannot handle the kind of multi-screen setup this card was designed for; you need at least two cards for a 4-screen setup and three for a 6-screen setup. The golden rule seems to be two screens per high-end card.
  • 14 Hide
    tpi2007 , September 4, 2012 5:44 AM
    Quote (Youngmind):
    The 6gb of memory might not have much of an effect with only a single card, but I wonder if it will have a larger impact if you use in configurations with more graphics cards such as tri-crossfire and quad-crossfire? If people are willing to spend so much money on monitors, I think they'd be willing to spend a lot of money on tri/quad graphics card configurations.


    This.

    Quote (BigMack70):
    Would be very interested in seeing this in crossfire at crazy resolutions compared to a pair of 3GB cards in crossfire to see if the vram helps in that case


    And this.

    Tom's Hardware, if you are going to be reviewing a graphics card with 6 GB of VRAM, you have to review at least two of them in CrossFire. VRAM is not cumulative, so using two regular 3 GB HD 7970s in CrossFire still means that you only have a 3 GB framebuffer, so for high resolutions with multiple monitors, 6 GB might make the difference.

    So, are we going to get an update to this review? As it is, it is useless. Make a review with at least two of those cards and three 30" 1600p monitors. That is the kind of setup someone considering buying one of those cards will have. And that person won't buy just one card. Cards with 6 GB of VRAM were made to be used at least in pairs. I'm surprised Sapphire didn't tell you guys that in the first place. In any case, you should have figured it out.
  • 15 Hide
    FormatC , September 4, 2012 6:19 AM
    Quote:
    Tom's Hardware, if you are going to be reviewing a graphics card with 6 GB of VRAM you have to review at least two of them in Crossfire.
    Sapphire was unfortunately not able to send two cards. That's annoying, but not our problem. And: two of these cards are deadly for my ears ;) 
  • 3 Hide
    jupiter optimus maximus , September 4, 2012 6:23 AM
    Quote (tpi2007):
    This. And this. Tom's Hardware, if you are going to be reviewing a graphics card with 6 GB of VRAM you have to review at least two of them in Crossfire. VRAM is not cumulative, so using two regular HD 7970 3 GB in Crossfire still means that you only have a 3 GB framebuffer, so for high resolutions with multiple monitors, 6 GB might make the difference. So, are we going to get an update to this review? As it is it is useless. Make a review with at least two of those cards with three 30" 1600p monitors. That is the kind of setup someone considering buying one of those cards will have. And that person won't buy just one card. Those cards with 6 GB of VRAM were made to be used at least in pairs. I'm surprised Sapphire didn't tell you guys that in the first place. In any case, you should have figured it out.

    Why not go to the uber-extreme and have CrossFireX (four GPUs) with six 2560x1600 monitors and crank up the AA to 4x supersampling to prove it once and for all?
  • -1 Hide
    esrever , September 4, 2012 6:35 AM
    The normal 7970s seem much better than the GHz Edition.
  • 2 Hide
    freggo , September 4, 2012 6:55 AM
    Quote (FormatC):
    Sapphire was unfortunately not able to send two cards. That's annoying, but not our problem. And: two of these cards are deadly for my ears


    Thanks for the review. The noise demo alone helps in making a purchase decision.
    No sale!

    Does anyone know why no card has been designed to be turned OFF (0 Watts!) when idle, with the system switching to internal graphics for just desktop stuff or simple tasks?
    Then applications like Photoshop, Premiere or the ever popular Crysis could 'wake up' the card and have the system switch over.

    Or are there cards like that?


  • 2 Hide
    FormatC , September 4, 2012 7:03 AM
    For a noise comparison between overclocked Radeon HD 7970 cards, take a look at this:
    http://www.tomshardware.de/Tahiti-XT2-HD-7970-X-X-Edition,testberichte-241091-6.html
  • 3 Hide
    dudewitbow , September 4, 2012 7:06 AM
    Quote (freggo):
    Thanks for the review. The noise demo alone helps in making a purchase decision. No sale! Anyone know why no card has been designed to be turned OFF (0 Watts!) when idle, and the system switching to internal graphics for just desktop stuff or simple tasks? Then applications like Photoshop, Premiere or the ever popular Crysis could 'wake up' the card and have the system switch over. Or are there cards like that?


    I think that has been applied to laptops, but not on the desktop scene. One of the reasons I would think it's not as useful on the desktop is that even if your build has things turned off, the PSU is least efficient at near 0% load, so no matter what, you're still going to burn electricity just by having the computer on. All GPUs nowadays have downclocking features when they're not under load (my 7850 downclocks to 300 MHz at idle), but I wouldn't think cards will go all the way to 0.
  • 0 Hide
    mesab66 , September 4, 2012 7:27 AM
    Nice review. However, most of us would have been able to work out the benchmarks in our heads - we've all seen similar reviews and understand that, beyond a minimum, more memory in a single-card setup makes little to no difference. The company is trying to lure us into a multi-card setup - hoping that the memory benefits there override or mask out the obvious, significant noise issue.

    If these companies - or we ourselves - can tackle the noise, then such a card's target scenario would be realised. Of course, even then, for the rest of us mere mortals there is still one more significant 'hurdle'... cost... so we'll keep waiting.
  • 2 Hide
    spat55 , September 4, 2012 9:36 AM
    Far too much money; I would rather buy a TV with that!
  • 0 Hide
    hellfire24 , September 4, 2012 10:25 AM
    For a single monitor, this is a waste! 3 GB is enough.
  • 3 Hide
    spentshells , September 4, 2012 11:31 AM
    This did not feel like an enthusiast review.

    I realize it is expensive, but the review needed another Toxic for CrossFire and some proper monitors.

    Lackluster, to say the least. Nothing to say wow about here. I have a feeling two or three of these in CrossFire on water or phase change is something I will have to hunt down myself on the internet.
  • 0 Hide
    shin0bi272 , September 4, 2012 11:48 AM
    I would like to see a comparison on three monitors against a Galaxy GeForce GTX 680 SOC White Edition ...

    You know, since an even number of monitors means you're looking at a lovely seam running right down the middle of your view.
  • 3 Hide
    Yuka , September 4, 2012 2:05 PM
    "All of that makes Sapphire's Toxic HD 7970 GHz Edition an answer in search of a problem. We can’t think of a usage scenario for which we’d recommend it. If you really dig the effort Sapphire put into its Vapor-X cooling solution, we recommend you check out the Vapor-X HD 7970 GHz Edition 3 GB card, and use the difference to take your better half out to a nice dinner."

    Really? I can think of CrossFire with six 30" monitors. If you're getting an expensive setup, why not go all the way? This is the Veyron of setups, after all!

    And yes, it was a very unfair conclusion, since regular people are not the target customers for this kind of card. I thought Tom's had more enthusiast blood.

    Cheers!
  • 0 Hide
    myufox , September 4, 2012 2:22 PM
    I heard the same thing said about 1 GB and 2 GB. Hang tight for a year or so and 6 GB will be there. Also, high-resolution screens like Retina displays will demand much, much more from our graphics cards as they become more popular. Only just now have IPS panels really started becoming in demand for computer monitors, thanks to the iPad, and next will be Retina displays.
  • 1 Hide
    rebel1280 , September 4, 2012 2:24 PM
    Why not just hook it up to a 1080P projector 0.o ... or THREE 1080P projectors?!!!! that would be awesome!
  • 0 Hide
    rthorington , September 4, 2012 2:29 PM
    How effective would the 6GB of video RAM be for Microsoft's RemoteFX (giving multiple users -- single or possibly dual displays)?
  • -1 Hide
    rmerwede , September 4, 2012 3:06 PM
    Who the heck would run at that rez in Eyefinity 6? Why not 5760x2160?