AMD Radeon HD 7970 GHz Edition Review: Give Me Back That Crown!
Power Consumption
Idle power consumption remains one of AMD’s strengths, and its latest effort tops our charts by using less than 100 W of system power sitting on the Windows desktop.
ZeroCore technology favors AMD even more prominently, though the GHz Edition card doesn't drop quite as low as the already-available Radeon HD 7900-series boards. Even so, that's an additional 10 W of savings when Windows shuts off your monitor.
The Radeon HD 7970 GHz Edition uses more power than a vanilla Radeon HD 7970—not surprising when you consider its faster GPU, higher-clocked memory, and ability to dynamically increase voltage.
Although AMD says the GHz Edition board stays under the original Radeon HD 7970’s TDP, it still averages about 13 W higher system power consumption in our 3DMark demo. That puts it 73 W higher than a Radeon HD 7950 and 43 W higher than a GeForce GTX 680.
Darkerson: My only complaint with the "new" card is the price. Otherwise it looks like a nice card, and better than the original version, at any rate. Not that the original was a bad card to begin with.
wasabiman321: Great, I just ordered a GTX 670 FTW... Grrr. I hope performance gets better with Nvidia's drivers too :D
mayankleoboy1: Nice show, AMD! But with WinZip not using the GPU, VCE slowing down video encoding, and a card that delivers lower minimum FPS... epic fail. Or, before releasing your products, try to ensure software compatibility.
vmem (quoting jrharbort): "To me, increasing the memory speed was a pointless move. Nvidia realized that all of the bandwidth provided by GDDR5 and a 384-bit bus is almost never utilized. The drop back to a 256-bit bus on the GTX 680 allowed them to cut cost and power usage without causing a drop in performance. High-end AMD cards see the most improvement from an increased core clock. Memory... not so much. Then again, Nvidia pretty much cheated this generation as well. Cutting out nearly 80% of the GPGPU logic, something Nvidia had been trying to market for years, allowed them to drop production costs and power usage even further. AMD now has the lead in this market, but at the cost of higher power consumption and production cost. This quick fix by AMD will work for now, but they obviously need to rethink their future designs a bit."
The issue is that them rethinking their future designs scares me... Nvidia has started a horrible trend in the business that I hope to dear god AMD does not follow suit on. True, Nvidia is able to produce more gaming performance for less, but this pushes anyone who wants GPU compute toward an overpriced professional card. Now, before you say "well, if you're making a living out of it, fork out the cash and go Quadro," let me remind you that a lot of innovators in various fields actually do use GPU compute to make progress (especially in the academic sciences) and ultimately bring us better tech and new directions in tech development... and I for one know a lot of government-funded labs that can't afford to buy a stack of Quadro cards.
DataGrave (quoting vmem): "Nvidia has started a horrible trend in the business that I hope to dear god AMD does not follow suit on."
100% agreed.
And for the gamers: take a look at the new UT4 engine! Without excellent GPGPU performance, this will be a disaster for every graphics card. See you, Nvidia.
cangelini (quoting mayankleoboy1): "Thanks for putting my name in the review; now if only you could bold it ;-)"
Excellent tip. Told you I'd look into it!