
We don’t base any recommendations on synthetic results; if you can’t actually play it, you wouldn’t buy a graphics card for it.
In theory, though, 3DMark should allow us to compare AMD’s technology to Nvidia’s without the influence of developer bias resulting from the help that both companies provide in certain titles. If Futuremark is doing its job, the variation you see from one game to another should be largely mitigated here.
Whether or not that’s actually the case, AMD’s Radeon HD 7970 GHz Edition leapfrogs the GeForce GTX 670 and settles in just behind the GeForce GTX 680.
I don’t want to spoil the rest of the results, but you’re going to see that 3DMark doesn’t reflect the majority of our real-world benchmarks. In fact, there’s a trend that I’ll zero in on as we flip through three different resolutions—that is, the 7970 does increasingly well as you ask it to render more pixels and turn up quality settings. Considering that our 3DMark 11 benchmark employs the Extreme preset, I really would have expected it to place AMD’s latest ahead of Nvidia’s GeForce GTX 680.




- Is An Overclocked Radeon HD 7970 Greater Than GeForce GTX 680?
- PowerTune With Boost: Is The Accelerator Stuck?
- Radeon HD 7970 Vs. Radeon HD 7970 GHz Edition
- Overclocking With PowerTune
- Will Your Old 7970 Take A GHz Edition Firmware?
- Test Setup And Benchmarks
- Benchmark Results: 3DMark 11
- Benchmark Results: Battlefield 3 (DX 11)
- Benchmark Results: Crysis 2 (DX 9/11)
- Benchmark Results: The Elder Scrolls V: Skyrim (DX 9)
- Benchmark Results: DiRT 3 (DX 11)
- Benchmark Results: World Of Warcraft: Cataclysm (DX 11)
- Benchmark Results: Metro 2033 (DX 11)
- Benchmark Results: GPU Compute
- Benchmark Results: MediaConverter 7.5
- Temperature And Noise
- Radeon HD 7970 GHz Edition Gets Our Aftermarket Cooling Treatment
- Power Consumption
- New Drivers Deliver; Radeon HD 7970 Claims A Symbolic Win
And for the gamers: take a look at the new UT4 engine! Without excellent GPGPU performance, this will be a disaster for any graphics card. See you, Nvidia.
The issue is, them rethinking their future designs scares me... Nvidia has started a HORRIBLE trend in the business that I hope to dear god AMD does not follow suit. True, Nvidia is able to produce more gaming performance for less, but this is pushing anyone who wants GPU compute toward an overpriced professional card. Now, before you say "well, if you're making a living out of it, fork out the cash and go Quadro", let me remind you that a lot of innovators in various fields actually do use GPU compute to make progress (especially in the academic sciences), ultimately bringing us better tech AND new directions in tech development... and I for one know a lot of government-funded labs that can't afford to buy a stack of Quadro cards.
now if only you could bold it
With WinZip that does not use the GPU, VCE that slows down video encoding, and a card that gives lower minimum FPS... EPIC FAIL.
Or, before releasing your products, try to ensure software compatibility.
;-)
Excellent tip. Told you I'd look into it!
WoW is meaningful, actually.
New games will make it in when vendors start giving us more than two or three days to retest all of their graphics cards.
And here's me, hoping that this kind of competition landscape would still be present in the enthusiast CPU market.
I really hope this is trolling, rather than a callous endeavor to discredit a competitor's product.