Radeon HD 7990 And GeForce GTX 690: Bring Out The Big Guns

November 8, 2012 5:40:50 AM

IMHO, the GTX 690 looks best. There is something really alluring about the shiny white metal and the fine metal mesh, along with the fluorescent green branding.
Maybe I'm too much of a retro SF buff :)
November 8, 2012 5:55:28 AM

What's most impressive is that the GTX 690 was made by Nvidia itself and not by an OEM. Very nice, balanced card.
November 8, 2012 6:03:57 AM

I wept.
November 8, 2012 6:04:02 AM

Your test system is sexy!!!
November 8, 2012 6:05:43 AM

You can't really go wrong either way with these generally insane (so to speak) cards.
November 8, 2012 6:34:43 AM

Is it just me, or do the 7970 X2 and 7990 coolers look so fast and fugly?
Anonymous
November 8, 2012 6:43:17 AM

Thanks for the in-depth analysis of adaptive V-sync and RadeonPro helping with micro-stutter.

Not to take anything away from the hard work performed, but I would have liked to have seen Nvidia's latest beta driver, 310.33, included as well, to see if Nvidia is doing anything to improve the performance of its card instead of just adding 3D Vision, AO, and SLI profiles.
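
For readers who want to poke at frame-time data themselves: a minimal sketch of how a per-frame log is commonly turned into a micro-stutter figure. The metric here (mean frame-to-frame delta) is just one common choice, not necessarily the article's exact method, and the frame times are invented example values.

```python
# Minimal micro-stutter sketch: hypothetical Fraps-style frame times in ms.
frame_times_ms = [16.7, 18.1, 15.2, 24.9, 14.0, 17.3]

# Micro-stutter shows up as large frame-to-frame swings, so look at the
# deltas between consecutive frames rather than the average alone.
deltas = [abs(b - a) for a, b in zip(frame_times_ms, frame_times_ms[1:])]

avg_fps = 1000.0 / (sum(frame_times_ms) / len(frame_times_ms))
mean_delta = sum(deltas) / len(deltas)

print(f"average FPS: {avg_fps:.1f}")
print(f"mean frame-to-frame delta: {mean_delta:.1f} ms (higher = more perceived stutter)")
```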
November 8, 2012 6:45:55 AM

Can we get some quad-CrossFire benchmarks too? :D
November 8, 2012 6:55:12 AM

AMD's dual-GPU card at 500+ watts is out for me... too much power and noise.
November 8, 2012 6:56:33 AM

Two 670s in SLI are a better buy than a 690, and two 7950s in CrossFire are a better buy than a 7990; either way you save nearly $300.
November 8, 2012 7:57:06 AM

Wow, micro-stuttering is now a non-issue, at least for AMD.
November 8, 2012 8:02:22 AM

Good read!

But I would have liked to see 680s in SLI, to see how they scale now compared to the 690.

Also, would using two single-GPU cards in CrossFire/SLI make a difference to the micro-stuttering charts? IIRC, the PCIe controller is on the CPU for Sandy Bridge/Ivy Bridge chips, so there would be no third-party bridge chip between the two GPUs as there is on the 7990 and 690. Would that make a difference?

How do you manage to isolate the cards' power consumption at load (idle is simpler)? And noise too: how do you block out the case fans and CPU cooler?
Anonymous
November 8, 2012 8:07:52 AM

RadeonPro is saving AMD's butt.

But in the end, the 690 was slower than the 7990 in average frame rate, while with RadeonPro it is the 7990 that is slower, right?

So yes, it's better than without, but the 690 is faster, just as smooth, and uses built-in technology.

AMD really needs to work on its CrossFire technology.
November 8, 2012 8:08:26 AM

amuffin said:
Is it just me, or do the 7970 X2 and 7990 coolers look so fast and fugly?


I don't think they look "fast and ugly", although I do think that the HIS model could do with some more finesse.
November 8, 2012 8:17:59 AM

Quote:
How do you manage to isolate the cards' power consumption at load (idle is simpler)? And noise too: how do you block out the case fans and CPU cooler?
The noise was measured on the open bench table, not in a case (no extra case fans, and an ultra-silent fan on the hidden CPU cooler).

For the power consumption: three current clamps with monitoring ;)
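
For anyone curious how clamp readings turn into watts: a rough sketch of the arithmetic, assuming one clamp per 12 V feed (slot plus two auxiliary plugs). The currents below are invented placeholders, not measured values, and the small 3.3 V slot contribution is ignored.

```python
# Rough clamp-based power estimate; all readings below are hypothetical.
RAIL_VOLTAGE = 12.0  # V, nominal for the slot's 12 V pins and the PEG plugs

clamp_currents_a = {
    "pcie_slot": 4.8,    # A through the slot's 12 V pins
    "aux_plug_1": 11.5,  # A through the first auxiliary connector
    "aux_plug_2": 10.9,  # A through the second auxiliary connector
}

# P = V * I per feed, summed; ignores the minor 3.3 V slot rail.
card_power_w = sum(RAIL_VOLTAGE * amps for amps in clamp_currents_a.values())
print(f"estimated card power: {card_power_w:.0f} W")
```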
November 8, 2012 8:42:29 AM

Interesting, AMD has a winner at the top tier! That hasn't happened in a while. Kudos to that.
November 8, 2012 8:47:06 AM

If you are spending $1000 on a video card, paying the power bill is not an issue.
November 8, 2012 8:47:14 AM

Novuake said:
Interesting, AMD has a winner at the top tier! That hasn't happened in a while. Kudos to that.


Technically, HIS has a winner, not AMD, because AMD didn't launch a reference 7990/7970 X2 ;)
November 8, 2012 8:54:49 AM

twinshadow said:
If you are spending $1000 on a video card, paying the power bill is not an issue.


Actually, the only person I ever recommended a GTX 690 to wanted it specifically because its low power consumption would literally pay for the card compared to his previous graphics setup, given his high electricity rates. Some people looking for such high-end cards most certainly do care about power consumption.
November 8, 2012 8:56:38 AM

1 kWh in Germany: 0.25 Euro (approx. 0.34 USD)
This IS an issue. ;)
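
To put numbers on the cheap-power-vs-expensive-power argument running through this thread: a back-of-the-envelope sketch. Only the per-kWh rates come from the posts; the ~200 W gap between setups and the four hours of daily gaming are assumptions.

```python
# Back-of-the-envelope yearly cost of extra GPU power draw.
def annual_cost_eur(extra_watts, hours_per_day, eur_per_kwh):
    kwh_per_year = extra_watts / 1000.0 * hours_per_day * 365
    return kwh_per_year * eur_per_kwh

DELTA_W = 200        # assumed load-power gap between two dual-GPU setups
HOURS_PER_DAY = 4    # assumed daily gaming time

for region, rate in [("Germany", 0.25), ("Iceland", 0.05)]:
    cost = annual_cost_eur(DELTA_W, HOURS_PER_DAY, rate)
    print(f"{region}: ~{cost:.0f} EUR extra per year")
```

At the German rate that works out to roughly 73 EUR a year, versus about 15 EUR in Iceland, which is why the same card reads as "an issue" in one post and a "non-issue" in the next.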
November 8, 2012 9:04:57 AM

The GTX 690 is the clear winner in my eyes, especially since there is a two-slot water-cooled version.
"Just Because You're Fastest Doesn't Make You The Best" pretty much says it all.

The Radeons make huge concessions for the sake of performance:

1. Bigger size. Three slots vs. two. Quad CrossFire with two cards becomes virtually infeasible.
2. HUGE power draw: equals more heat, hence more cooling necessary, hence bigger size.
Exceeding PCI-E specs is very worrisome.
I think TWO GTX 690s would consume about the same or maybe even less power.
3. LOUD. Plus coil whine, which is even more annoying than just loud.
4. LOTS of micro-stuttering (virtually unplayable without using third-party software).
5. Price. Let's be real. $1300 is optimistic, and availability is a shot in the dark.

Pros:

1. More FPS. Doesn't matter though unless you're using multiple displays, but that comes with the HUGE downside of giant bezels in your face.
2. Little to no micro-stuttering with third-party software. The only saving grace, but it doesn't add a whole lot since GTX 690 micro-stuttering isn't that bad.

Calling these three-slot monstrosities "inelegant" is possibly the nicest thing you could say.
November 8, 2012 9:23:29 AM

merikafyeah said:
The GTX 690 is the clear winner in my eyes, especially since there is a two-slot water-cooled version. "Just Because You're Fastest Doesn't Make You The Best" pretty much says it all. The Radeons make huge concessions for the sake of performance:
1. Bigger size. Three slots vs. two. Quad CrossFire with two cards becomes virtually infeasible.
2. HUGE power draw: equals more heat, hence more cooling necessary, hence bigger size. Exceeding PCI-E specs is very worrisome. I think TWO GTX 690s would consume about the same or maybe even less power.
3. LOUD. Plus coil whine, which is even more annoying than just loud.
4. LOTS of micro-stuttering (virtually unplayable without using third-party software).
5. Price. Let's be real. $1300 is optimistic, and availability is a shot in the dark.
Pros:
1. More FPS. Doesn't matter though unless you're using multiple displays, but that comes with the HUGE downside of giant bezels in your face.
2. Little to no micro-stuttering with third-party software. The only saving grace, but it doesn't add a whole lot since GTX 690 micro-stuttering isn't that bad.
Calling these three-slot monstrosities "inelegant" is possibly the nicest thing you could say.


http://www.newegg.com/Product/ProductList.aspx?Submit=E...

PowerColor has two 7990s; one is going for $1000 and another for $900. Where are you getting this $1300 number from? Sure, availability is poor, but the pricing is not.

Two GTX 690s consume a good deal more power than a single 7990. Yes, the 7990's power consumption is far too high, but let's leave exaggeration out of it.

Why is exceeding PCIe specs that worrisome? The cables are more than capable of handling it; it's fine.

Quad CrossFire is easy. Simply get a board with eight expansion slots, such as the Gigabyte G1.Sniper 3, and a case that has eight expansion slots too (very common among higher-end cases), and you'd have a full two slots for airflow between the top and bottom card; that's plenty. Heck, even one slot of airflow with a much cheaper board and case would probably be just fine. What I'd be more worried about is getting a PSU that can handle the load, and the ridiculous power bill entailed.

Tom's only said that the PowerColor model had bad coil whine, not the HIS model.
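
For context on the "exceeding PCI-E specs" point both posters raise: the spec caps each source feeding the card, so the in-spec budget is simple addition. The connector counts below are illustrative configurations, not claims about any specific 7990 model.

```python
# In-spec PCI Express power budget: slot plus auxiliary connectors.
PCIE_SLOT_W = 75    # max through the x16 slot
SIX_PIN_W = 75      # max per 6-pin auxiliary connector
EIGHT_PIN_W = 150   # max per 8-pin auxiliary connector

def in_spec_budget_w(six_pins=0, eight_pins=0):
    return PCIE_SLOT_W + six_pins * SIX_PIN_W + eight_pins * EIGHT_PIN_W

print("two 8-pin plugs:  ", in_spec_budget_w(eight_pins=2), "W")  # 375 W
print("three 8-pin plugs:", in_spec_budget_w(eight_pins=3), "W")  # 525 W
# A card drawing 500+ W on two 8-pin plugs would thus be well out of spec.
```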
Anonymous
November 8, 2012 10:32:30 AM

Eh... not a fair comparison, IMO. The 2GB of memory per GPU on the 690 is quite limiting. And these are specialty-run AMD cards. They'd be more comparable to an EVGA GTX 680 Classified 4GB (which doesn't come in a 690 variety), simply because Nvidia has not pushed the 690s very hard.

Furthermore... the limited 256-bit bus on the 690 causes some bandwidth limitation issues. That's why I returned my two GTX 690s and went with quad GTX 680 Classified 4GB. Even then, I have the GPUs running at 1300MHz and the memory at 7300MHz, up from 6000MHz.

V-sync also adds input lag. I'd like to know more about the Radeon app that was supposed to lower micro-stutter, because if it lowers micro-stutter at the cost of increased input lag, it's still not worth it for the hardcore gamers who would be buying these kinds of cards. Substantially better power usage and nearly no micro-stutter are why I got my cards.

And with a 120Hz 1440p monitor from 120hz.net, I really don't need to use V-sync.
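
A quick sanity check on the bus-width argument above: peak GDDR5 bandwidth is just bus width times effective data rate. The 6008 MT/s stock rate for the GTX 680/690 and 5500 MT/s for the stock HD 7970 are published figures; 7300 MT/s is the poster's overclock.

```python
# Peak memory bandwidth = (bus width in bytes) * effective data rate.
def bandwidth_gb_s(bus_width_bits, data_rate_mt_s):
    return bus_width_bits / 8 * data_rate_mt_s / 1000.0

print(f"GTX 680/690 (per GPU), stock: {bandwidth_gb_s(256, 6008):.1f} GB/s")  # ~192 GB/s
print(f"GTX 680, memory OC'd:         {bandwidth_gb_s(256, 7300):.1f} GB/s")  # ~234 GB/s
print(f"HD 7970, stock:               {bandwidth_gb_s(384, 5500):.1f} GB/s")  # ~264 GB/s
```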
November 8, 2012 10:40:46 AM

HyperMatrix said:
Eh... not a fair comparison, IMO. The 2GB of memory per GPU on the 690 is quite limiting. And these are specialty-run AMD cards. They'd be more comparable to an EVGA GTX 680 Classified 4GB (which doesn't come in a 690 variety), simply because Nvidia has not pushed the 690s very hard. Furthermore... the limited 256-bit bus on the 690 causes some bandwidth limitation issues. That's why I returned my two GTX 690s and went with quad GTX 680 Classified 4GB. Even then, I have the GPUs running at 1300MHz and the memory at 7300MHz, up from 6000MHz. V-sync also adds input lag. I'd like to know more about the Radeon app that was supposed to lower micro-stutter, because if it lowers micro-stutter at the cost of increased input lag, it's still not worth it for the hardcore gamers who would be buying these kinds of cards. Substantially better power usage and nearly no micro-stutter are why I got my cards. And with a 120Hz 1440p monitor from 120hz.net, I really don't need to use V-sync.


2GB isn't limiting much at all per GPU right now... Even at triple 1080p or triple 1920x1200, there are only a handful of situations where 2GB per GPU becomes a problem, and even then, simply using a less memory-capacity-reliant setting and a more GPU-reliant setting solves that issue just fine. Unless you have something like triple 2560x1440, 2GB is rarely a bottleneck. At only one 2560x1440 display, I'm not aware of any game that is bottlenecked by 2GB of frame-buffer capacity, especially when you're going for high frame rates on a 120Hz display.

Sure, their memory bus is a limiting factor, but if you cared about that, you could have simply gotten 7970s instead...
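
To give a feel for why resolution and MSAA drive frame-buffer use in the way described above: a rough lower-bound estimate covering only one color target and one depth/stencil target, both multiplied by the sample count. Real games allocate far more (textures, shadow maps, G-buffers), so treat these as illustrative, not as measurements.

```python
# Rough lower bound on render-target memory for one frame.
def render_targets_mib(width, height, msaa_samples, bytes_per_px=4):
    pixels = width * height
    # one color target + one depth/stencil target, each MSAA-expanded
    return 2 * pixels * bytes_per_px * msaa_samples / 1024**2

print(f"1920x1080, 4x MSAA: {render_targets_mib(1920, 1080, 4):.0f} MiB")
print(f"5760x1080, 4x MSAA: {render_targets_mib(5760, 1080, 4):.0f} MiB")
print(f"2560x1440, 4x MSAA: {render_targets_mib(2560, 1440, 4):.0f} MiB")
```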
November 8, 2012 10:50:32 AM

Drooling like a huge dog :o

I would love to have any of those three cards T.T
November 8, 2012 11:57:03 AM

I don't care; personally, I think the HIS 7990 is pretty sexy.
As for the 690... I'll pass.
November 8, 2012 12:16:43 PM

I would not turn down a 7990. It's an impressive card.

But I wonder how it would fare against two 7970 GHz Edition cards in CrossFire, considering these cards are no more than two 7970s on one board.
November 8, 2012 1:00:09 PM

The whole "uses less power" thing on high-end video cards is a joke to me. We are talking about a difference of roughly a dollar a month in savings. Not worth my time to even think about.
Anonymous
November 8, 2012 1:00:47 PM

The GTX looks a lot better, IMO. It also performs a bit faster than the 7990 while running cooler, and it overclocks like a monster. I got mine for a pretty good price from http://computer-radar.com/shop.php?c=all&n=none&i=B0085... since Newegg was out of stock at the time.
November 8, 2012 1:28:15 PM

FormatC said:
1 kWh in Germany: 0.25 Euro (approx. 0.34 USD). This IS an issue.


1 kWh in Iceland: 0.05 Euro (approx. 0.06 USD)
This is a NON-issue ;)
November 8, 2012 1:35:08 PM

I want one
November 8, 2012 1:48:29 PM

Dark OOpa said:
RadeonPro is saving AMD's butt. But in the end, the 690 was slower than the 7990 in average frame rate, while with RadeonPro it is the 7990 that is slower, right? So yes, it's better than without, but the 690 is faster, just as smooth, and uses built-in technology. AMD really needs to work on its CrossFire technology.


I am assuming that by "Radeon Pro" you mean the architecture? It is called Tahiti Pro. And if CrossFire were the issue, we would see very bad micro-stuttering across the board. It's all about the game's support for CrossFire, not the other way around.
November 8, 2012 1:51:24 PM

merikafyeah said:
The GTX 690 is the clear winner in my eyes, especially since there is a two-slot water-cooled version. "Just Because You're Fastest Doesn't Make You The Best" pretty much says it all. The Radeons make huge concessions for the sake of performance:
1. Bigger size. Three slots vs. two. Quad CrossFire with two cards becomes virtually infeasible.
2. HUGE power draw: equals more heat, hence more cooling necessary, hence bigger size. Exceeding PCI-E specs is very worrisome. I think TWO GTX 690s would consume about the same or maybe even less power.
3. LOUD. Plus coil whine, which is even more annoying than just loud.
4. LOTS of micro-stuttering (virtually unplayable without using third-party software).
5. Price. Let's be real. $1300 is optimistic, and availability is a shot in the dark.
Pros:
1. More FPS. Doesn't matter though unless you're using multiple displays, but that comes with the HUGE downside of giant bezels in your face.
2. Little to no micro-stuttering with third-party software. The only saving grace, but it doesn't add a whole lot since GTX 690 micro-stuttering isn't that bad.
Calling these three-slot monstrosities "inelegant" is possibly the nicest thing you could say.


You neglect to look back at the history of dual-graphics cards; both of these cards are huge leaps forward for multiple GPUs on one PCB.

Keep in mind the goal of these is BRUTE force, not elegance. I would not buy one of these, but saying either is bad is idiotic.
November 8, 2012 1:57:46 PM

If you are that passionate about benching aftermarket dual-GPU cards against a dual Kepler, just wait for the ASUS MARS!
November 8, 2012 1:57:53 PM

FormatC said:
1 kWh in Germany: 0.25 Euro (approx. 0.34 USD). This IS an issue.


1 kWh in Boston, MA: $0.06718 (0.04411 Euro) = this is NOT an issue :)
November 8, 2012 2:14:28 PM

blazorthon said:
Technically, HIS has a winner, not AMD, because AMD didn't launch a reference 7990/7970 X2 ;)


Fair... LOL! Shame AMD didn't do it; maybe this will pressure them, but I doubt it.
November 8, 2012 2:17:52 PM

Anik8 said:
If you are that passionate about benching aftermarket dual-GPU cards against a dual Kepler, just wait for the ASUS MARS!


It won't ever release... The whole reason ASUS does it is that they don't underclock each GPU as on normal dual-GPU cards. The reference GTX 690 is not underclocked anyway, so what would be the point of a new ASUS ROG MARS? They could just call it that, but it would not be special in any way except for the cooler. It would actually be a bad idea anyway, because it would be a three-slot, really long card.
November 8, 2012 2:39:36 PM

Some call me an AMD fanboy, but I would have liked to see Nvidia better represented, with both 670 SLI and 680 SLI. From what I've seen, they are a better choice at a better price.

Running the benchmarks with:
HIS 7990: $1000 (actual price unknown)
PowerColor 7990: $1000 (if you can find one)
GTX 690: $1000
7970 GHz Edition CrossFire: $820
7970 CrossFire: $780
GTX 680 SLI: $914
GTX 670 SLI: $720

(Prices are from US PCPartPicker, excluding rebates and using in-stock items only.)

Also, test with AMD Catalyst 12.11 vs. Nvidia 310.33. Let's be fair on the drivers and representation.
November 8, 2012 2:47:31 PM

This review was done 4 weeks ago ;) 

If it helps you sleep better:
The 310.33 does absolutely nothing, so it has no effect on the results of this review :)
November 8, 2012 2:58:39 PM

STOP USING AvP! It's too freaking old, and Nvidia and AMD no longer optimize for it. C'MON!
November 8, 2012 3:04:22 PM

gsxrme said:
STOP USING AvP! It's too freaking old, and Nvidia and AMD no longer optimize for it. C'MON!
That is not quite right; the Catalyst 12.11 beta is a little bit optimized for it :)
But you are right, we will drop it in 2013.
November 8, 2012 3:10:06 PM

Overall, great article. I do feel that words like 'hot', 'big', and 'squeal' have been "hammered" into my brain... you like what I did there? haha

My upgrade probably won't be warranted until at least 2014. I think I'll still be sporting a 27" 1080p monitor too, due to a lack of room for anything larger, or for more of them. I don't think a 3D Vision-ready 2560x1600 display is even technically possible, but that's what I'd want to build around if it were, because I could squeeze in a 30" monitor, but just barely.
November 8, 2012 4:00:42 PM

Great article. Love seeing these top-end cards and what they can do (wish they would have thrown in dual-670s for comparison though). ALMOST makes me want one, but then I remember my little GTX 670 can handle my 5760x1080 setup just fine and say: $600 could get me what else? :-)

PS: 1 kWh in NY is ~$0.06, but I'd still rather be green than not. ^^
November 8, 2012 4:23:37 PM

maxinexus said:
1 kWh in Boston, MA: $0.06718 (0.04411 Euro) = this is NOT an issue for me :)

There, I fixed it for you. Just because energy prices aren't relevant to you doesn't mean they're irrelevant to everyone.
November 8, 2012 4:45:22 PM

What's with the gigantic coolers on these video cards? Do they really need that much cooling? Why don't they just go liquid?
November 8, 2012 5:10:42 PM

I always say "build to suit your apps". This is a clear case where people should do just that.

While I play more than just one game, there is typically one that takes up the majority of my time, and that's what I build for. So, for me, looking at an average or overall result isn't all that important. If World of Warcraft, my former main game, runs better on the GTX 690, then clearly it makes no sense for me to buy a Radeon 7990. Nvidia cards have always performed better running WoW, especially SLI vs. CrossFire.

And what about PhysX?
November 8, 2012 5:15:35 PM

In the first picture... is that the Jordanian crown?
BTW, GTX 690 FTW.
November 8, 2012 5:32:29 PM

blazorthon said:
2GB isn't limiting much at all per GPU right now... Even at triple 1080p or triple 1920x1200, there are only a handful of situations where 2GB per GPU becomes a problem, and even then, simply using a less memory-capacity-reliant setting and a more GPU-reliant setting solves that issue just fine.

This is true; I'm not arguing with you. But when you spend $1000 on a card that has a VRAM bottleneck on Ultra with 4x MSAA in BF3 multiplayer, you start to wonder why you didn't just get 2 x 680 4GB (and note how much the Radeons try to use at those settings, yikes!). Regardless of that bottleneck, however, the 690 is still better for BF3 multiplayer with triple monitors: http://hardocp.com/article/2012/03/28/nvidia_kepler_gef...