Sapphire Toxic HD 7970 GHz Edition Review: Gaming On 6 GB Of GDDR5

September 4, 2012 4:55:11 AM

The 6 GB of memory might not have much of an effect with only a single card, but I wonder whether it would have a larger impact if you used it in configurations with more graphics cards, such as tri- and quad-CrossFire. If people are willing to spend so much money on monitors, I think they'd be willing to spend a lot of money on tri/quad graphics card configurations.
Score
32
September 4, 2012 5:10:38 AM

Youngmind said:
The 6 GB of memory might not have much of an effect with only a single card, but I wonder whether it would have a larger impact in configurations with more graphics cards, such as tri- and quad-CrossFire. [...]

I think this would perform much better in tri-fire. If one reference 7970 can handle three screens, then three of these could easily eat six screens, in my opinion.

Score
6
September 4, 2012 5:20:01 AM

Youngmind said:
The 6 GB of memory might not have much of an effect with only a single card, but I wonder whether it would have a larger impact in configurations with more graphics cards, such as tri- and quad-CrossFire. [...]


Seeing as in both SLI and CrossFire the memory contents are mirrored to each card, you would practically need that much for ridiculously large multi-screen gaming. One card cannot handle as many screens as this was designed for; you need at least two cards for a 4-screen setup and three for a 6-screen setup. The golden rule seems to be two screens per high-end card.
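
Quick back-of-the-envelope to put numbers on that (a rough sketch in Python; render targets only, and real usage is dominated by textures and driver overhead, so treat these figures as a floor):

```python
# Rough floor on per-card VRAM at Eyefinity resolutions. In CrossFire/SLI
# each card mirrors the full working set, so effective capacity is one
# card's VRAM, not the sum across cards.
BYTES_PER_PIXEL = 4  # 32-bit RGBA color buffer

def render_targets_mb(width, height, buffers=3, msaa=4):
    """Triple-buffered color buffers plus one MSAA render target."""
    base = width * height * BYTES_PER_PIXEL
    return (base * buffers + base * msaa) / 1024**2

for name, (w, h) in {
    "1 x 2560x1600":             (2560, 1600),
    "3 x 2560x1600 (7680x1600)": (7680, 1600),
    "6 x 2560x1600 (7680x3200)": (7680, 3200),
}.items():
    print(f"{name}: ~{render_targets_mb(w, h):.0f} MB in render targets alone")
```

Before a single texture loads, a triple-buffered 7680x3200 surface with a 4x MSAA target already eats roughly 700 MB, and that same amount is duplicated on every card in the CrossFire group.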
Score
8
September 4, 2012 5:44:07 AM

Youngmind said:
The 6 GB of memory might not have much of an effect with only a single card, but I wonder whether it would have a larger impact in configurations with more graphics cards, such as tri- and quad-CrossFire. [...]


This.

BigMack70 said:
Would be very interested in seeing this in CrossFire at crazy resolutions, compared to a pair of 3 GB cards in CrossFire, to see if the VRAM helps in that case.


And this.

Tom's Hardware, if you are going to review a graphics card with 6 GB of VRAM, you have to review at least two of them in CrossFire. VRAM is not cumulative, so using two regular 3 GB HD 7970s in CrossFire still means you only have a 3 GB framebuffer; at high resolutions with multiple monitors, 6 GB might make the difference.

So, are we going to get an update to this review? As it stands, it is useless. Do a review with at least two of these cards and three 30" 1600p monitors. That is the kind of setup someone considering one of these cards will have, and that person won't buy just one card. Cards with 6 GB of VRAM were made to be used at least in pairs. I'm surprised Sapphire didn't tell you that in the first place; in any case, you should have figured it out.
Score
14
September 4, 2012 6:19:57 AM

Quote:
Tom's Hardware, if you are going to review a graphics card with 6 GB of VRAM, you have to review at least two of them in CrossFire.
Sapphire was unfortunately not able to send two cards. That's annoying, but not our problem. And: two of these cards are deadly for my ears ;)
Score
15
September 4, 2012 6:23:04 AM

tpi2007 said:
This. And this. Tom's Hardware, if you are going to review a graphics card with 6 GB of VRAM, you have to review at least two of them in CrossFire. [...] Cards with 6 GB of VRAM were made to be used at least in pairs. [...]

Why not go to the uber-extreme and run CrossFireX (4 GPUs) with six 2560x1600 monitors and crank the AA up to 4x supersampling, to prove it once and for all, in stone?
Score
3
September 4, 2012 6:35:52 AM

The normal 7970s seem much better than the GHz Edition.
Score
-1
September 4, 2012 6:55:50 AM

FormatC said:
Sapphire was unfortunately not able to send two cards. That's annoying, but not our problem. And: two of these cards are deadly for my ears


Thanks for the review. The noise demo alone helps in making a purchase decision.
No sale!

Anyone know why no card has been designed to be turned off completely (0 watts!) when idle, with the system switching to integrated graphics for desktop work or simple tasks?
Then applications like Photoshop, Premiere, or the ever-popular Crysis could 'wake up' the card and have the system switch over.

Or are there cards like that?


Score
2
September 4, 2012 7:06:32 AM

freggo said:
Anyone know why no card has been designed to be turned off completely (0 watts!) when idle, with the system switching to integrated graphics for desktop work or simple tasks? [...]


I think that has been done on laptops, but not on the desktop. One reason it's less useful on a desktop: even with parts of your build powered off, the PSU is least efficient near 0% load, so no matter what, you're still going to burn electricity just by having the computer on. All GPUs nowadays downclock when not under load (my 7850 drops to 300 MHz at idle), but I wouldn't expect cards to go all the way to zero.
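
To put rough numbers on the efficiency point (the efficiency values below are made-up illustrative figures, not measurements of any particular PSU):

```python
# Why "near zero" GPU power still costs money at the wall: a PSU is
# least efficient at low load. Efficiency values here are made-up
# illustrative numbers, not measurements of any specific unit.
def wall_watts(dc_load_w, efficiency):
    return dc_load_w / efficiency

scenarios = [
    ("Idle, GPU downclocked", 60, 0.70),   # light load: poor efficiency
    ("Light desktop work",   120, 0.82),
    ("Gaming load",          450, 0.88),   # near the PSU's sweet spot
]
for label, load, eff in scenarios:
    print(f"{label}: {load} W DC -> ~{wall_watts(load, eff):.0f} W at the wall")
```

So even a hypothetical 0 W GPU wouldn't get the wall draw to zero; the rest of the system is still sitting in the PSU's worst operating region.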
Score
3
September 4, 2012 7:27:45 AM

Nice review. However, most of us could have worked out the benchmarks in our heads; we've all seen similar reviews and understand that, beyond a certain minimum, more memory in a single-card setup makes little to no difference. The company is trying to lure us into a multi-card setup, hoping that the memory benefits there override or mask the obvious, significant noise issue.

If these companies, or we ourselves, can tackle the noise, then such a card's target scenario would be realized. Of course, even then, for the rest of us mere mortals there is still one more significant 'hurdle'... cost... so we'll keep waiting.
Score
0
September 4, 2012 9:36:32 AM

Far too much money; I'd rather buy a TV with that!
Score
2
September 4, 2012 10:25:36 AM

For a single monitor, this is a waste! 3 GB is enough.
Score
0
September 4, 2012 11:31:51 AM

This did not feel like an enthusiast review.

I realize it is expensive, but the review needed another Toxic for CrossFire and some proper monitors.

Lackluster, to say the least. Nothing to say wow about here. I have a feeling that two or three of these in CrossFire, on water or phase change, is something I will have to hunt down myself on the internet.
Score
3
September 4, 2012 11:48:52 AM

Would like to see a comparison on three monitors against a Galaxy GeForce GTX 680 SOC White Edition...

You know, since an even number of monitors means you're looking at a lovely seam running right down the middle of your view.
Score
0
September 4, 2012 2:05:48 PM

"All of that makes Sapphire's Toxic HD 7970 GHz Edition an answer in search of a problem. We can’t think of a usage scenario for which we’d recommend it. If you really dig the effort Sapphire put into its Vapor-X cooling solution, we recommend you check out the Vapor-X HD 7970 GHz Edition 3 GB card, and use the difference to take your better half out to a nice dinner."

Really? I can think of one: CrossFire with six 30" monitors. If you're getting an expensive setup, why not go all the way? This is the Veyron of setups, after all!

And yes, it was a very unfair conclusion, since regular people are not the target customers for this kind of card. I thought Tom's had more enthusiast blood.

Cheers!
Score
3
September 4, 2012 2:22:49 PM

I heard the same thing said about 1 GB and 2 GB. Hang tight for a year or so and 6 GB will get there. High-resolution screens like Retina displays will also demand much, much more from our graphics cards as they become more popular. Only just now have IPS panels really started to be in demand for computer monitors, thanks to the iPad; next will be Retina-class displays.
Score
0
September 4, 2012 2:24:05 PM

Why not just hook it up to a 1080p projector... or THREE 1080p projectors?! That would be awesome!
Score
1
September 4, 2012 2:29:48 PM

How effective would the 6 GB of video RAM be for Microsoft's RemoteFX (serving multiple users, each with single or possibly dual displays)?
Score
0
September 4, 2012 3:06:03 PM

Who the heck would run at that rez in Eyefinity 6? Why not 5760x2160?
Score
-1
September 4, 2012 7:12:05 PM

I had trouble taking the results in when I saw they were done with the Catalyst 12.6 drivers, considering how much better 12.7/12.8 are.
Score
0
September 4, 2012 10:15:46 PM

But can it play Crysis?
Score
-1
September 5, 2012 1:16:01 AM

This is a monster!
Score
0
September 5, 2012 2:16:21 AM

TL;DR: Tom's hasn't tested the VRAM limit correctly, and Sapphire should be ashamed for providing only one sample for testing. Testing the extreme requires an extreme system build, not just an extreme monitor arrangement.

Maximising PC performance is all about alleviating the largest bottleneck, and then moving onto the next.

VRAM increases will not perform some dark magic. To really test 3 GB vs. 6 GB, there is one fundamental bottleneck you need to remove: GPU processing capacity. One GPU, even in this generation, is simply not designed to process maximum-detail graphics across a resolution space of three or more 1080p+ monitors. That premise also needs testing, but unless you put 2-4 of these graphics cards in CrossFire, you can't guarantee you've avoided this (most probable) bottleneck.

Once the fundamental bottleneck is reduced, you may discover secondary bottlenecks (before you hit the VRAM limit). Since you can now start maximising the resolution space to soak up the performance gained by adding 1-3 more graphics cards, you may find that the CPU and/or memory subsystems become the next-biggest bottlenecks, so go ahead and maximise those too: say, an overclocked hex-core hyperthreaded SB-E with 16 GB of 2000+ MHz DDR3.

Only now can you consider 3GB of VRAM as a potential bottleneck.

Then, in the back of your mind, should sit the concern that addressing double the VRAM over the same memory bus width introduces some overhead in referencing and accessing the additional memory space. This overhead grows more significant if you haven't removed the other bottlenecks first, and in extreme cases it could actually lower performance when a game is not utilising 3 GB of VRAM.
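
To put crude numbers on the GPU-capacity premise (pixel count is only a rough proxy for shading cost, so this is illustration, not measurement):

```python
# Relative pixel workload of multi-monitor setups vs. one 1080p screen.
# Pixel count is a crude proxy: per-pixel shading cost varies by game.
base = 1920 * 1080

setups = {
    "1 x 1920x1080":              1920 * 1080,
    "3 x 1920x1080 (5760x1080)":  5760 * 1080,
    "3 x 2560x1600 (7680x1600)":  7680 * 1600,
    "6 x 2560x1600 (7680x3200)":  7680 * 3200,
}
for name, pixels in setups.items():
    print(f"{name}: {pixels / base:.1f}x the pixels of a single 1080p screen")
```

Nearly 12x the pixel load of a single 1080p screen is simply beyond one GPU at maximum detail, which is why the VRAM question can't even be asked with a single card.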
Score
1
September 5, 2012 2:19:06 AM

What SHOULD have been compared:

3x (HD 7970 3 GB) + 3 monitors, vs.
3x (HD 7970 6 GB) + 3 monitors

The monitors should have been run at both 1920x1080 and 2560x1600.

As mentioned, a triple-monitor setup is the most logical use case for most gamers, and for that you need multiple cards.

Microstutter issues are also mitigated by using three cards instead of two.
Score
3
September 5, 2012 3:07:34 AM

One thing that's got me curious: heat rises, so why isn't the graphics processor on top of the card?
Score
-1
September 5, 2012 4:27:20 AM

This was a very half-hearted review. A graphics card with this much memory is obviously aimed at multi-card setups, which is exactly what you didn't test. Get back to me when you're ready to write a real review.
Score
3
September 5, 2012 9:24:14 AM

I find a review supposedly designed to test the effect of GPU VRAM quantity that doesn't even tell us how much VRAM was actually in use to be a bit lacking.

The setup looks good, and I congratulate Tom's Hardware on actually coming up with something that might tell us something about VRAM. But we don't know what the real limit is that's causing the low frame rates here, and without knowing how much VRAM was used, or why more wasn't used where it could have been, we are left not much more enlightened than before.
Score
0
September 5, 2012 11:00:28 AM

You people SERIOUSLY had to do this test on a non-PCIe 3.0 rig?!
It makes me kind of laugh!

And most people seem to have missed noticing it. A pity!
Score
0
September 5, 2012 4:35:18 PM

Thanks for pointing that out, shadyinc. I only skimmed the article, so I entirely missed that.

Sad.
Score
0
September 5, 2012 4:59:52 PM

The card does not have six video ports, only three. It looks like the vendor is strongly suggesting that this design is for three monitors per card. Seems a bit unfair.
Score
-1
September 5, 2012 7:47:58 PM

Seems like overkill. At 2x 1920x1080 on a 3 GB card, I have trouble getting even 2 GB to be used, and in those cases it takes forced AA and AF settings, which make the game lag even on high-end cards (when you force quality settings past what the game engine is designed for, you get rapidly diminishing returns).

If someone were to make a game that actually required 6 GB of RAM (e.g., using a ton of 8K textures, full tessellation on all objects and environments, and full mesh counts for all objects, instead of the standard practice of using a simple sphere or cube with some bump mapping for objects in the distance), then I believe this current 7970 card wouldn't even be able to run the game in a playable manner.

We will eventually get to where 3+ GB is the norm, but at the moment it is not (3 GB is a realistic upper limit, especially for 2560x displays, and we will need a ton more GPU power to run games that would require 6 GB of RAM).
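
For a sense of scale on those 8K textures (illustrative, uncompressed numbers; real engines use block compression, so treat this as a per-asset upper bound):

```python
# Size of one uncompressed 8192x8192 RGBA texture, with a full mip chain.
def texture_mb(size, bytes_per_texel=4, mipmapped=True):
    base = size * size * bytes_per_texel
    # A complete mip chain adds roughly 1/3 on top of the base level.
    return base * (4 / 3 if mipmapped else 1) / 1024**2

one_8k = texture_mb(8192)
print(f"One 8K RGBA texture with mips: ~{one_8k:.0f} MB")
print(f"8K textures that fit in 3 GB:  ~{3 * 1024 / one_8k:.0f}")
# Block compression (DXT/BCn) shrinks this 4-8x, but a library of
# 8K assets still adds up very fast.
```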
Score
0
September 5, 2012 7:57:54 PM

PS: I wonder why they can't find a way to make use of unused video memory. E.g., if your card has 2-3 GB of memory and you never use more than 1 GB, let users use the other 1-2 GB as a RAM disk.

CUDA and various other technologies let you use the GPU for many other things; why not give users some kind of driver that reserves a portion of the video memory as a RAM disk, or possibly as extra system memory?
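
Something along those lines is already doable from user space with a GPGPU API. A minimal toy sketch with PyOpenCL (assuming PyOpenCL and NumPy are installed; this just parks bytes in GPU memory and is not a real RAM disk, with no filesystem and no persistence):

```python
# Toy "VRAM scratch store": park bytes in GPU memory via OpenCL and
# read them back. This is NOT a real RAM disk; there is no filesystem,
# no persistence, and every access pays a PCIe round trip.
import numpy as np
import pyopencl as cl

ctx = cl.create_some_context()           # pick an available GPU
queue = cl.CommandQueue(ctx)

data = np.frombuffer(b"stash me in VRAM", dtype=np.uint8).copy()
vram = cl.Buffer(ctx, cl.mem_flags.READ_WRITE, size=data.nbytes)

cl.enqueue_copy(queue, vram, data)       # host -> GPU memory
restored = np.empty_like(data)
cl.enqueue_copy(queue, restored, vram)   # GPU memory -> host
queue.finish()

assert restored.tobytes() == data.tobytes()
print("Round-tripped", data.nbytes, "bytes through video memory")
```

A real driver-level RAM disk would need a block device built on top of something like this; as far as I know hobby projects have tried exactly that, but the PCIe round trips (plus the latency issue raised in the next comment) are why it never beat simply buying more system RAM.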
Score
0
September 6, 2012 6:05:43 AM

this looks awesome :) 
Score
0
September 6, 2012 10:45:49 AM

One word: Superfluous
Score
0
September 6, 2012 4:40:21 PM

The strength of this article is in your analysis of >1100 MHz performance, power, and noise. As always, well done.

The weakness is in your Eyefinity analysis, and it is so weak I can't believe you bothered including it. Right from the start you set out to test 6 GB vs. 3 GB while using only a single card. A single card won't game on six screens; the single GPU will bottleneck it so badly that there is no point to a VRAM comparison unless you put together a more realistic 2x or 3x CrossFire setup. This is obvious to any GPU enthusiast.

Your setup (and that odd use of a DVI splitter) is ridiculous. Six screens (three wide, stacked two high) isn't ideal for gaming, because instead of staring straight ahead and seeing your targeting reticle (and your enemy in your sights), you see your horizontal bezels. 6x1 has the same problem (but with vertical bezels) and is far too wide. Did you not notice that when testing Metro and Crysis? At that point you should have given up testing for a VRAM bottleneck and given us a decent three-screen analysis.

But you went to four screens. Same problem... double facepalm!

My point is that you did a good review of the card but a useless review of Eyefinity. The Eyefinity results show nothing; they don't even give a person an idea of what to expect when trying to game in Eyefinity with a single card. You never even tried to get playable frame rates out of a more realistic triple-monitor setup. And seriously, you guys had to have a geek in a computer store help you set up Eyefinity? What's with the dual-link DVI to 2x DVI splitter? Four of your monitors were hooked up via mini-DP adapters, which leaves the two DVI ports available for the other two monitors... I mean, jeez.
Score
1
September 8, 2012 8:28:53 PM

freggo said:
Anyone know why no card has been designed to be turned off completely (0 watts!) when idle, with the system switching to integrated graphics for desktop work or simple tasks?
Then applications like Photoshop, Premiere, or the ever-popular Crysis could 'wake up' the card and have the system switch over. Or are there cards like that?

Look up what LucidLogix does as a company... I forget the name of the particular technology, but I remember a video where they showed that they can really power off a graphics card, and even remove it from the system, without rebooting.

The thing is, though, the monitor was plugged into the motherboard, which makes sense if this is true. I would think a discrete graphics card would still be responsible, one way or another, for rendering even your Windows desktop; one point that makes me think this is the fact that your monitor is plugged into the discrete card. Putting it plainly: how would the screen image get from your system to your monitor if the card it's plugged into is off (rhetorical question)? :-)

The AMD GCN-architecture HD 7000-series cards, AFAIK, have a feature (ZeroCore Power) that saves even more electricity when your monitor is off. But maybe they can't power the card off entirely, since for one thing it has to detect when the monitor comes back on, be up and ready when you decide to turn the monitor on, and maybe handle some other necessary processing tasks. What Lucid's technology does is switch tasks to the integrated graphics, like you mentioned, I think. There might be other technicalities (limitations like drivers and just how things work) that prevent powering off the discrete card when it isn't needed, and maybe that's where Lucid's "magic" comes in. Hehe...

I hope I was able to provide a lead for you. :-)
Score
0
September 9, 2012 6:32:34 AM

Hmmm, I had a DOS program back in the day that let me set up a RAM disk in the unused RAM on a 512K EGA Wonder card. It would be great if someone came up with a program to utilize the GDDR5 as a 3 GB RAM-disk pagefile. Otherwise, really, it's all about the e-peen. I mean, come on, 60 decibels is wack.

Fungi
Score
0
September 9, 2012 8:27:48 AM

funguseater said:
Hmmm, I had a DOS program back in the day that let me set up a RAM disk in the unused RAM on a 512K EGA Wonder card. It would be great if someone came up with a program to utilize the GDDR5 as a 3 GB RAM-disk pagefile. [...]

OpenCL or another GPGPU API might actually be able to do that, but I don't really know. One thing I've heard is that GDDR RAM has a high clock rate but also very high latency. I think this may be bad for normal data, compared to the graphics workloads GPUs handle. Just speculation, based on the fact that GDDR isn't used as normal system RAM. :-)
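
For what it's worth, the raw throughput side is easy to put numbers on (peak datasheet-style figures, which say nothing about the latency question):

```python
# Peak theoretical memory bandwidth: this card's GDDR5 vs. a typical
# dual-channel DDR3-1600 desktop. Peak figures only; they say nothing
# about access latency.
def bandwidth_gb_s(bus_width_bits, gbps_per_pin):
    return bus_width_bits / 8 * gbps_per_pin

gddr5 = bandwidth_gb_s(384, 6.0)   # HD 7970 GHz Edition: 384-bit @ 6 Gbps
ddr3  = bandwidth_gb_s(128, 1.6)   # dual-channel DDR3-1600
print(f"GDDR5: ~{gddr5:.0f} GB/s vs. DDR3: ~{ddr3:.1f} GB/s")  # 288 vs 25.6
```

Over 10x the peak bandwidth of dual-channel DDR3-1600, which is exactly why it looks tempting as a RAM disk despite the latency.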
Score
0
Anonymous
September 10, 2012 7:01:04 PM

This thing's exhaust looks like the rear engine view of a Star Destroyer.
Score
0
Anonymous
September 12, 2012 9:32:30 PM

If you didn't like the OC fan profile/noise (too aggressive), why didn't you just use TriXX to set your clocks and fan profile the way you wanted? It does that really well.
Score
0
September 13, 2012 11:12:45 PM

want.... this..... now......
Score
0
Anonymous
October 5, 2012 11:19:37 AM

"we recommend you check out the Vapor-X HD 7970 GHz Edition 3 GB card, and use the difference to take your better half out to a nice dinner."

I did, and I found the scenario you were looking for: the 3 GB version only supports 3-monitor Eyefinity setups (unless you throw an expensive DP splitter into the mix). With the 6 GB version, you get 4-monitor support almost out of the box: an active mini-DP to DVI converter comes bundled with the card, which means you only need to add another one (about 10 EUR) to drive four "cheap" DVI monitors.

The splitters cost more than the price difference between the 3 GB and 6 GB versions, and you get 3 GB of extra VRAM in the deal.
Score
0
Anonymous
October 24, 2012 4:34:26 AM

"as Sapphire needed its card back". wow i always assumed the let you have them for as long as you needed them. i dont get how 1 or 2 cards make a difference to the manufacturer. pretty lame.
Score
-1
November 14, 2012 3:39:27 AM

LOVE this card, but I'm not sure whether to get this one, MSI's, or Asus's. I will keep learning.
Score
0