Your question
Closed

Which will perform better?

Last response: in Graphics & Displays
September 23, 2011 5:35:01 PM

Which can I expect to perform better?

Two (2) of these:
XFX HD-677X-ZNFC Radeon HD 6770 1GB 128-bit GDDR5 PCI Express 2.1 x16 HDCP Ready CrossFireX Support Video Card
http://www.newegg.com/Product/Product.aspx?Item=N82E168...

Or, one (1) of these:
XFX HD-687X-CNFC Radeon HD 6870 2GB 256-bit GDDR5 PCI Express 2.1 x16 HDCP Ready CrossFireX Support Video Card
http://www.newegg.com/Product/Product.aspx?Item=N82E168...

My system has these three monitors that I would like to use for an Eyefinity gaming experience:

Asus ML228H 21.5" 1920X1080 2ms Full HD LED BackLight LED Monitor Slim Design 250 cd/m2 10,000,000 :1 (ASCR)
http://www.newegg.com/Product/Product.aspx?Item=N82E168...

APPROXIMATE PURCHASE DATE: Not applicable.
BUDGET RANGE: As little as possible, obviously.
USAGE FROM MOST TO LEAST IMPORTANT: Gaming (Sword of the Stars II, Unreal Tournament), Office, Internet
CURRENT GPU AND POWER SUPPLY: Not applicable (new system)
OTHER RELEVANT SYSTEM SPECS: AMD CPU, probably ASUS or Gigabyte motherboard
PREFERRED WEBSITE(S) FOR PARTS: Newegg.com
PARTS PREFERENCES: AMD or ATI
OVERCLOCKING: No
SLI OR CROSSFIRE: Maybe (What do you think?)
MONITOR RESOLUTION: 1920x1080 (three monitors)


a c 1401 U Graphics card
September 23, 2011 6:28:50 PM

When Crossfire is supported, the 2x6770 will be more powerful.
September 23, 2011 8:37:47 PM

This is just an example. I'm trying to understand how to properly select a graphics card for my system. As far as performance is concerned, it looks like using 2 cards with half the processing power will generally win out over a single card with twice the power.
a c 1401 U Graphics card
September 23, 2011 8:57:27 PM

pcpro178 said:
This is just an example. I'm trying to understand how to properly select a graphics card for my system. As far as performance is concerned, it looks like using 2 cards with half the processing power will generally win out over a single card with twice the power.

Incorrect. Having two cards does not double the power; a 90%+ increase is good. A single card that is truly double the power will win over a dual-card solution. You should be comparing the 2x6770 with an HD6950.
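The comparison above can be sketched roughly in Python. The 0.9 scaling factor is just the "90%+" estimate from this post, and the performance units are placeholders, not benchmark data:

```python
# Rough sketch: Crossfire adds most, but not all, of a second card's
# performance. scaling=0.9 reflects the ~90% estimate above.
def crossfire_effective(single_card_perf, scaling=0.9):
    """One card's performance plus a scaled contribution from the second."""
    return single_card_perf * (1 + scaling)

hd6770 = 1.0                              # placeholder performance unit
dual_6770 = crossfire_effective(hd6770)   # ~1.9x one card
truly_double_card = 2.0 * hd6770          # a single card twice as fast

# A card that is genuinely double always edges out the dual-card setup.
assert truly_double_card > dual_6770
```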
September 23, 2011 9:18:35 PM

rolli59 said:
Incorrect. Having two cards does not double the power; a 90%+ increase is good. A single card that is truly double the power will win over a dual-card solution. You should be comparing the 2x6770 with an HD6950.


Granted, but does that hold true if I want to spend no more than a given amount on graphics hardware? For example, let's say that I have $500 to spend. Would I be better off buying two $250 cards in Crossfire, or a single $500 card? In terms of power per buck, the two-card option seems better.

What it really comes down to is this: I want to drive my three monitors for an Eyefinity gaming experience. My monitors have a native resolution of 1920x1080. I do not want to spend $7000 on a graphics card. I want the best bang for my buck.

This is the card I'm looking at right now:

XFX HD-687X-CNFC Radeon HD 6870 2GB 256-bit GDDR5 PCI Express 2.1 x16 HDCP Ready CrossFireX Support Video Card
http://www.newegg.com/Product/Product.aspx?Item=N82E168...

Would I need two of these, or would one be sufficient?
a c 1401 U Graphics card
September 23, 2011 9:26:04 PM

For Eyefinity, which makes the resolution bigger, I would recommend a minimum of 2x HD6850 in Crossfire. So, to answer your question, you will need 2 cards to enjoy gaming with 3-monitor Eyefinity. Then again, I would recommend, as here, 2x HD6950 2GB for gaming in Eyefinity: http://www.tomshardware.com/reviews/gaming-performance-...
a c 376 U Graphics card
September 23, 2011 9:26:55 PM

The least you should consider for Eyefinity gaming at that resolution is a 2gb HD 6950. If you can afford it, 2 in Crossfire would certainly help.
September 23, 2011 9:36:37 PM

I would strongly recommend two 6790's for gaming. They will perform considerably better than two 6770's. At newegg.com they are selling for $99 right now. I would choose that over any of your choices.
Take the cards quick before the rebate (if you choose to use it) goes away. Even if it does, it will only cost you $4 more than the 6770's. With free shipping it will be $7 cheaper for both.



http://www.newegg.com/Product/Product.aspx?Item=N82E168...
a b U Graphics card
September 23, 2011 10:03:43 PM

Keep in mind 6770's are only 128bit, not 256bit.

I would still take a single 6950 2gb over any of the choices mentioned.
a c 376 U Graphics card
September 24, 2011 12:46:35 AM

Friznutz said:
I would strongly recommend two 6790's for gaming. They will perform considerably better than two 6770's. At newegg.com they are selling for $99 right now. I would choose that over any of your choices.
Take the cards quick before the rebate (if you choose to use it) goes away. Even if it does, it will only cost you $4 more than the 6770's. With free shipping it will be $7 cheaper for both.

The 1gb of memory on the HD6790 is inappropriate for the large resolution (5760x1080) the OP will be using. Also, the rebate on those cards is limited to one per person.
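Some back-of-envelope framebuffer math for the resolutions above (32-bit color, a single buffer; real games need far more VRAM for textures and render targets, so this only illustrates the 3x scaling, not an actual budget):

```python
# Raw framebuffer size in MB for a given resolution (4 bytes/pixel).
def framebuffer_mb(width, height, bytes_per_pixel=4):
    return width * height * bytes_per_pixel / (1024 * 1024)

single_1080p = framebuffer_mb(1920, 1080)   # one 1080p monitor
eyefinity = framebuffer_mb(5760, 1080)      # three 1080p monitors side by side

assert eyefinity == 3 * single_1080p        # triple the pixels to fill per frame
```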
September 24, 2011 1:04:02 AM

You make no sense.
There will be the same amount of VRAM altogether, and 1 screen will get a full 1gb of VRAM. If you get a 2gb card, each monitor will get about 666mb. And who cares about the rebate that much? It will only cost you $4 more than a standard 6770. You really must read my post before saying the stuff you say.

Here it is if you still don't understand.

Monitor 1 - the main gaming monitor, which will display the center view - supplied by one 6790 (for $120), with a full 1gb of VRAM pushed to that monitor.
Monitors 2+3 - the side monitors, both using the second 6790 ($99), each receiving 512mb of VRAM. Those monitors will most likely not be displaying the UI and you will most likely not focus on them.

Two 6790's will have a combined 1600 stream processors, with 2x 840mhz combined GPU speed. (Monitor 1 will get a full 800 stream processors, 840mhz core speed and 1gb of VRAM; monitors 2+3 will get 400 stream processors each, 420mhz core speed, and 512mb of VRAM each. This is all a theoretical split.)

If you have a 6950 you get 1408 stream processors, with only 1x 870mhz core speed. (Each monitor will theoretically get 469 stream processors and have to share 2gb of VRAM, 666mb each.)


That makes your main display, which will most likely display most of the game data, the one with the most power.
6790: maybe lag on screens 2+3
6950: if there is lag, it will be on all screens
a c 105 U Graphics card
September 24, 2011 1:36:58 AM

For Eyefinity you'll need a good card with 2gig of memory... the 6950 minimum. Putting in 2 cards that have 1gig of memory each does NOT = 2gigs and will be worthless. This is big-card territory. I have my doubts about that 68 series you posted; it's a different architecture than the 6900 series. Crossfiring two 2gig 6900-series cards should be the ticket if you can afford it.
September 24, 2011 2:23:50 AM

http://forums.anandtech.com/showthread.php?t=2157093

In every review, 6790 Crossfire is better than a 6950 1gb and 2gb. And why would you say to crossfire two 6900's? Why don't you read what he says again and not recommend things that he did not ask for?


EX: If you ask a car dealer for a Honda Accord and he says, "No. What you want is a Ferrari. Can you afford it?"
I would say, "Um... are you an idiot? I am here looking for a well-priced car that will get me places, not something I didn't ask for!"
Then I would leave.

I have noticed a ton of people saying that kind of stuff. For example:
"I am looking for a $100 video card that can run games."
Answer: "If you can get like a 6990, get that. It runs all games at max settings."
I have seen stuff like that and it is extremely stupid.
a c 105 U Graphics card
September 24, 2011 2:26:35 AM

your immaturity and lack of knowledge about computer hardware is showing through.
a c 164 U Graphics card
September 24, 2011 2:29:04 AM

My two cents:

1) SLI and Crossfire. STAY AWAY! There are micro-stutter issues. Read the recent Tomshardware article. There are other issues as well.

2) Three monitors:
Read the details of hooking them up carefully to ensure your monitors have the required inputs. I'm not sure if you still require a DisplayPort to get the third monitor working.

3) I'd probably recommend gaming on your main screen but that's your choice and it will depend on if your gaming card can support three screens fast enough. I simply don't like the BEZEL issue, but three is much better than two because with two screens the bezels are right in the middle.

Example of recommended cards:
HD6950 2GB

HMMMM....

I have been investigating whether the internal DUAL GRAPHICS of this card has the same micro-stutter issues as Crossfiring two single-GPU cards and couldn't find any answers. I found crossfire of two HD6950's but that's FOUR GPU's.

Based on your needs I'd have to go with this card. I simply can't find a better solution for the price. If I recommend a similarly-performing single-GPU AMD card then it would require DisplayPort for the third monitor.

To be clear on outputs:
1) 4x native, and
2) up to 6x with DisplayPort

That's double (it's basically two cards integrated) the previous 2x DVI + 1x DisplayPort you needed for three monitors.

(NVidia supports only two monitors from any of its single cards but is changing this in Q1 or Q2 of 2012 with its GTX600 series).

Summary:
HD6950 2GB recommended.
September 24, 2011 2:39:46 AM

swifty_morgan said:
your immaturity and lack of knowledge about computer hardware is showing through.




And your lack of complete facts and your stupid recommendations are absurd.
Yes, I would agree 2 6900's would be IDEAL, but that doesn't mean I would tell him that, because he can only afford half of that.

I may be an asshole but I am not immature.

If I am wrong (which I probably am) please elaborate for future reference.
And please don't start off with "Yes. You are wrong."
a c 376 U Graphics card
September 24, 2011 2:54:54 AM

Friznutz said:
You make no sense.
There will be the same amount of VRAM altogether, and 1 screen will get a full 1gb of VRAM. If you get a 2gb card, each monitor will get about 666mb. And who cares about the rebate that much? It will only cost you $4 more than a standard 6770. You really must read my post before saying the stuff you say.

Here it is if you still don't understand.

Monitor 1 - the main gaming monitor, which will display the center view - supplied by one 6790 (for $120), with a full 1gb of VRAM pushed to that monitor.
Monitors 2+3 - the side monitors, both using the second 6790 ($99), each receiving 512mb of VRAM. Those monitors will most likely not be displaying the UI and you will most likely not focus on them.

Two 6790's will have a combined 1600 stream processors, with 2x 840mhz combined GPU speed. (Monitor 1 will get a full 800 stream processors, 840mhz core speed and 1gb of VRAM; monitors 2+3 will get 400 stream processors each, 420mhz core speed, and 512mb of VRAM each. This is all a theoretical split.)

If you have a 6950 you get 1408 stream processors, with only 1x 870mhz core speed. (Each monitor will theoretically get 469 stream processors and have to share 2gb of VRAM, 666mb each.)


That makes your main display, which will most likely display most of the game data, the one with the most power.
6790: maybe lag on screens 2+3
6950: if there is lag, it will be on all screens

I'm unsure where you got all this from (did you just make it up?), but basically nothing you've said above is accurate. That is simply not how Crossfire works or how gaming on multiple screens is rendered. In Crossfire the multiple GPUs are treated as a unit, each having its own memory, which is redundant so that the cards all have the same information available and can properly work together. The 3 screens are also treated as one large scene by the GPU(s). There is no splitting of the cards' memory/shader resources as you describe to render each screen separately.
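A quick sketch of the alternate-frame rendering scheme this describes: GPUs take turns rendering whole frames, and no GPU "owns" a particular monitor; the three screens are part of one large scene within each frame.

```python
# Alternate-frame rendering (AFR): frame N goes to GPU (N mod num_gpus);
# each frame spans the entire multi-monitor scene.
def assign_frames_afr(num_frames, num_gpus=2):
    """Which GPU renders each frame index under AFR."""
    return [frame % num_gpus for frame in range(num_frames)]

# With 2 GPUs they simply alternate whole frames:
assert assign_frames_afr(6) == [0, 1, 0, 1, 0, 1]
```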
a c 164 U Graphics card
September 24, 2011 2:55:49 AM

Multiple Displays:
http://www.hardwarecanucks.com/forum/hardware-canucks-r...

Same article as I linked above. I just wanted to say that if you don't have a Displayport connection you would likely connect like this:

Display #1: DVI->DVI
Display #2: DVI->DVI
Display #3: HDMI->DVI

*DVI-I is a set of wires containing both the wires for DVI (digital) and VGA (analog). A DVI-I->VGA "adapter" is simply connecting to the VGA connections.

The HDMI output is DVI+audio. The audio is achieved through decoders on the HD6850/70 board and only works with audio formats for movies (no Windows or game sounds). There may be scenarios where this works, but I just avoid the entire scenario.

However, I recommend the HDMI->DVI, which just passes the video; get your audio straight from the motherboard's audio solution or an add-on audio card such as a Creative or Asus card. With good quality speakers (at least $50 stereo or $100 surround) you would see a big difference in quality. M-Audio AV30 or AV40 are really great stereo speakers. You can get more bass with a subwoofer setup (2.1), but that also carries throughout the house more by vibrating the floor. If in a basement I'd definitely go with a quality 2.1 setup and an inexpensive but quality add-on sound card.
a c 376 U Graphics card
September 24, 2011 3:04:24 AM

photonboy said:
1) SLI and Crossfire. STAY AWAY! There are micro-stutter issues. Read the recent Tomshardware article. There are other issues as well.

Do you have actual experience with SLI/Crossfire or did you take something said in an article you read WAY too seriously?
September 24, 2011 3:14:31 AM

jyjjy said:
I'm unsure where you got all this from(did you just make it up?) but basically nothing you've said above is accurate. That is simply not how Crossfire works or how gaming on multiple screens is rendered. In crossfire the multiple GPUs are treated as a unit, each having its own memory which is redundant so that the cards all have the same information available and can properly work together in conjunction. The 3 screens would also be treated as one large scene by the GPU(s). There is no splitting of the cards memory/shader resources as you describe to render each screen separately.




So it still completely evens out all data being sent to all displays through both cards?

So if GPU #1 is at 100% load and GPU #2 is at 50% load, card #2 would boost up its speed and send data to card #1?

I used to Crossfire some older cards and it was not like that.
If GPU #1 was at 100%, GPU #2 did nothing about it.
Well, I am sorry if I misunderstood how modern Crossfire works and I will admit a strange defeat (even though this wasn't a debate).

Still, in that case, two 6790's will have more stream processors combined than one 6950, or what he really wants, a 6870.

Or do they not share their memory for some strange reason, therefore making only 1gb of VRAM usable if all displays are connected to card #1?
a c 376 U Graphics card
September 24, 2011 3:30:39 AM

Each card has and needs its own memory and its own memory bus on its own PCB. Sharing memory through the Crossfire bridge would be slow and unworkable. Crossfire works by having each card render every other frame, so they should consistently have approximately equal workloads. This is not new. I'm not sure what you are talking about with "If GPU #1 was at 100%, GPU #2 did nothing about it," but perhaps you were looking at a game in which Crossfire was not actually supported.
You cannot just add the number of stream processors on two cards in SLI/Crossfire and compare that to a single card. Crossfire does not scale perfectly, and how much it does scale will vary from game to game.
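The point about summing stream processors can be sketched like this. The scaling factors below are made-up placeholders, not measurements:

```python
# Effective shader throughput under Crossfire depends on per-game
# scaling, so naively summing stream processors overstates it.
def effective_units(per_card_units, scaling):
    """Throughput of two cards: one card plus a scaled second card."""
    return per_card_units * (1 + scaling)

# Two hypothetical 800-SP cards:
assert round(effective_units(800, 0.9)) == 1520   # well short of a naive 1600
assert effective_units(800, 0.0) == 800           # unsupported game: no gain
```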
a b U Graphics card
September 24, 2011 1:57:38 PM

Friznutz said:
http://forums.anandtech.com/showthread.php?t=2157093

In every review, 6790 Crossfire is better than a 6950 1gb and 2gb. And why would you say to crossfire two 6900's? Why don't you read what he says again and not recommend things that he did not ask for?


EX: If you ask a car dealer for a Honda Accord and he says, "No. What you want is a Ferrari. Can you afford it?"
I would say, "Um... are you an idiot? I am here looking for a well-priced car that will get me places, not something I didn't ask for!"
Then I would leave.

I have noticed a ton of people saying that kind of stuff. For example:
"I am looking for a $100 video card that can run games."
Answer: "If you can get like a 6990, get that. It runs all games at max settings."
I have seen stuff like that and it is extremely stupid.


You, my friend, are not understanding the OP's needs. Your link shows 1920x1080; the OP is running THREE MONITORS IN EYEFINITY. That would be 5760x1080. You would need 2GB cards for that.
a c 104 U Graphics card
September 24, 2011 2:39:51 PM

LOL. +1 for Geek. Since we're talking in car terms: when I go to my car dealer and ask for a car that has a top speed of 340 km/h but only want to buy a car like a Toyota Prius, I hope this dealer knows what he's talking about and tells me to buy the Ferrari, because no way will this Prius give me the speed I need.
a c 164 U Graphics card
September 25, 2011 6:41:11 PM

Micro-stutter:
http://www.tomshardware.com/reviews/radeon-geforce-stut...

Read this entire last page. In particular:
"Cards like the GeForce GTX 550 Ti seem to be wasted on a SLI setup for this reason alone, achieving decent frame rates in the charts and nasty micro-stuttering in the real world."

Back to the main thread:
I stand by my research indicating a SINGLE HD6950 2GB or HD6970 seems to be his best solution.

I can't recommend gaming on all three monitors for most games due to performance. I recommend:
1) gaming on the main monitor, and
2) using all three monitors for other tasks.

Gaming on all three monitors without sacrificing quality requires more processing power (this card can max out some games and is close in others, except for Shogun 2, which is the new Crysis in terms of benchmarking).

Gaming on Eyefinity has too many issues for me to consider it:
1) Bezel
2) Crossfire required (micro-stutter)
3) Power, heat and NOISE!

If we look at the benchmark for Dirt 2, at 1920x1200 (single monitor):
http://www.hardwarecanucks.com/forum/hardware-canucks-r...

Notice that it's getting just above 60FPS, which is ideal because we want 60FPS and VSYNC eats a few frames to maintain sync, so this game is maxed out and getting the proper frame rate.

Imagine what would happen if we ran this on THREE monitors. We'd have to drop the quality down a lot. So basically, if we used THREE monitors at 1920x1080 or 1200, we'd need 3x the processing power. In reality you'd probably use 2x HD6970 2GB (instead of the HD6950 I was referring to) and drop the quality slightly. That brings us back to the problems of Crossfire mentioned above, though.
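The "3x the processing power" rule of thumb above is just the pixel-count ratio between triple-1080p Eyefinity and a single 1080p screen:

```python
# Pixel-count ratio between a wide multi-monitor resolution and a
# single screen; a first-order proxy for the extra fill work needed.
def pixel_ratio(wide_w, wide_h, single_w, single_h):
    return (wide_w * wide_h) / (single_w * single_h)

# Triple 1080p vs one 1080p screen: exactly 3x the pixels per frame.
assert pixel_ratio(5760, 1080, 1920, 1080) == 3.0
```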

Summary:
Again I recommend a single HD6950 2GB or HD6970 2GB. I'm a fan of Sapphire when it comes to cooling if the cost isn't too much more. MSI, Gigabyte and Asus are other good quality brands but look at the NOISE.

Once I choose the model of card I then base the actual card on:
1) Quality company (Sapphire, Asus, MSI, Gigabyte, etc.) for a good board, capacitors, and voltage regulators

2) good cooling (quieter card and longer-lasting fan)

3) Cost (may be a sale on one brand)
a c 164 U Graphics card
September 25, 2011 6:58:23 PM

SALE AT NCIX USA online:

http://www.ncix.com/products/index.php?sku=62195&vpn=11...

This Sapphire HD6950 2GB is a little too expensive IMO normally but it's on sale now for $270 with a free copy of Deus Ex Human Revolution.

(The XFX card is $40 cheaper, but the Sapphire has a much better cooling solution and its overall quality is probably higher.)

http://www.sapphiretech.com/presentation/product/?cid=1...

The card I linked you to seems to be the "Dirt 3" version at Sapphire's site if the cooling solution is the same (I went by the pictures provided). It made me wonder if you could substitute Dirt 3 for Deus Ex HR if you wanted. (Deus Ex HR wasn't mentioned at Sapphire's site, so I'm not sure what's going on exactly.)
September 25, 2011 10:48:57 PM

So... NOT comparing to the constantly recommended and obviously needed 6950 or higher.

-------------- The 6870 2gb will be faster than 2x 6790? --------------

I am simply answering his question and not adding a different price range.
(But if it was the 6950 vs 2x 6790's as the initial question, I would have strongly recommended the 6950 like many others do. I am no idiot with this, like people have implied. I was simply comparing exactly the price range he asked about: around $240, NOT $290.)

So again, I would completely recommend saving up for a 6950, but if you want your straight-up answer...
I would like some people to weigh in on his original question, 2x 6790 (same price as 6770's) vs the 6870 2gb, and not just say 6950.
a c 376 U Graphics card
September 25, 2011 11:04:33 PM

Friznutz said:
I was simply comparing exactly the price range he asked about: around $240, NOT $290.

The 2 HD6790s you suggested would be $260 initially and $230 after rebate (unless he uses 2 accounts and 2 addresses to use the rebate on both cards). I don't know where you got the $290 number from, but the HD6950 2gb is also available for $260 on newegg, or there is one for $240 after rebate. The price difference is negligible, and the 1gb of ram on the HD6790 is still inappropriate for the resolution the OP plans to use.
September 26, 2011 6:51:46 PM

Thanks for the feedback, everyone. I'm now looking at using this HD 6950 card from HIS:

HIS IceQ X Turbo H695QNT2G2M Radeon HD 6950 2GB 256-bit GDDR5 PCI Express 2.1 x16 HDCP Ready CrossFireX Support Video Card with Eyefinity
http://www.newegg.com/Product/Product.aspx?Item=N82E168...

It says that its max resolution (for a single monitor) is 2560x1600. However, my three 1920x1080 monitors would have a combined resolution of 5760x1080. Would a single 6950 be enough?

Also, can anyone tell me how much power this card sucks up at peak load? I want to make sure my power supply can handle it!

::EDIT::

I think that I found its power consumption, 200W, according to this site:
http://www.graphicscardbenchmarks.com/index.cfm

Best solution

a c 164 U Graphics card
September 26, 2011 9:54:11 PM

If you don't use Displayport, then this card will support three monitors up to 1920x1200 for each monitor.

It can support a monitor of 2560x1600 with dual-link DVI. I have an HD5870 and use a single DVI output for 2560x1440, and had my second DVI output hooked up to an HDTV at the same time, so I don't know what the dual-link part refers to. See your motherboard manual, I guess.

The bottom line is that this card easily supports 3x 1920x1200 (or 3x 1920x1080).

You'll likely do this:
1. DVI->DVI cable
2. DVI->DVI cable
3. HDMI->DVI cable

The HDMI->DVI cable removes the audio (which likely wouldn't be enabled anyway, as it's only for the onboard audio decoders for video), but you need to use it for the third output.

AMD link:
http://www.amd.com/us/products/desktop/graphics/amd-rad...

From what I've read, with DisplayPort enabled you could connect up to SIX MONITORS so this card would support up to:
3x1920x1200 + 3x2560x1600

(I don't know if they drop the 2560x1600 down if you use all six monitors.)

Power supply:
It's not just the Wattage, but also the Amps for the +12V rail or rails. I know a GTX570 needs 38Amps (this has more RAM but is slower). You'd be perfectly safe with 48Amps or higher. There's a quality, inexpensive PSU called the "Antec 620W High Power" which has 48Amps which would suit your needs. It was on sale at NCIX recently. I recommend any quality brand PSU with at least 600W and 48Amps.

SUMMARY:
- Yes, this card will work.
- PSU: 600 Watts, 48Amps minimum (quality brand)
a c 164 U Graphics card
September 26, 2011 10:03:15 PM

Cables:
Monoprice (USA) and Cablesalescanada (Canada) are great sites for cables and similar items. There may be other sites.

My HDMI v1.4a cables for my HDTV were $5 each and tested with 3D. Computer cables like DVI-DVI and HDMI->DVI are similarly inexpensive.
a c 164 U Graphics card
September 26, 2011 10:28:13 PM

At the risk of being flamed for multiple replies, for COMPLETENESS and to avoid aggravation I'll add this:

Color issues I've discovered and how to correct:

1. install monitor drivers
2. disable advanced video options like "flesh-tone correction" or your video may look grainy and washed out (video should look the same with hardware acceleration ON or OFF). CCC-> Video -> Video Settings
3. enable EDID (applies correct color options given by monitor drivers) "my digital flat panels" ->display colors
4. I experimented with Contrast and Brightness but was shocked at how easily my color changed. I believe light-grey changed to PINKISH with only a small change in brightness. I went back to the default monitor settings (not sure if this is a common monitor issue or not.)

*I discovered AMD had enabled these video "enhancements" after version 10.9 of the drivers, and hardware acceleration of movies looked horrible (especially in dark scenes). I turned acceleration off (higher CPU usage) until I finally discovered the cause, disabled these settings, and now my video looks correct.
September 27, 2011 9:37:37 PM

Best answer selected by pcpro178.
September 27, 2011 9:41:22 PM

photonboy said:
Power supply:
It's not just the Wattage, but also the Amps for the +12V rail or rails. I know a GTX570 needs 38Amps (this has more RAM but is slower). You'd be perfectly safe with 48Amps or higher. There's a quality, inexpensive PSU called the "Antec 620W High Power" which has 48Amps which would suit your needs. It was on sale at NCIX recently. I recommend any quality brand PSU with at least 600W and 48Amps.

SUMMARY:
- Yes, this card will work.
- PSU: 600 Watts, 48Amps minimum (quality brand)


Why is 48A needed on the +12V PCI-E power rails? 200W (peak load for HD6950) at 12V is about 17A (i.e. 200 / 12 = 16.667). Is there a reason I should expect more current draw?
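The arithmetic in this question, as a one-line sketch: current on the +12V rail for a given card wattage is I = P / V.

```python
# Convert a card's power draw in watts to amps on the +12V rail.
def amps_at_12v(watts):
    return watts / 12.0

# 200W peak load works out to roughly 17A, as stated above:
assert round(amps_at_12v(200), 1) == 16.7
```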
a c 164 U Graphics card
September 28, 2011 1:52:34 AM

A graphics card has TWO power numbers. Don't try to calculate them yourself.

1) Peak WATTAGE
2) Peak Amperage

For example, they'll simply list a GTX570 as requiring 38Amps on the +12V rail so you want to be on the safe side and get more than that.

If you got TWO cards in a system you would do something like this:
1) Start with about 400Watts (computer without graphics)
2) Add the PEAK WATTAGE for each card (i.e. 2x200Watts or whatever)
3) Add the Amps for each card (i.e. 2x38Amps)
4) Multiply by 1.25x to be on the safe side.

*Again, don't try to guess, go with the actual Wattage and Amps listed for the graphics card (or cards).
a c 376 U Graphics card
September 28, 2011 3:29:40 PM

photonboy said:
A graphics card has TWO power numbers. Don't try to calculate them yourself.

1) Peak WATTAGE
2) Peak Amperage

For example, they'll simply list a GTX570 as requiring 38Amps on the +12V rail so you want to be on the safe side and get more than that.

If you got TWO cards in a system you would do something like this:
1) Start with about 400Watts (computer without graphics)
2) Add the PEAK WATTAGE for each card (i.e. 2x200Watts or whatever)
3) Add the Amps for each card (i.e. 2x38Amps)
4) Multiply by 1.25x to be on the safe side.

*Again, don't try to guess, go with the actual Wattage and Amps listed for the graphics card (or cards).

Or you could actually look up how much the card(s) and the rest of your system will actually use... 400w for a computer without graphics is simply wrong, as are the "listed" requirements for video cards from the manufacturers. No offense, but if you don't understand the topic at hand it is rather bad form to give out advice as if you do... and suggesting other people avoid trying to figure out actual power requirements because you don't know how to is even worse.
Wattage and amperage are in fact two expressions of the same thing here. Graphics cards use the +12v rail of a power supply, and Voltage x Current = Power. In this case the voltage is 12v, so you just multiply the required amperage by 12 to get the wattage (or divide the wattage by 12 to get the required current). Peak wattage is usually gotten from reviews where they measure the actual power usage of the card. The listed amperage is what the manufacturer recommends for the ENTIRE system, and it is usually quite inflated to account for the vast number of shoddy PSUs out there with ratings you cannot trust.
So, following your 4 steps, what you've said above is for people to:
1) Start with 400w (which is way too high really)
2) Add the power the cards use, 2x 200w (the GTX 570 actually maxes out around 250w under stress, but we'll use the 200w): 400w + (2 x 200w) = 800w
3) THEN add the manufacturer's inflated recommendation for the entire system: 2 x 38a x 12v = 912w; 912w + 800w = 1712w
4) THEN multiply it by 1.25 for some reason: 1.25 x 1712w = 2140w
2000w+ PSUs don't actually exist to my knowledge, and that's almost 3 times what is actually reasonably required for a system with SLIed GTX 570s. Personally I would recommend a unit from a good brand with 750w available on the +12v rail(s).
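The sizing approach being argued for here, as a sketch: sum measured peak draws from reviews (not the manufacturers' whole-system amperage ratings), then add headroom. The component wattages below are illustrative placeholders, not measurements.

```python
# Rough PSU sizing: (rest-of-system draw + per-GPU peak draw from
# reviews) times a headroom factor.
def psu_estimate_w(base_system_w, gpu_peak_w, num_gpus=1, headroom=1.25):
    return (base_system_w + num_gpus * gpu_peak_w) * headroom

# ~200w rest-of-system plus two ~250w-peak cards lands in the same
# ballpark as the 750w+ recommendation above:
assert psu_estimate_w(200, 250, num_gpus=2) == 875.0
```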
a c 376 U Graphics card
September 28, 2011 3:34:46 PM

pcpro178 said:
Also, can anyone tell me how much power this card sucks up at peak load? I want to make sure my power supply can handle it!

The HD6950 uses about 150w on average during normal gaming and about 180w under stress testing with programs like Furmark. If you want to give it a high OC, add about 30-40w to each of those numbers.
a b U Graphics card
September 28, 2011 6:15:43 PM

This topic has been closed by Saint19