Your question

5 way Crossfire

Last response: in Graphics & Displays
April 10, 2012 7:17:37 AM

G'day all,

Any ideas whether the above contraption could work?

What I am looking at is having a 6970 in between 2 6990's.

Or a 7970 between 2 7990's.

Use the long bridge connectors available and use the 2 bridges on the middle card to connect to the 1st and 3rd cards.

Your insights would be most..... insightful!!! :lol: 


April 10, 2012 10:36:51 AM

You can't :) . The 7970 and 6970 only support up to quad CrossFire. And even if they did support quint CrossFire, I would not recommend it.
April 10, 2012 3:31:58 PM

refillable said:
You can't :) . The 7970 and 6970 only support up to quad CrossFire. And even if they did support quint CrossFire, I would not recommend it.

Thanks refillable,

With single-slot liquid-cooled 7970's, is there any reason that 5 cards couldn't work on a MB with 5 slots? And if they could, then a 3-card quint config should be possible, shouldn't it??
April 10, 2012 3:50:30 PM

According to AMD (Hover to AMD Crossfire details):

AMD CrossFire™ multi-GPU technology:
Dual, triple or quad-GPU scaling (Note: NO QUINT OR ABOVE!!!)

So even if you successfully build the setup, the fifth card will still be useless.
April 10, 2012 3:54:44 PM

And given the cost versus the improvement in framerate, there would be no feasible reason to do this. It would serve no purpose, unless you are trying to run a farm that processes data on the GPUs, not playing games.

That is the only reason I could see for cramming so many GPUs into a single machine.
April 10, 2012 4:17:32 PM

Adding to po1nted's topic, the 5970 gpu computes better than the 6000 series for things like bitcoin. You would be better off not buying anything for that purpose anyway, since bitcoin has been 'mined out' as of late. You can't turn a profit with it anymore.
April 10, 2012 4:18:37 PM

po1nted said:
And given the cost versus the improvement in framerate, there would be no feasible reason to do this. It would serve no purpose, unless you are trying to run a farm that processes data on the GPUs, not playing games.

That is the only reason I could see for cramming so many GPUs into a single machine.

Well, I was thinking of going 2560p 3x eyefinity portrait mode (I am on 1600p right now). That is 12.2MP, that needs a lot of GPU at max detail!! :bounce: 
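A quick back-of-the-envelope check of that pixel count (assuming three 2560x1600 panels rotated to portrait):

```python
# Three 2560x1600 panels in portrait: each is 1600 wide by 2560 tall.
width, height = 1600, 2560
panels = 3
total_pixels = width * height * panels
print(total_pixels)                  # 12288000
print(round(total_pixels / 1e6, 1))  # 12.3 megapixels
```

So roughly 12.3 MP, in line with the 12.2 MP quoted above.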
April 10, 2012 4:21:46 PM

Go with four!
April 10, 2012 4:23:37 PM

refillable said:
According to AMD (Hover to AMD Crossfire details):

AMD CrossFire™ multi-GPU technology:
Dual, triple or quad-GPU scaling (Note: NO QUINT OR ABOVE!!!)

So even if you successfully build the setup, the fifth card will still be useless.

Sooo the 5th adapter will be shown as 'disabled adapter' in catalyst hardware info??
April 10, 2012 4:26:01 PM

stingstang said:
Adding to po1nted's topic, the 5970 gpu computes better than the 6000 series for things like bitcoin. You would be better off not buying anything for that purpose anyway, since bitcoin has been 'mined out' as of late. You can't turn a profit with it anymore.

Have been sussing it out but knew there were gonna be issues, so never really got on board!
April 10, 2012 4:31:36 PM

Even 4-way is problematic. Unless you love tweaking settings and troubleshooting compatibility before playing every game, it's not worth it.

3 680s would run a 3x2560x1600 quite well. As would 3 7870s 7970s.

Watercooled and overclocked, your machine would be a beast. Currently, only the x79 platform would handle it well. You would need a heavy cpu overclock as well.

total cost would be around:
>$1500 in gpus
$600 + >$250 in cpu and mobo
>$400 in waterblocks
>$300 in pump, res, rad, fans, etc
>$250 in PSU
$200-$500 in the rest of the system
$2400-$3500 in screens.

total of at least $5900. Ouch.
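Taking each ">$x" entry above at its floor, the tally works out like this:

```python
# Low-end sum of the parts list above (">$x" entries taken at their floor).
parts = {
    "gpus": 1500,
    "cpu": 600,
    "mobo": 250,
    "waterblocks": 400,
    "pump/res/rad/fans": 300,
    "psu": 250,
    "rest of system": 200,
    "screens": 2400,
}
print(sum(parts.values()))  # 5900
```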
April 10, 2012 4:46:54 PM

If you absolutely are getting THREE monitors, the best choice is this:

2xGTX680 4GB

(4GB models coming soon. Should add about $50 so $550 would be the cheapest.)

SLI and Crossfire don't add VRAM, it is cloned. If you don't buy cards with more VRAM you will end up with a memory bottleneck at the resolution of three monitors.

That's why the HD7970 has 3GB, it's for Crossfire not single card configurations. NVidia wanted to keep the cost down so the GTX680 has 2GB which is plenty. There are rare scenarios where a little more memory would help such as Total War Shogun 2.

SUMMARY:
- single GTX680 2GB for single monitor high-end gaming
- 2xGTX680 4GB for triple-monitor gaming
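A toy sketch of that mirroring rule (the helper name is just illustrative):

```python
# SLI/CrossFire use alternate-frame rendering, so each GPU holds a full copy
# of the frame data: usable VRAM is the per-card amount, not the sum.
def effective_vram_gb(per_card_gb: float, num_cards: int) -> float:
    del num_cards  # extra cards add rendering power, not memory capacity
    return per_card_gb

print(effective_vram_gb(2, 2))  # 2x GTX 680 2GB  -> 2 GB usable
print(effective_vram_gb(3, 2))  # 2x HD 7970 3GB  -> 3 GB usable
```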
April 10, 2012 5:02:25 PM

^right, forgot the 680 only has a 2GB version out right now.
But two 680/7970 aren't going to play 3x2560x1600 at max settings. And if you're spending a minimum of $2400 on screens, you're clearly not on a budget.
April 10, 2012 5:05:46 PM

slicedtoad said:
Even 4-way is problematic. Unless you love tweaking settings and troubleshooting compatibility before playing every game, it's not worth it.

3 680s would run a 3x2560x1600 quite well. As would 3 7870s.

Watercooled and overclocked, your machine would be a beast. Currently, only the x79 platform would handle it well. You would need a heavy cpu overclock as well.

total cost would be around:
>$1500 in gpus
$600 + >$250 in cpu and mobo
>$400 in waterblocks
>$300 in pump, res, rad, fans, etc
>$250 in PSU
$200-$500 in the rest of the system
$2400-$3500 in screens.

total of at least $5900. Ouch.

About 4k more for my other 2 NEC PA301W screens :o 
Wanted to save costs on that WC setup, so was looking on that 3 card 5 way setup. Have 1.5kw PSU, may need more.
2GB might just be on the lower end. GK110 would have to be it. The 7870 is low spec I think, especially considering the 7970 can't beat the 680.
April 10, 2012 5:11:03 PM

one less card only saves you $100-$200 in WC equipment.

i meant 7970, lol, mistyped that. And as photon pointed out, the 680 will have problems at that resolution due to vram. So 7970 would be the way to go atm.

btw, how serious are you about this?

edit: why are you using color perfect monitors? they are for professional photo/movie editing, not gaming. And they cost a fortune.
April 10, 2012 5:16:03 PM

photonboy said:
If you absolutely are getting THREE monitors, the best choice is this:

2xGTX680 4GB

(4GB models coming soon. Should add about $50 so $550 would be the cheapest.)

SLI and Crossfire don't add VRAM, it is cloned. If you don't buy cards with more VRAM you will end up with a memory bottleneck at the resolution of three monitors.

That's why the HD7970 has 3GB, it's for Crossfire not single card configurations. NVidia wanted to keep the cost down so the GTX680 has 2GB which is plenty. There are rare scenarios where a little more memory would help such as Total War Shogun 2.

SUMMARY:
- single GTX680 2GB for single monitor high-end gaming
- 2xGTX680 4GB for triple-monitor gaming

or the Sapphire Toxic 6GB 7970- another TBR model. 3x cards would still be needed IMO.
April 10, 2012 5:27:30 PM

slicedtoad said:
one less card only saves you $100-$200 in WC equipment.

i meant 7970, lol, mistyped that. And as photon pointed out, the 680 will have problems at that resolution due to vram. So 7970 would be the way to go atm.

btw, how serious are you about this?

edit: why are you using color perfect monitors? they are for professional photo/movie editing, not gaming. And they cost a fortune.

If I lose my mind I might. Then again, you could say that I could afford that 5-way attempt anyway. If the 5-way were to work I would really consider it. Rampage IV and 3960X combo with 32-64GB RAM for the base system.
Outside work, my life is 'digital reality'. Also they had good reviews, so I bought one. Good for RTS, racing, GTA, so no issues for me; I get immersed in the colour perfection of a game's intended colour scheme!! 1 billion colours is great, though not sure if the cards can actually display them??
April 10, 2012 5:29:51 PM

slicedtoad said:
one less card only saves you $100-$200 in WC equipment.

i meant 7970, lol, mistyped that. And as photon pointed out, the 680 will have problems at that resolution due to vram. So 7970 would be the way to go atm.

btw, how serious are you about this?

edit: why are you using color perfect monitors? they are for professional photo/movie editing, not gaming. And they cost a fortune.

Also with 3 card setup I wouldn't need any WC with the right MB.
April 10, 2012 5:37:54 PM

Even if 5 way worked, and if it scaled well, it would be overkill.

4-way would already get you 60fps with max graphics on any game (unless the cpu turns out to be a bottleneck). Even 3way should be able to get you around 60fps.

Damn, just talking about it makes me wish i had 6 grand to blow. Watercooling is too fun.

edit: try using the edit feature when you want to add something to your post, rather than double posting.
April 10, 2012 5:45:12 PM

slicedtoad said:
Even if 5 way worked, and if it scaled well, it would be overkill.

4-way would already get you 60fps with max graphics on any game (unless the cpu turns out to be a bottleneck). Even 3way should be able to get you around 60fps.

Damn, just talking about it makes me wish i had 6 grand to blow. Watercooling is too fun.

edit: try using the edit feature when you want to add something to your post, rather than double posting.

True, point taken on the edit button. I also meant no WC for that 3-card 5-way setup: 6 slots covered, easy fit. Scaling will be a big issue for sure in 5-way :cry: 
April 10, 2012 6:00:33 PM

3 cards is a hot setup, no matter what mobo you use. Especially if you have dual gpu cards in there. But with good ventilation, 3 7970s would be fine on air.
April 10, 2012 6:05:36 PM

slicedtoad said:
3 cards is a hot setup, no matter what mobo you use. Especially if you have dual gpu cards in there. But with good ventilation, 3 7970s would be fine on air.

Have a NZXT Phantom with all fan mounts filled (7 I think). May look at 10 slot case, perhaps not required though.
April 10, 2012 6:19:55 PM

wait, are you actually doing this 3 way setup, or are we still just speculating?

read this article:
http://www.tweaktown.com/articles/4517/sapphire_radeon_...

They're on an open bench and still getting over 80C on the hottest card. Not the end of the world, i suppose. It also gives you an idea of the performance. You can see that some games get bottlenecked by the cpu. This might be different with three screens though.
April 10, 2012 6:29:08 PM

slicedtoad said:
wait, are you actually doing this 3 way setup, or are we still just speculating?

read this article:
http://www.tweaktown.com/articles/4517/sapphire_radeon_...

They're on an open bench and still getting over 80C on the hottest card. Not the end of the world, i suppose. It also gives you an idea of the performance. You can see that some games get bottlenecked by the cpu. This might be different with three screens though.

The only speculation left is whether it is 4-way or 5-way :bounce: 

Will OC the primary GPUs 1 and 2 only; scaling should cover my needs hopefully (in games with a CFX CAP available). Don't mention GTA IV- I hate that useless engine!!! Also thanks for the link, will check it out.

CPU will be OC obviously, otherwise I would be going E5-2687Ws on a Z9PE-D8 WS. If only :fou: 
April 10, 2012 7:02:36 PM

7970s perform better at high res than 680s, up to 5760x1200 (highest resolution I've seen tested).

3x or 4x 7970s would be more than enough for 2560x1600x3 at max details (a single card can push a single 1600p monitor pretty well)
April 10, 2012 7:08:35 PM

LiquidAMD said:
Thanks refillable,

In case of single slot liquid cooled 7970's is there any reason that 5 cards couldn't work on a MB with 5 slots. For whatever reason if they could, then a 3 card quint config should be possible, shouldn't it??


I can sell you my toaster for half the price. Should have the same effect.
April 10, 2012 7:42:36 PM

LiquidAMD said:
The only speculation left is whether it is 4-way or 5-way :bounce: 

Will OC the primary GPUs 1 and 2 only; scaling should cover my needs hopefully (in games with a CFX CAP available). Don't mention GTA IV- I hate that useless engine!!! Also thanks for the link, will check it out.

CPU will be OC obviously, otherwise I would be going E5-2687Ws on a Z9PE-D8 WS. If only :fou: 

Sorry, all clocks need to be the same across the cards. If you oc one, you oc them all. And Xeons would perform worse in games than a stock-clocked i7.

5 way is impossible (I suppose you could theoretically write your own drivers).
4-way is usually only considered for benchmarking since it's a huge pain to actually game with.

3 way is ideal (well, cf is never ideal but at your res i suppose it's necessary)

I'd still recommend wc for that kind of setup if your budget allows. Gaming with 80+ temps isn't ideal. And i can't imagine what it would sound like.
April 10, 2012 8:49:23 PM

Murissokah said:
I can sell you my toaster for half the price. Should have the same effect.

And you would have done exceedingly well for yourself selling me that toaster of yours :lol:  I don't think I have completely caught your drift, mate; do you mean that it will run too hot :??: 
April 10, 2012 8:54:26 PM

^and be useless since quintfire is not supported.
April 10, 2012 9:01:15 PM

slicedtoad said:
Sorry, all clocks need to be the same across the cards. If you oc one, you oc them all. And Xeons would perform worse in games than a stock-clocked i7.

5 way is impossible (I suppose you could theoretically write your own drivers).
4-way is usually only considered for benchmarking since it's a huge pain to actually game with.

3 way is ideal (well, cf is never ideal but at your res i suppose it's necessary)

I'd still recommend wc for that kind of setup if your budget allows. Gaming with 80+ temps isn't ideal. And i can't imagine what it would sound like.

That's why I'm not getting those beauties (I could have a false hope of games being multithreaded someday and using their power :sarcastic: ...). But with only a 200MHz difference maybe not; maybe at stock the Xeon would be faster: 5MB more cache, plus 2 more cores.

Tell me about it. I am on 4 way now with 6870x2s

I am sure you can select custom clocks for your GPUs.

My plan would be to keep GPUs 0 & 1 at higher clocks (applying diminishing returns here): GPU 0 to have the best performance across all games (except where CFX is totally broken and negative scaling occurs);
GPU 1 to be clocked where 2x CFX works (if a 7990, then both 1 & 2 will have the same clocks obviously).
The remaining cards to be stock, as FPS will hopefully be high enough not to matter if scaling works.

:cry:  I can't write drivers
April 10, 2012 9:02:50 PM

slicedtoad said:
^and be useless since quintfire is not supported.

ah.....obviously
April 10, 2012 11:12:40 PM

You're right, you can use different clock speeds. But it's not recommended since it increases micro stutter.
April 10, 2012 11:36:42 PM

slicedtoad said:
You're right, you can use different clock speeds. But it's not recommended since it increases micro stutter.

Point taken, but could the frametimes really be off enough to be visible, just from a clock speed difference? I suppose maybe.
April 11, 2012 2:04:35 AM

If your framerate is much below 60, micro stuttering is visible. Simply because one gpu completes a frame faster than the others. Some people notice and some don't, along with everyone in between.
April 11, 2012 3:46:06 AM

^not entirely true. You can get very smooth frames with crossfire and under 60fps. But yes, the problem is generally thought to be from one GPU rendering faster than another.

It's also been shown that having 3 GPUs generally gets rid of the effect - certainly to a point that you probably wouldn't notice it even at its worst.
April 11, 2012 7:52:09 AM

wolfram23 said:
^not entirely true. You can get very smooth frames with crossfire and under 60fps. But yes, the problem is generally thought to be from one GPU rendering faster than another.

It's also been shown that having 3 GPUs generally gets rid of the effect - certainly to a point that you probably wouldn't notice it even at its worst.


In theory, micro-stutter can easily be eliminated completely. Simply sync the cards so they operate in equal times.

In fact, does micro-stutter still exist if a game is running at 60FPS VSYNC?

With VSYNC disabled a normal, single card simply generates as many frames as possible (regardless of whether the monitor can display them all). With SLI and VSYNC disabled this still holds true. However the cards simply toggle every other frame and one card could take slightly longer than the other due to small differences in the graphics. I would think in VSYNC @ 60FPS the monitor should simply draw each frame in 1/60th of a second.

If anyone has noticeably bad microstutter maybe they can answer this. I may even try to contact NVidia, but the more I think about it the more it seems to me that VSYNC would eliminate microstutter (provided the cards are fast enough to sync at 60FPS and aren't dropping below it.)
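One way to picture the argument above: in alternate-frame rendering, a slow card makes consecutive frame times uneven, and pacing both cards (as VSYNC would) evens them out. A toy sketch; the numbers and helper are illustrative only:

```python
# Toy microstutter model: two GPUs in alternate-frame rendering (AFR), where
# one runs slightly slower, vs. both paced to the same interval. A simple
# stutter metric is the spread (std dev) of consecutive frame times.
from statistics import pstdev

def frame_times_afr(fast_ms: float, slow_ms: float, frames: int):
    # Even frames come from the fast GPU, odd frames from the slow one.
    return [fast_ms if i % 2 == 0 else slow_ms for i in range(frames)]

uneven = frame_times_afr(12.0, 21.0, 10)   # mismatched GPUs
paced = frame_times_afr(16.7, 16.7, 10)    # both synced to ~60Hz

print(pstdev(uneven))  # 4.5  (ms of spread -> visible judder)
print(pstdev(paced))   # 0.0  (even delivery -> smooth)
```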
April 11, 2012 8:10:07 AM

LiquidAMD said:
Well, I was thinking of going 2560p 3x eyefinity portrait mode (I am on 1600p right now). That is 12.2MP, that needs a lot of GPU at max detail!! :bounce: 


FYI, I have a U2711 monitor which is 2560x1440.

I've discovered that most games actually look and run better at 1920x1080. Very few textures, if any, actually make full use of a 1920x1080 screen so scaling to a higher resolution often does nothing for quality. Without anti-aliasing enabled, higher resolutions do look better, however you can actually anti-alias at a higher setting at 1920x1080 and the result is a slightly better picture at a LOWER frame rate.

I've been flamed about the texture thing before with people saying, no they have 1024x1024 or 2048x2048, but they don't really understand what's going on. It's easier to simply point out that if textures were really making full use of a screen's resolution they would be photo-realistic.

*So this is a long way of saying a 2x(GTX680 4GB) setup would likely work best at 5760x1200 rather than 7680x1600.

**It's also worth investigating how many games even optimize for 3xSLI.

(I've tested the 2560x1440 vs 1920x1080 on quite a few games. I'll need to redo this once I get my GTX680 but I suspect that with better AA methods it will lean even more towards 1920x1080 as being the best choice. Actually some games have UI's that don't scale properly to higher resolutions anyway so it's not even an option.)
April 11, 2012 8:37:01 AM

photonboy said:
FYI, I have a U2711 monitor which is 2560x1440.

I've discovered that most games actually look and run better at 1920x1080. Very few textures, if any, actually make full use of a 1920x1080 screen so scaling to a higher resolution often does nothing for quality. Without anti-aliasing enabled, higher resolutions do look better, however you can actually anti-alias at a higher setting at 1920x1080 and the result is a slightly better picture at a LOWER frame rate.

I've been flamed about the texture thing before with people saying, no they have 1024x1024 or 2048x2048, but they don't really understand what's going on. It's easier to simply point out that if textures were really making full use of a screen's resolution they would be photo-realistic.

*So this is a long way of saying a 2x(GTX680 4GB) setup would likely work best at 5760x1200 rather than 7680x1600.

**It's also worth investigating how many games even optimize for 3xSLI.

(I've tested the 2560x1440 vs 1920x1080 on quite a few games. I'll need to redo this once I get my GTX680 but I suspect that with better AA methods it will lean even more towards 1920x1080 as being the best choice. Actually some games have UI's that don't scale properly to higher resolutions anyway so it's not even an option.)

Agreed, texture resolution hardly improves in any game; higher resolution just fills the real estate with more of them. My big problem is 8xAA at 1600p. Stupid 1GB is no good in half the games.

I have a SA950 for 3D, but can only go landscape Eyefinity on that (bezel issues). The other problem is Tridef and Catalyst being a buggy combination, with CFX and Eyefinity 3D together actually working less than half the time :fou: .

A lot of the games will need manual changes to work at 2560p, as in launch options or config files... Such as AOM at 1600p :lol: . But many of the new games have a fair set of res options.
April 11, 2012 9:26:51 AM

LiquidAMD said:
Agreed, texture resolution hardly improves in any game; higher resolution just fills the real estate with more of them. My big problem is 8xAA at 1600p. Stupid 1GB is no good in half the games.

I have a SA950 for 3D, but can only go landscape Eyefinity on that (bezel issues). The other problem is Tridef and Catalyst being a buggy combination, with CFX and Eyefinity 3D together actually working less than half the time :fou: .

A lot of the games will need manual changes to work at 2560p, as in launch options or config files... Such as AOM at 1600p :lol: . But many of the new games have a fair set of res options.


Your games will likely run best at 1920x1200 plus whatever balance of quality settings enables you to run at VSYNC.

Have you seen the performance hit going from that resolution to 2560x1600? Ouch! So then you have to actually LOWER the quality settings or suffer lower frame rates.

The ONLY scenario in which I consider gaming at 2560x1600 is meeting ALL of this criteria:
- achieve 60FPS with ALL quality settings at maximum
- Looks noticeably BETTER than 1920x1080 at max
- no UI scaling issues
April 11, 2012 9:44:21 AM

photonboy said:
Your games will likely run best at 1920x1200 plus whatever balance of quality settings enables you to run at VSYNC.

Have you seen the performance hit going from that resolution to 2560x1600? Ouch! So then you have to actually LOWER the quality settings or suffer lower frame rates.

The ONLY scenario in which I consider gaming at 2560x1600 is meeting ALL of this criteria:
- achieve 60FPS with ALL quality settings at maximum
- Looks noticeably BETTER than 1920x1080 at max
- no UI scaling issues

Can't get better than 40fps on anything; I sit between 20-40 for most games at max. I like 1600p for map coverage mostly. The UI overlay is an issue in some games, in which case I lower the res to what works best. That is why I need to upgrade again.
April 11, 2012 2:20:27 PM

photonboy said:
Very few textures, if any, actually make full use of a 1920x1080 screen so scaling to a higher resolution often does nothing for quality.


What? That doesn't even make sense. Textures never scale, they are what they are. Set them to Ultra and that's it, of course they aren't going to improve. A higher resolution display allows you to see more on screen with a better picture quality. That's it.

April 11, 2012 4:37:03 PM

wolfram23 said:
What? That doesn't even make sense. Textures never scale, they are what they are. Set them to Ultra and that's it, of course they aren't going to improve. A higher resolution display allows you to see more on screen with a better picture quality. That's it.


NO.
I just checked several games including Torchlight and Crysis #1 and every one of them showed EXACTLY the same viewing area, with all objects the same size from 1280x720 up to 2560x1440. In some cases the HUD overlay may change.

There MAY be some games that show more on a single monitor with higher resolution.

TEXTURE SCALING:
Uh, you are actually agreeing with me. I said going from 1920x1080 to 2560x1440 didn't affect the texture quality and you said "of course they aren't going to improve."

2560x1440 vs 1920x1080:
Torchlight improved with each increase in resolution. However, many games don't work this way. Here's an example:

Game A (HIGH)
Setting #1: 1920x1080 @ 8xAA, running 60FPS

Setting #2: 2560x1440 @ 2xAA, running 45FPS

*So to compare:
- viewing area is the same
- texture quality is probably identical
- 8xAA at the lower resolution often looks BETTER than the lower AA offered at the higher resolution
- Frame Rate is LOWER on the 2560x1440.

**So Torchlight might be capable of running at 60FPS and looking better at 2560x1440, however many games take a huge hit scaling to the higher resolution which means the player has to either LOWER the quality to keep frame rates high (i.e. 60FPS) or he keeps the quality high but plays with the lower frame rates.

The user has to investigate this on a per-game basis as it depends on the game and the gaming system, but the fact is that 1920x1080 runs better than 2560x1440 most of the time.
April 11, 2012 5:39:48 PM

Skyrim (if you jump through the right hoops to make it display right) gives a broader field of view at higher resolutions. So do driving games. Most games don't, though.
April 11, 2012 7:02:25 PM

photonboy said:
NO.
I just checked several games including Torchlight and Crysis #1 and every one of them showed EXACTLY the same viewing area, with all objects the same size from 1280x720 up to 2560x1440. In some cases the HUD overlay may change.

There MAY be some games that show more on a single monitor with higher resolution.

TEXTURE SCALING:
Uh, you are actually agreeing with me. I said going from 1920x1080 to 2560x1440 didn't affect the texture quality and you said "of course they aren't going to improve."

2560x1440 vs 1920x1080:
Torchlight improved with each increase in resolution. However, many games don't work this way. Here's an example:

Game A (HIGH)
Setting #1: 1920x1080 @ 8xAA, running 60FPS

Setting #2: 2560x1440 @ 2xAA, running 45FPS

*So to compare:
- viewing area is the same
- texture quality is probably identical
- 8xAA at the lower resolution often looks BETTER than the lower AA offered at the higher resolution
- Frame Rate is LOWER on the 2560x1440.

**So Torchlight might be capable of running at 60FPS and looking better at 2560x1440, however many games take a huge hit scaling to the higher resolution which means the player has to either LOWER the quality to keep frame rates high (i.e. 60FPS) or he keeps the quality high but plays with the lower frame rates.




The way you phrased the texture scaling thing was... weird. I guess I don't know what you mean by "Very few textures, if any, actually make full use of a 1920x1080 screen"

Moving on, yes that makes sense. You increase the resolution so everything becomes crisper on screen as more pixels are showing off the same image. The aspect ratio is the same so you aren't going to zoom out/increase viewing area. Instead, the image becomes more photorealistic - natural aliasing is occurring less. However, with a higher resolution you can change this little thing called Field of View and gain a wider viewing area.

photonboy said:
The user has to investigate this on a per-game basis as it depends on the game and the gaming system, but the fact is that 1920x1080 runs better than 2560x1440 most of the time.


Um... duh? You are increasing the amount of pixels by 77% - obviously it's going to hurt your performance. If you don't have the hardware to push it then yeah, I'd probably agree that 1080p at Ultra looks better than 1440p at Medium/High. I ran 5760x1080 for a while and BF3 ran well at Medium with a few settings at High, whereas I can easily do Ultra on a single 1080p. There were various reasons that I returned the monitors, but performance and image quality were not among them.

Some people with mid range hardware at 1080p displays play their games at 1680x1050 or 720p so that they can use higher settings. That's fine. Personally? I'd rather the high resolution and lower settings - especially if it's just AA that needs to be lowered. I'd also make sure that I have decent hardware to begin with if I want to game with high settings at high resolution; spending $700+ on a monitor and only $200-300 on a GPU is kind of weird. Most people with 1080p displays spent double or triple the monitor's cost on their GPUs if they want max settings.
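The pixel arithmetic behind that figure:

```python
# 2560x1440 vs 1920x1080, as raw pixel counts.
px_1080 = 1920 * 1080           # 2,073,600 pixels
px_1440 = 2560 * 1440           # 3,686,400 pixels
increase = px_1440 / px_1080 - 1
print(f"{increase:.0%}")        # 78%
```

It comes out to roughly 78%, in line with the figure quoted above.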
April 12, 2012 10:55:17 PM

Thanks everyone for your participation.

3960X C2 stepping on its way. Got it for $936.

MB recommendations anyone?

Asus Rampage iv extreme
Asrock Fatal1ty X79 Professional
Evga X79 FTW
Gigabyte ga-x79-ud7

The only MBs I could find with x16/x8/x16 electrical at physical x16 slot positions 1, 3, 5.

any other options with above layout?
April 12, 2012 11:18:46 PM

A lot of people swear by the MSI Big Bang II x79, but it's hard to find.
http://www.newegg.com/Product/Product.aspx?Item=N82E168...

It's x16,x8,x8 in three way. Any reason for needing slot 5 in x16? A pcie 3.0 slot running at x8 won't be a bottleneck for the 7970. At least not for gaming, gpu compute situations are another matter.
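Rough numbers behind that claim, from per-lane transfer rates and encoding overhead (a sketch; real-world throughput is a bit lower):

```python
# Theoretical per-slot bandwidth in GB/s, per the PCIe 2.0/3.0 specs:
# PCIe 2.0 runs 5 GT/s per lane with 8b/10b encoding,
# PCIe 3.0 runs 8 GT/s per lane with 128b/130b encoding.
def pcie_bandwidth_gbps(gen: int, lanes: int) -> float:
    rates = {2: (5.0, 8 / 10), 3: (8.0, 128 / 130)}
    gt_per_s, efficiency = rates[gen]
    return gt_per_s * efficiency * lanes / 8  # bits -> bytes

print(round(pcie_bandwidth_gbps(3, 8), 2))   # 7.88  GB/s (3.0 x8)
print(round(pcie_bandwidth_gbps(3, 16), 2))  # 15.75 GB/s (3.0 x16)
print(round(pcie_bandwidth_gbps(2, 16), 2))  # 8.0   GB/s (2.0 x16)
```

So a PCIe 3.0 x8 slot offers about the same bandwidth as a 2.0 x16 slot, which no single 2012-era GPU saturates in games.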
April 12, 2012 11:28:01 PM


Thanks,

When you look too hard you miss things; can't imagine how I missed the Rampage IV Extreme after having gone through the manual!! 8 DIMM slots as well :D .

Not sure that the WS offers the config I am after.

Slot No. Slot Description
1 PCIe 2.0 x16_1 slot (single at x16 or dual at x8/x8 mode)
2 PCIe 2.0 x16_2 slot (x8 mode)
3 PCIe 2.0 x16_3 slot (x4 mode)
4 PCIe 2.0 x16_4 slot (single at x16 or dual at x8/x8 mode)
5 PCIe 2.0 x16_5 slot (x4 mode)
6 PCIe 2.0 x16_6 slot (x8 mode)

Also added the evga x79 ftw to the list
April 12, 2012 11:30:48 PM

slicedtoad said:
A lot of people swear by the MSI Big Bang II x79, but it's hard to find.
http://www.newegg.com/Product/Product.aspx?Item=N82E168...

It's x16,x8,x8 in three way. Any reason for needing slot 5 in x16? A pcie 3.0 slot running at x8 won't be a bottleneck for the 7970. At least not for gaming, gpu compute situations are another matter.

Still stuck on 5-way exploration??? Really wish the XPower II had that config...