Gigabyte P67 UD7 question! Please help?

binoyski

I'm thinking of buying this mobo, but I want to be sure that if I put the Asus Xonar Xense (PCIe x1) in PCIe slot #4, the two GTX 580s I'm also planning to buy will go in PCIe slots #1 and #3 and still run at x16 speed.

My question is: when I use the sound card in slot #4, will the GPUs revert to x8 bandwidth? Please help if you have the same board and are running two GPUs plus a sound card!

Thanks guys!
 

binoyski

@Leaps-from-Shadows

Even at very high resolutions, like 5760x1080? Do you mean x8 speed will make no difference compared to x16 speed?
Forgot to mention I'm also planning to buy three 24" monitors.
 
PCIe x1 sound cards will work in any PCIe slot.
I wouldn't be concerned about which slot you use for the sound card.
As a matter of fact, my Auzentech Forte (PCIe x1) has been plugged into different slots on my P55-UD5 and I didn't notice any difference.
I see Gigabyte still provides a mostly useless PCIe x1 slot like the one on my mobo.
I think you'd be fine using the second PCIe slot for your sound card.
I'm not sure about your original question regarding the other PCIe slot, though.
 
Two comparisons for you:

x16/x16 versus x16/x8
No performance differences here.

x16/x16 versus x8/x8
Slight performance differences here, in one game only (BFBC2 at 2x AA 16x AF). And yes, that was at multi-monitor 5760x1200 resolution. While the graph showed the difference, they couldn't tell while playing the game. With your 580s the difference would be even less noticeable.

However, I don't know whether the slots would actually drop to x8/x8 in that configuration anyway. The easiest way to find out would be to download the manual and read that section.
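
If you'd rather verify on the running system than take the manual's word for it, here's a rough sketch of how you could read the negotiated link width from the driver using a bit of Python wrapping nvidia-smi (this assumes a driver/tool version that exposes these query fields; some older cards or drivers may just report N/A):

```python
# Sketch: ask nvidia-smi for each GPU's current vs. maximum PCIe link width.
# Assumes the NVIDIA driver and nvidia-smi are installed and support these fields.
import subprocess

result = subprocess.run(
    ["nvidia-smi",
     "--query-gpu=index,name,pcie.link.width.current,pcie.link.width.max",
     "--format=csv"],
    capture_output=True, text=True, check=True)

# One CSV row per GPU, e.g. "0, GeForce GTX 580, 16, 16";
# an 8 in the "current" column means that slot has dropped to x8.
print(result.stdout)
```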
 
Duh, I see this post all of the time. The answer is that it depends entirely on your setup: the GPUs and the resolution of your monitor(s).

There's no way I'm sticking a pair of GTX 580s on the equivalent of a P55 mobo. The CPU is fine, and so is the i7 950; the CPU is N-O-T the bottleneck. Even the 'pseudo' lanes of the NF200 still effectively yield x8/x8.

If the OP is using a single HD monitor, then you really need to evaluate what game and what AA; I prefer either 8xAA or 16xAA. A single GTX 580 running Crysis on a 1920x1080 monitor will yield 55~60 FPS, COD 115~120 FPS, etc.

Therefore, the only real benefit of SLI GTX 580s is running three monitors at ~5900x1080 (with bezel correction) or a single 30" 2560x1600. In EITHER CASE you lose frame rates at x8/x8, period.
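
For anyone wondering where the ~5900 figure comes from: it's just three 1920-wide panels plus some bezel compensation. A quick sketch of the arithmetic (the per-bezel pixel count below is only an assumed example; the real value depends on the monitors):

```python
# Bezel-corrected surround width: three 1920-wide panels plus "hidden"
# pixels for each of the two bezel gaps so geometry lines up across screens.
panels = 3
panel_width = 1920
bezel_px = 70  # assumed pixels per bezel gap; set this for your monitors

width = panels * panel_width + (panels - 1) * bezel_px
print(f"{width} x 1080")  # 5900 x 1080 with this assumption
```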

Solution: wait for the 'X68' LGA 2011 platform, or build off X58 with x16/x16 now; at Micro Center you can get an i7 950 for $199 (walk-in pricing only).

Building Chart:
Build_Chart_Q1-2011-1.jpg
 

binoyski

@ jaquith

Thanks man! Finally an honest answer from someone! And yeah, I think I'll just wait for X68. Trust me, before the launch of Sandy Bridge I debated hard about building a rig on it. I have a budget of $5000+ and will add more while waiting for X68. I've been gaming on one monitor from the start; I just want to experience Eyefinity or Surround!

But jaquith, what monitors should I go for: three 120 Hz Alienware OptX in landscape, or three Dell U24s that I can game with in portrait, which I seem to be in love with? Can the OptX do portrait? The BenQ XL2410T has a stupid bezel with the OSD controls sticking out under the monitor; that should have been squared off. And I just hope the two GPU giants each release a top-of-the-line GPU around the time X68 launches.

And where did you find that chart?
 
"Finally an honest answer"!?!?

You offend me... :(

My answer was both honest and correct ... the theoretical impact of x8/x8 (a 50% bandwidth loss) is MUCH greater than the actual impact (less than a 10% FPS loss). You likely wouldn't notice that unless the FPS is right around 30 or so. At 40 FPS or higher you wouldn't notice a 10% loss.
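
To put rough numbers on that (simple arithmetic only, assuming a flat ~10% worst-case hit; this is an assumption, not a benchmark):

```python
# Illustration: what a ~10% FPS loss looks like at different base frame rates.
for base_fps in (30, 40, 60, 90):
    with_loss = base_fps * 0.90  # assumed ~10% worst-case x8/x8 penalty
    print(f"{base_fps} FPS -> {with_loss:.0f} FPS")
# 30 -> 27 is where you'd start to feel it; 60 -> 54 still plays smoothly.
```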
 

binoyski



OK, I did not mean to offend you, sir! But since you mentioned "less than 10% loss", let me explain why that is unacceptable to me. I'm about to spend $1000+ on two 580s that, running at dual x8 instead of x16, will take that 10% hit. So what if, instead of two 580s, I buy two 570s/6950s/6970s and put the money I save toward better speakers for my rig? Will that 10% loss you say I wouldn't notice still be there with those alternative cards at that resolution?

I just want a BIG BANG for my BUCK! I've worked truly hard for that money. Imagine a man in his late 20s with a crappy 9-to-5 job who spends his weekends mowing lawns and doing other odd jobs; right now I'm shoveling snow for other people because of the weather. I've saved that money over two "brutal" years. As I said, "please", a "BANG for my BUCK!" And I want to game on three monitors!
 
Currently I run three 120 Hz Acer GD235HZbid monitors for nVidia Surround/3D Vision: http://www.newegg.com/Product/Product.aspx?Item=N82E16824009222 They are, like any 120 Hz gaming monitor, expensive but very cool! For more info -> http://www.nvidia.com/object/3d-vision-home-users.html The Acers have been good to me, and any 120 Hz monitor should work fine with '3D Vision'; I like the responsiveness and the lack of motion blur and frame chop. I know Dell/Alienware makes good monitors, and the brief reviews I've seen list pros and cons like anything else. A lot of people remove the OEM stands, so landscape/portrait may not matter; check the specs.

Frankly, to me bigger is always 'better'; if it looks good to my eyes, that's all that matters, since I don't have a built-in calibrator in my head. Meaning, I envy those who can get a good 27" 120 Hz monitor, which should be released soon, e.g. the Acer HN274H.

Videos:
120Hz Monitor - http://www.youtube.com/watch?v=6CML9GaMSdg
nVidia Surround/3D Vision - http://www.youtube.com/watch?v=Xru6i0aIi24
Monitor stand opt/example - http://www.youtube.com/watch?v=vXXSqce04fE

AMD and nVidia will be at each other's throats until one company is dead. I made the chart myself after reviewing several overlapping benchmark results.