
SLI at 8x vs 16x

Last response in Graphics & Displays
May 20, 2007 6:56:44 AM

Hi All,

Just curious, as I'm thinking of buying a second 8800GTX for SLI in the next month.

What kind of performance difference is there between a motherboard that runs the PCIe slots at 8x and one that runs them at 16x?

The reason I ask is that I would rather spend half the money and buy a cheaper board if there is no major difference. In New Zealand a good 8x SLI motherboard from Asus will cost me around $200, while a fully fledged 16x board will cost at least $500.

Can someone lend some guidance on this?


May 20, 2007 7:25:57 AM

Less than 5%.

Bandwidth is not really important for graphics cards. The idea behind fast interconnects like AGP and PCI-E was to let a card with a small amount of VRAM use ordinary RAM from the computer.

So bandwidth is no problem as long as you have enough VRAM.
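For context (these numbers are mine, not from the thread): a PCIe 1.x lane carries roughly 250 MB/s in each direction after encoding overhead, so the two slot widths work out like this:

```python
# Back-of-the-envelope PCIe 1.x slot bandwidth.
# Assumption: ~250 MB/s per lane per direction (after 8b/10b encoding).
PER_LANE_MB_S = 250

def slot_bandwidth_gb_s(lanes):
    """One-direction bandwidth of a PCIe 1.x slot, in GB/s."""
    return lanes * PER_LANE_MB_S / 1000

print(f"x8:  {slot_bandwidth_gb_s(8):.1f} GB/s")   # 2.0
print(f"x16: {slot_bandwidth_gb_s(16):.1f} GB/s")  # 4.0
```

So an 8x slot still offers AGP-8x-class bandwidth, which helps explain why the measured gap between 8x and 16x is usually small.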

As a rough guess:
128 MB for a 0.7 megapixel resolution
256 MB for 1.6 megapixels
512 MB for 2 megapixels
1024 MB for an optimal 4 megapixels

There are no games that use 1024 MB even at 4 megapixels; 512 MB is fine.
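Those guesses line up loosely with a quick render-target arithmetic sketch (my assumptions: 32-bit color, three buffer surfaces; textures, which actually dominate VRAM use, are ignored here):

```python
# Rough render-target footprint at a given resolution.
# Assumption: 4 bytes/pixel, front + back + depth buffers (3 surfaces).
def render_target_mb(width, height, surfaces=3, bytes_per_pixel=4):
    return width * height * surfaces * bytes_per_pixel / 2**20

for w, h in [(1024, 768), (1600, 1200), (2560, 1600)]:
    mp = w * h / 1e6
    print(f"{w}x{h} ({mp:.1f} MP): ~{render_target_mb(w, h):.0f} MB")
```

The bare buffers are small; the headroom in those VRAM tiers is what textures and higher-resolution assets eat into.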
May 20, 2007 8:50:33 AM

Quote:
Less than 5%.


Actually, in DX10 games at very high resolutions there could be up to a 20% difference with the 8800s.
May 20, 2007 8:53:10 AM

Thanks for the reply, Shompa.

Good to know I don't have to save up for one of these overpriced mobos.

@Track - is there anything to substantiate this? I've been too busy to look for any DX10 benchmarks. What feature of DX10 would cause bandwidth to take on more significance?
May 20, 2007 11:45:03 AM

Quote:

@Track - is there anything to substantiate this? I've been too busy to look for any DX10 benchmarks. What feature of DX10 would cause bandwidth to take on more significance?


There's always substantial evidence for what I say, but I don't seem to remember which review it's from.

And it's not a feature of DX10; it's got nothing to do with Direct3D.
It's the 8800s and their incredible performance which requires more bandwidth, possibly having to do with the shader architecture.
May 20, 2007 12:54:51 PM

To piggyback on what Track is saying a little bit: I remember reading (I think in MaximumPC?) about the difference when I was trying to decide between the 650 and 680 series mobos, since one of the noticeable differences was 2x8 vs. 2x16. The article said that right now there isn't anything that would push the slots hard enough to need more than 8x. It did say, though, that future DX10 programming will push everything past that and require it, due to Vista dumping so much onto the GPU as opposed to the CPU, plus the additional DX10 features and architecture.

What Track is saying is also backed up, I guess, by Nvidia's claims that the newer G84 and G86 architecture can process HDCP, etc. with a smaller 128-bit bus, as opposed to the beefier (but older architecture) G80s.

Well, this all made sense in my head; hopefully it does typed out too, lol.
May 20, 2007 1:07:17 PM

Well, the 8600s don't need a 16x slot because they are much less powerful.

Also, it doesn't have anything to do with Vista.
May 20, 2007 1:44:02 PM

How would it not have anything to do with Vista? You may be right, but I don't understand how, because one of the major overhauls of Vista is offloading much of the work from the CPU to the GPU - and not just for Aero, because even the basic requirements call for a decent GPU. Am I missing something?

And yes, the 8600s and lower are less powerful - that was part of the explanation I figured wouldn't need to be said. BUT they are "supposed" to be able to do a lot with less bandwidth, which is where they are so far falling very short. I was very disappointed, since I was waiting for those to come out and then ended up just getting the 8800GTS 320.
May 20, 2007 3:51:34 PM

Vista does not offload anything to the GPU. The GPU has its own things to do and the CPU has its own. Vista does ask more of the GPU when not gaming, but it does not make the GPU perform any of the tasks the CPU is supposed to be performing.

And I really don't understand what you're saying about the 8600 GT. It has less memory bandwidth, yes. It needs less PCIe slot bandwidth, yes.
May 20, 2007 5:22:40 PM

So you can afford two 8800GTXs, but you can't afford a decent motherboard with dual 16x slots? Can't you find anything like this in New Zealand?
http://www.newegg.com/Product/Product.aspx?Item=N82E168...

In all likelihood it won't make a huge difference to performance, but I figure if you're going to spend $1200 USD on video cards, you can afford the right motherboard.
May 20, 2007 5:56:53 PM

@Heyyou27: LOL mate, that was exactly the model I was looking at. It seems to be the best for the price.

@Track: I guess if it's going to make a bigger performance difference in the long run, I could always upgrade later, especially since there will likely be newer and better processors out that will require an upgrade (e.g. Penryn).

For now, though, I would rather save myself a few hundred, seeing as I don't get the money for this kind of upgrade very often.
May 20, 2007 6:20:39 PM

Quote:
@Heyyou27: LOL mate, that was exactly the model I was looking at. It seems to be the best for the price.

@Track: I guess if it's going to make a bigger performance difference in the long run, I could always upgrade later, especially since there will likely be newer and better processors out that will require an upgrade (e.g. Penryn).

For now, though, I would rather save myself a few hundred, seeing as I don't get the money for this kind of upgrade very often.
Yeah, I see, it's not that big of a deal. What kind of monitor are you playing on, anyway? I've got a single 8800GTX @ 633MHz/2050MHz and I play at 1680x1050. If you're not playing at a resolution of 1920x1200 or higher, you're likely wasting money buying a second 8800GTX.
May 21, 2007 6:07:19 AM

I'm in exactly the same position as you, mate: 1680x1050 with a single GTX. My monitor is a 22" widescreen.

I started up Quake 4 the other day - I play at Ultra settings with 2xAA and 4xAF. I was quite disappointed that FPS could drop below 60 at those settings - my old 7950GX2 could do better :cry: .

I thought getting a second GTX could solve all my performance woes... but now I'm not so sure. The other concern I have is how well they will do in DirectX 10, as these newer games are very close at hand.

Maybe it's better to wait for the second generation of DirectX 10 cards, which should perform a lot better. In my experience, games like Crysis are made with the next generation of hardware in mind, so that they evolve into their prime. If history repeats itself (I'm thinking of Oblivion), the 8800s will most likely struggle quite frequently.

What are your thoughts on this?
May 21, 2007 8:15:54 AM

Your 8800 GTX can't do Quake 4 @ 1680x1050 with 2xAA?

Uh... do you have an Athlon XP? You should be getting 60+ at 1920x1200 with 4xAA.
May 21, 2007 9:19:47 AM

Quote:
Your 8800 GTX can't do Quake 4 @ 1680x1050 with 2xAA?

Uh... do you have an Athlon XP? You should be getting 60+ at 1920x1200 with 4xAA.


Alas, if only I could get that kind of performance... what a dream. That's why I got this card in the first place, lol. My specs are:

Windows XP Pro SP2
Intel Core 2 Duo E6600 @ 2.93 GHz
EVGA 8800GTX 768MB 630/2060 with ForceWare 158.22
2GB DDR2 667
Creative Audigy 2 ZS
Gigabyte GA-965P-DQ6
Enermax Galaxy 1000W
Antec 900

It's quite a good system, but Quake 4 seems to croak. For example, at 1680x1050 with Ultra High settings, changing the AA from 2x to 4x will yield drops to 40fps in many places.

I am really not sure what is holding this system back. I've done all the basics, e.g. closing all background programs, virus/spyware checking, defragging, etc.
May 21, 2007 12:19:36 PM

Did you enable multi-CPU in the advanced settings? Maybe that will help.
I just ran Quake 4 on my new system and it never drops below 60 (can't disable v-sync, though) on Ultra at 1680x1050.

Specs are:
E6700 @ 3.333
G.Skill @ 667 4-4-4-12-1T
P5N32-E SLI Plus
EN8800GTX (default clocks)
Creative X-Fi Fatal1ty
Galaxy 1000
Thermaltake Shark

Like I said, absolutely no slowdowns, but because of the v-sync thing I can't see max FPS.

BTW, I use the same driver version, so no difference there.
May 21, 2007 12:45:36 PM

Quote:
Your 8800 GTX can't do Quake 4 @ 1680x1050 with 2xAA?

Uh... do you have an Athlon XP? You should be getting 60+ at 1920x1200 with 4xAA.


Alas, if only I could get that kind of performance... what a dream. That's why I got this card in the first place, lol. My specs are:

Windows XP Pro SP2
Intel Core 2 Duo E6600 @ 2.93 GHz
EVGA 8800GTX 768MB 630/2060 with ForceWare 158.22
2GB DDR2 667
Creative Audigy 2 ZS
Gigabyte GA-965P-DQ6
Enermax Galaxy 1000W
Antec 900

It's quite a good system, but Quake 4 seems to croak. For example, at 1680x1050 with Ultra High settings, changing the AA from 2x to 4x will yield drops to 40fps in many places.

I am really not sure what is holding this system back. I've done all the basics, e.g. closing all background programs, virus/spyware checking, defragging, etc.

Dude, that's the craziest thing I've ever heard. What about drivers?
May 21, 2007 1:12:15 PM

Quote:
I'm in exactly the same position as you, mate: 1680x1050 with a single GTX. My monitor is a 22" widescreen.

I started up Quake 4 the other day - I play at Ultra settings with 2xAA and 4xAF. I was quite disappointed that FPS could drop below 60 at those settings - my old 7950GX2 could do better :cry: .

I thought getting a second GTX could solve all my performance woes... but now I'm not so sure. The other concern I have is how well they will do in DirectX 10, as these newer games are very close at hand.

Maybe it's better to wait for the second generation of DirectX 10 cards, which should perform a lot better. In my experience, games like Crysis are made with the next generation of hardware in mind, so that they evolve into their prime. If history repeats itself (I'm thinking of Oblivion), the 8800s will most likely struggle quite frequently.

What are your thoughts on this?
There's definitely something wrong with your system. Did you use Driver Cleaner before installing your 8800GTX? I was using 158.22 until I noticed it was causing artifacts in Oblivion; they're not there in 97.92.
May 22, 2007 4:06:10 AM

Quote:
Did you enable multi-CPU in the advanced settings? Maybe that will help.
I just ran Quake 4 on my new system and it never drops below 60 (can't disable v-sync, though) on Ultra at 1680x1050.


Mine should be able to do that too, then. Were you running at 4xAA? I know Quake 4 performance drops quickly with AA enabled.

I am using 158.22 and I always use Driver Cleaner before installing new drivers. In other games this card performs really well - it's just games using the Doom 3 engine that get picky with 4x enabled.

I have the current drivers for my mobo as far as I'm aware. I don't know if Gigabyte has drivers on their website.
May 22, 2007 9:48:49 AM

I've been doing a little more testing, forcing AF and AA from the driver panel as opposed to in-game. At 16xAF and 8xQ AA and above, I am getting slowdowns in heavy areas with lots of enemies; the framerate will drop to about 45 to 50 fps. I had to do some digging in the driver panel to find the right settings - it appears that in Dutch, antialiasing is called "interferentie".
What's up with not being able to set it in-game, though? Is that a combination thing, like Oblivion with its HDR settings, or what?
Anyway, there go my bragging rights :D
May 23, 2007 11:18:04 AM

Quote:
I've been doing a little more testing, forcing AF and AA from the driver panel as opposed to in-game. At 16xAF and 8xQ AA and above, I am getting slowdowns in heavy areas with lots of enemies.


Can you try seeing if you get those drops at Ultra High settings with 4xAA? That's where I get them. It seems that Quake 4's performance takes a nosedive with AA enabled. Quite disappointing...
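One plausible reason AA bites so hard (a sketch of mine, not something from the thread): with MSAA the card stores N samples per pixel for both the color and depth buffers, so the buffer footprint and the bandwidth to fill it scale with the sample count.

```python
# Rough MSAA buffer footprint. Assumptions: 4 bytes per sample,
# color + depth = 2 surfaces, no compression (real GPUs do compress,
# so treat this as an upper bound).
def msaa_buffers_mb(width, height, samples, bytes_per_sample=4, surfaces=2):
    return width * height * samples * bytes_per_sample * surfaces / 2**20

print(f"1680x1050, no AA: ~{msaa_buffers_mb(1680, 1050, 1):.0f} MB")
print(f"1680x1050, 4xAA:  ~{msaa_buffers_mb(1680, 1050, 4):.0f} MB")
```

Going from no AA to 4xAA roughly quadruples the buffer traffic, which is consistent with framerates falling off a cliff when you bump the AA level.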
May 23, 2007 4:32:19 PM

Hi, I did some more testing, fiddling with settings, and settled for:

Control panel driver overrides:
Force 16xAF
Force 4xAA
Mipmaps: none
No transparency AA (SS or MS)

Quake 4 settings:
Ultra quality
1680x1050
V-sync on (improves IQ, removes banding)

With these settings I have the best image quality and absolutely no slowdowns, even in heavy areas with a lot going on onscreen. There do seem to be some hard drive issues every now and then where I get (very short) freezes right before things are about to go down, but I haven't really looked into that yet.

Could that have something to do with my new mobo's SATA controller? I've noticed that after the rebuild, my hard drive has slowed down during Diskeeper defrags. It's just the game disk, though; the drive where Windows resides is still as fast as it was. I will have to look into that some more. Framerates don't go down according to FRAPS, so it's really a sort of freeze or stutter brought on by something else.

Anyway... I think you should be able to do 4xAA and 16xAF. If not, check your Nvidia control panel to see if there are any settings that put extra strain on the card, like transparency AA and trilinear filtering and such. I've disabled those and did not notice any degraded quality. My eyes do suck, though 8O
May 23, 2007 10:03:48 PM

Hi,

Thanks for the help. Sadly, these are the results I get when using 4xAA from the in-game settings menu:

[screenshot]

The quality is nice, but I was only getting 44 fps. I tried forcing 4xAA through the control panel and turning it off in-game, but for some reason it wasn't forcing 4x properly. I might try reinstalling the drivers. Which drivers are you using? I am currently using 158.22.

Also, what's your current version of Quake 4? At the moment I'm using 1.3, and maybe that has something to do with it?
May 23, 2007 10:42:48 PM

Same driver and Quake version. Although I did download two sets of version 158.22: the first one wasn't WHQL certified, while the second was.
I haven't noticed performance differences in F.E.A.R. or Half-Life 2, but I didn't have Quake installed at the time of the first driver install.
I have to say, though, that half the time I can't even see the difference between AA on and off unless it's a really crude game engine. Fortunately the OpenGL engines are pretty slick already, though it's kind of disappointing to see a game this old not being able to run with full options on these rather expensive cards! :?
May 24, 2007 2:01:52 AM

Yeah, I was really disappointed that the 8800GTX struggles with Quake 4 with AA at 4x. For me, I notice a big difference in quality between 2x and 4x. Oh well, I guess maybe SLI is the only way... it annoys me that I should have to do that, though.

But again, I don't think it's worth it. With DX10 games so close, I'm sure something more suitable will debut soon.