
SLI/Crossfire threading?

Last response: in Graphics & Displays
November 11, 2009 12:09:28 PM

I am currently running an intel Q6600 with a single 8800GT. Most of the games I run right now seem to be CPU clock limited, since the Q6600's clock isn't that great (2.4 GHz). I've gotten some gains from OCing. None of the games use all 4 CPU cores.

I have a friend that has a spare 8800GT that I've thought about using for SLI. I've heard that SLI is easily bottlenecked by the CPU because of additional I/O, but I don't know if the additional CPU work can be offloaded to the idle CPU cores. Does it perhaps depend on the game?

So my question is: How do multi GPU solutions (SLI/Crossfire) scale with CPU cores? Is having a quad core the same as having a dual core?
November 11, 2009 12:18:15 PM

The reason why the games don't use 4 cores is that they are not 64-bit apps. Only when a game or an app is 64-bit will it use multiple cores. Quad cores have advantages in utility applications like video encoding, rendering, etc., but as far as gaming is concerned they give limited benefit, like faster transfers and installs. Considering your CPU and GPU you can SLI easily; the only bottleneck to worry about will be the power supply.
November 11, 2009 12:22:42 PM

ridic23 said:
The reason why the games don't use 4 cores is that they are not 64-bit apps. Only when a game or an app is 64-bit will it use multiple cores. Quad cores have advantages in utility applications like video encoding, rendering, etc., but as far as gaming is concerned they give limited benefit, like faster transfers and installs. Considering your CPU and GPU you can SLI easily; the only bottleneck to worry about will be the power supply.


That's rubbish; the real reason is that most apps haven't been coded to utilise multiple cores. You can get 32-bit apps that work with 8 cores, or 64-bit apps that work with 1. It's all down to the coding.
November 11, 2009 12:25:39 PM

Can you name a 32-bit app that uses all cores, and a 64-bit app that uses only one?
November 11, 2009 12:26:01 PM

Komma said:
I am currently running an intel Q6600 with a single 8800GT. Most of the games I run right now seem to be CPU clock limited, since the Q6600's clock isn't that great (2.4 GHz). I've gotten some gains from OCing. None of the games use all 4 CPU cores.

I have a friend that has a spare 8800GT that I've thought about using for SLI. I've heard that SLI is easily bottlenecked by the CPU because of additional I/O, but I don't know if the additional CPU work can be offloaded to the idle CPU cores. Does it perhaps depend on the game?

So my question is: How do multi GPU solutions (SLI/Crossfire) scale with CPU cores? Is having a quad core the same as having a dual core?


It all depends on the game. Each game uses a fixed number of cores for generating data for the GPU, running the AI, and so on. Most games that are two years old, and some new ones, use only 2 cores; some games take advantage of all cores, but those are mostly recent titles.
As for multi-GPU solutions: because a game uses a fixed number of cores, adding another card for SLI/X-fire will have NO effect on the number of cores used, and the CPU will most likely become the limiting factor unless it runs at over 3 GHz. That's because the GPUs have to wait for the CPU to supply them with data for 3D rendering. More GPUs means more CPU work, and since the number of cores is fixed, the only ways to keep the GPUs from wasting time waiting are to overclock the CPU, or to increase the resolution and game detail (forcing the GPUs to spend more time rendering each image, time in which the CPU can catch up).
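To put rough numbers on that waiting, here's a toy Python model. Every figure is invented for illustration, and it assumes the driver splits GPU work perfectly across cards (ideal alternate-frame rendering, which real SLI never quite achieves):

```python
# Toy model of frame pacing: each frame, the CPU prepares draw data
# before the GPU(s) can render it. Assumes ideal alternate-frame
# rendering, where GPU work splits evenly across cards.
def frame_time_ms(cpu_ms, gpu_ms, num_gpus):
    # CPU work is serial per frame; GPU work divides across the cards.
    return max(cpu_ms, gpu_ms / num_gpus)

# CPU-bound: the CPU needs 20 ms/frame, one GPU needs 15 ms/frame.
print(frame_time_ms(20, 15, 1))   # 20 ms per frame (50 fps)
print(frame_time_ms(20, 15, 2))   # still 20 ms: the second card is wasted
# Raise the detail so one GPU needs 40 ms/frame, and SLI starts to pay off.
print(frame_time_ms(20, 40, 1))   # 40 ms per frame (25 fps)
print(frame_time_ms(20, 40, 2))   # 20 ms per frame: both cards kept busy
```

Until the GPU side has enough work, the CPU sets the frame time and the extra card sits idle; crank the detail and the second card finally earns its keep.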
November 11, 2009 12:39:56 PM

ridic23 said:
Can you name a 32-bit app that uses all cores, and a 64-bit app that uses only one?


Sony Vegas Pro 9 (32-bit) uses 4 cores (maybe more). As for a 64-bit application that uses only 1 core, I could write one in a minute using Visual Studio :kaola: . It all comes down to the coding; I can tell you that from my experience as a .NET programmer.
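For what it's worth, here's a quick Python sketch of the point (the same idea applies in .NET or anything else): core usage is decided by how you split the work, not by whether the binary is 32-bit or 64-bit. This same script runs unchanged under a 32-bit or a 64-bit interpreter and loads every core either way.

```python
import multiprocessing as mp
import struct

def busy_sum(n):
    # CPU-bound work for one worker process
    return sum(range(n))

if __name__ == "__main__":
    # Pointer width says nothing about how many cores we can use.
    print(struct.calcsize("P") * 8, "bit interpreter")
    with mp.Pool() as pool:               # one worker per core by default
        chunks = pool.map(busy_sum, [10**6] * mp.cpu_count())
    print("chunks finished:", len(chunks))
```

And the reverse holds too: leave out the `Pool` and keep everything on one thread, and a 64-bit build will happily peg a single core while the other three sleep.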
November 11, 2009 1:00:11 PM

hallowed_dragon said:
It all depends on the game. Each game uses a fixed number of cores for generating data for the GPU, running the AI, and so on. Most games that are two years old, and some new ones, use only 2 cores; some games take advantage of all cores, but those are mostly recent titles.
As for multi-GPU solutions: because a game uses a fixed number of cores, adding another card for SLI/X-fire will have NO effect on the number of cores used, and the CPU will most likely become the limiting factor unless it runs at over 3 GHz. That's because the GPUs have to wait for the CPU to supply them with data for 3D rendering. More GPUs means more CPU work, and since the number of cores is fixed, the only ways to keep the GPUs from wasting time waiting are to overclock the CPU, or to increase the resolution and game detail (forcing the GPUs to spend more time rendering each image, time in which the CPU can catch up).


That doesn't even make sense...

Say you have one guy named CPU, who can carry 100 lbs of ore to a forge, and no more.
And you have another guy named GPU, who can turn 500 lbs of ore into metal, and no more.
Now give GPU a special "high-res" ore that smelts more slowly, so he can only turn 100 lbs of it into metal.
That still doesn't change the fact that there isn't enough metal to supply the army with "smooth gameplay", because there wasn't enough ore arriving in the first place.


In other words, making the GPU work harder so it doesn't "waste time" does absolutely nothing for performance when the CPU hasn't changed at all. The basic logic just seems... illogical.
Unless you said it specifically so you don't have a "lazy GPU", because you believe he should be working hard even when he doesn't need to :p


But that's assuming any of this matters!
CPUs don't bottleneck GPUs.

It's all determined by the video game software. Some video games lean on the CPU (because they're poorly programmed or archaic in design, like Everquest 2, or are just CPU-intensive, like SupCom), but that has nothing to do with the GPU being "bottlenecked". It means the overall system is bottlenecked by the CPU, not the GPU.

But most video games run perfectly fine even on "crappy" CPUs. It's the GPU which bottlenecks itself, which means there is no bottleneck besides 16x AA or settings that are too high.


The whole "Will my CPU bottleneck my GPU?" question is a myth.
This literally will never happen. Ever.
Your CPU can bottleneck your gaming because the SOFTWARE is crappy, but the GPU has next to nothing to do with the CPU-to-software relationship. GPU:CPU has no more to do with the CPU than DVD:CPU has to do with gaming performance.

To say "The CPU bottlenecks the GPU!" you might as well say "The DVD drive bottlenecks the GPU!" or "My brain is made of candy. SWEET! SWEET! CANDY!!!!!!!!!!!"



Of course, I have no idea what I'm talking about. I won't even be able to talk about CPU performance in gaming until Friday, when I get my new i7 and can compare it to my Core2Duo.
November 11, 2009 1:08:40 PM

Komma said:
I have a friend that has a spare 8800GT that I've thought about using for SLI. I've heard that SLI is easily bottlenecked by the CPU because of additional I/O


What are you even talking about?

This is the equivalent of saying "I've heard that SLI is easily bottlenecked by the CPU because of additional magic gremlins which come to life inside my case while I'm asleep."


SLI/Crossfire is never bottlenecked by the CPU.
It's bottlenecked by the SOFTWARE and its poor performance programming.
November 11, 2009 1:09:38 PM

nitros85 said:
That doesn't even make sense...
[...]
Of course, I have no idea what I'm talking about. I won't even be able to talk about CPU performance in gaming until Friday, when I get my new i7 and can compare it to my Core2Duo.


Understanding CPU/GPU bottlenecks (lame comparisons included)

http://www.viperlair.com/articles/archive/editorials/bottlenecks.shtml
November 11, 2009 1:25:31 PM

hallowed_dragon said:
*From Link* A CPU bottleneck happens when the video card doesn't get enough info from the CPU.


A CPU bottleneck happens when the software gives out more information than the CPU can handle.

It's less about the GPU not getting enough information, as it is about the CPU getting too much information.

I would no less describe to people "CPU bottlenecks the GPU!" as to say to them "CPU bottlenecks the DVD drive!" or "CPU bottlenecks the RAM!"

In reality, when the CPU is overloaded with information by the Software, it will bottleneck EVERYTHING.
Just because the GPU is included doesn't mean it's accurate to say the CPU bottlenecks the GPU.

The most accurate way to say it would be "The CPU bottlenecks itself," or "The software bottlenecks the CPU, which in turn bottlenecks the entire computer," or in layman's terms, "What a shitty program!!!"
November 11, 2009 1:26:50 PM

Woah, this thread got a lot more replies than I thought. Thanks everyone!

I guess I need to clarify a bit, since "bottleneck" really isn't a technical term.

What I've heard: "Multi GPU solutions give additional CPU overhead -> If your CPU is already kept at 100% either by coding or by hardware specs, you only lose performance if you add a GPU."

The ambiguity: What's the overhead? Additional I/O required of the CPU? Can it run on a separate thread? Is it in the game, or in the drivers? There's no way to tell from benchmarks, since I haven't seen anyone benchmark different CPUs on an SLI reference rig.

The situation: My CPU is "bottlenecking" in the sense that either increasing the clock speed, or swapping in a processor with better per-core performance, will give me better performance.

The issue and question: If I get another GPU, one of the following will happen:
A) Increased overhead on the CPU core that is already taxed out, and so performance goes down
B) Additional work can be done on the cores that have nothing to do, and so performance goes up

What I'm understanding from replies so far is "Probably A, but depends on which game".

Is that right?
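To make my A/B split concrete, here's a back-of-the-envelope Python model. Every number is invented for illustration, and it assumes the SLI overhead is one fixed chunk of CPU work per frame:

```python
def frame_ms(game_ms, sli_overhead_ms, gpu_ms, num_gpus, overhead_threaded):
    if overhead_threaded:
        # Scenario B: overhead runs in parallel on an otherwise idle core.
        cpu_ms = max(game_ms, sli_overhead_ms)
    else:
        # Scenario A: overhead piles onto the already-busy core.
        cpu_ms = game_ms + sli_overhead_ms
    # The frame is ready when both the CPU and the GPU(s) are done.
    return max(cpu_ms, gpu_ms / num_gpus)

# Baseline: one card, no SLI overhead, CPU-bound at 20 ms/frame.
print(frame_ms(20, 0, 18, 1, False))   # 20 ms/frame
# A: two cards, 5 ms of overhead lands on the busy core: performance drops.
print(frame_ms(20, 5, 18, 2, False))   # 25 ms/frame
# B: overhead offloaded to an idle core: no loss, but no gain either.
print(frame_ms(20, 5, 18, 2, True))    # 20 ms/frame
```

Notice that even in scenario B the frame rate doesn't improve, because the game's own single-core work still sets the pace; B just avoids making things worse.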
November 11, 2009 1:31:15 PM

Komma said:
Woah, this thread got a lot more replies than I thought. Thanks everyone!
[...]
What I'm understanding from replies so far is "Probably A, but depends on which game".

Is that right?



Your performance won't go down if you add another GPU (unless the specific game is GPU-unfriendly, like an old game such as Everquest 2).

If your CPU is truly what is hindering your gaming, then adding another GPU will do absolutely nothing, good or bad.

If your CPU is at 100% and your GPU is not, then adding a GPU does nothing, because the CPU is still slowing the system down.

For example, if the CPU is at 100% and the GPU at 30%, and you add another graphics card, your CPU is still at 100%, your GPU is at 30%, and your GPU2 is at 0%. Or 15%/15% (still only 30% in total) if the game taxes both cards equally.
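Spelling that utilization example out as arithmetic (numbers invented for illustration):

```python
cpu_fps = 30          # frames/sec the maxed-out CPU can prepare
gpu_max_fps = 100     # frames/sec one card could render if never starved
one_card_util = cpu_fps / gpu_max_fps          # 0.30 -> one card is 30% busy
per_card_util = cpu_fps / (2 * gpu_max_fps)    # 0.15 -> two cards, 15% each
print(one_card_util, per_card_util)            # frame rate stays at cpu_fps
```

The total GPU work per second is unchanged; the second card just lets each card idle more.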



My only question is why you want another GPU when your CPU is so crappy that it bottlenecks your system.
Even a "crappy" new CPU will work wonders in most games. And if all you're upgrading is your CPU/motherboard, you wouldn't have to spend much more than you would on another GPU.