multi gpu vs single gpu

Last response: in Graphics & Displays
March 20, 2008 4:20:06 AM

I was just looking at the 9800 GX2 benchmarks and I couldn't help noticing that it isn't as fast as it could possibly be. Just before the 9800 GX2s were released, the CEO of Nvidia stated that in order to get the best performance from a graphics processing unit, you needed to have one GPU per video card. I don't understand the logic behind that. When AMD came out with their dual-CPU Quad FX platform, performance was inferior to Kentsfield, which had 4 cores in a single processor. Wouldn't the same be true for a GPU? Is there something I am not seeing, or does Nvidia just want to milk people for their money?


March 20, 2008 9:24:50 AM

mikekazik1 said:
I was just looking at the 9800 GX2 benchmarks and I couldn't help noticing that it isn't as fast as it could possibly be. Just before the 9800 GX2s were released, the CEO of Nvidia stated that in order to get the best performance from a graphics processing unit, you needed to have one GPU per video card. I don't understand the logic behind that. When AMD came out with their dual-CPU Quad FX platform, performance was inferior to Kentsfield, which had 4 cores in a single processor. Wouldn't the same be true for a GPU? Is there something I am not seeing, or does Nvidia just want to milk people for their money?


When you say the card isn't running as fast as it could, I'm assuming you mean that a single card runs higher clocks etc.? If that's the case, then my understanding is that they throttle them back to relieve the heat issues you get with two cards running side by side at full tilt. That's what I believe the CEO was getting at: since you can cool a single card better than a dual-card solution, it can run faster. A dual-core CPU is different from a card like the X2, which has two separate chips. Those are still basically a CrossFire setup and as such still rely on driver profiles, which is another reason for arguing that a single card is always better.
Mactronix
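That driver-profile point can be sketched with some rough numbers. The scaling factors below are illustrative assumptions, not measurements from any real game:

```python
# Rough sketch of why SLI/CrossFire performance depends on driver profiles.
# The per-game scaling factors are made-up illustrative values.

def effective_fps(single_gpu_fps, num_gpus, scaling_per_extra_gpu):
    """FPS when each extra GPU contributes only a fraction of a full GPU's work."""
    return single_gpu_fps * (1 + (num_gpus - 1) * scaling_per_extra_gpu)

profiled_game = effective_fps(60, 2, 0.8)    # good driver profile: ~80% scaling
unprofiled_game = effective_fps(60, 2, 0.0)  # no profile: the second GPU idles

print(profiled_game)    # 108.0
print(unprofiled_game)  # 60.0
```

With no profile the second GPU adds nothing, which is why a big single GPU never has that failure mode.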
March 20, 2008 10:57:46 AM

Exactly, you cannot tune/crank everything up full tilt to a GPU's fullest capability. You have to make some concessions to get everything to fit and work together on one card.
March 20, 2008 3:44:20 PM

But you guys still didn't answer my question. Why wouldn't it be easier to cool a dual-core (or even quad-core) GPU? Intel and AMD discovered that it was more efficient to have one processor with multiple cores. Why isn't the GPU industry doing the same thing?
March 20, 2008 3:51:26 PM

mikekazik1 said:
But you guys still didn't answer my question. Why wouldn't it be easier to cool a dual-core (or even quad-core) GPU? Intel and AMD discovered that it was more efficient to have one processor with multiple cores. Why isn't the GPU industry doing the same thing?


It's because Nvidia [whoever has the final decision] is being an A-hole right now; they are fooling themselves into believing something that simply isn't true.

Granted, their products are better clock for clock and stream processor for stream processor compared to ATI, but it seems they might just pull an AMD, which I hope they don't. It's all speculation because I'm just not sure where they're going with the next-gen cards, but time will tell...
March 20, 2008 3:56:06 PM

They're heading in that direction; it's just now that GPUs have had to do this, as we're reaching a point where one die can't be shrunk small enough to compensate for heat and production cost. With CPUs, they hit that mark a year and a half ago or thereabouts. Larrabee and Fusion (Intel and AMD) will sport this new trend, and we'll see an entirely different structure in our graphics solutions, but that's still two years out. So for now we are stuck with wringing out the best from single GPUs, or using dual SLI/CF setups for more performance, though with terrible scaling issues.
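The heat/cost side of this can be sketched with a toy yield model. Every number here (wafer cost and area, defect density, die sizes) is an illustrative assumption, not real fab data:

```python
import math

# Toy Poisson yield model: fraction of good dies = exp(-D0 * die_area).
# All constants below are assumed round numbers for illustration only.

WAFER_COST = 5000.0   # dollars per wafer, assumed
WAFER_AREA = 700.0    # cm^2 of usable area on a 300 mm wafer, rough
D0 = 0.5              # defects per cm^2, assumed

def cost_per_good_die(die_area_cm2):
    """Silicon cost of one working die of the given size."""
    dies_per_wafer = WAFER_AREA // die_area_cm2
    good_dies = dies_per_wafer * math.exp(-D0 * die_area_cm2)
    return WAFER_COST / good_dies

one_big = cost_per_good_die(4.0)        # a single huge 400 mm^2 GPU
two_small = 2 * cost_per_good_die(2.0)  # dual-GPU card: two 200 mm^2 dies

print(round(one_big, 2), round(two_small, 2))
```

Because yield falls off exponentially with die area, two tested-good small dies come out cheaper than one monster die, which is the production-cost argument for multi-GPU cards.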
March 20, 2008 3:56:18 PM

FrozenGpu said:
It's because Nvidia [whoever has the final decision] is being an A-hole right now; they are fooling themselves into believing something that simply isn't true.

Granted, their products are better clock for clock and stream processor for stream processor compared to ATI, but it seems they might just pull an AMD, which I hope they don't. It's all speculation because I'm just not sure where they're going with the next-gen cards, but time will tell...


I understand, but even ATI is doing the same thing. As of now, Nvidia is using the 65nm process for their GPUs; ATI is using the 55nm process. Why wouldn't ATI have incentive to go dual or even quad core? With that architecture, I think it would be well within their reach.
March 20, 2008 3:58:14 PM

JAYDEEJOHN said:
They're heading in that direction; it's just now that GPUs have had to do this, as we're reaching a point where one die can't be shrunk small enough to compensate for heat and production cost. With CPUs, they hit that mark a year and a half ago or thereabouts. Larrabee and Fusion (Intel and AMD) will sport this new trend, and we'll see an entirely different structure in our graphics solutions, but that's still two years out. So for now we are stuck with wringing out the best from single GPUs, or using dual SLI/CF setups for more performance, though with terrible scaling issues.



Thanks, JAYDEEJOHN. That's what I was asking about.
March 20, 2008 3:58:36 PM

I think at 45nm you'll see it, but probably not before.
March 20, 2008 4:00:04 PM

JAYDEEJOHN said:
I think at 45nm you'll see it, but probably not before.


Why the holdup? I understand that they are having trouble shrinking it, but what makes a GPU so much harder to shrink than a CPU?
March 20, 2008 4:02:39 PM

What I'm somewhat concerned about is, with Intel and M$ getting together to work on multithreaded apps/programs et al., I'm hoping the future of graphics isn't influenced by "the way it's meant to be played," Intel style.
March 20, 2008 4:04:59 PM

It's not the smaller process, it's working with multithreaded games, of which there are few, and those that exist are barely doing it. It's a software solution first, so that the hardware can take advantage of it.
March 20, 2008 4:05:54 PM

How would the GPU industry be influenced by something that mostly involves the CPU (e.g. multithreaded apps)?
March 20, 2008 4:10:50 PM

It's all code. Just look at CPUs: how many apps or programs actually use 4 cores? I know that GPUs are parallel, but even so, there are some things that can't be done that way, thus the need for a CPU. Raytracing is an example. No need for a gfx card at all.
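That "how many apps actually use 4 cores" point is basically Amdahl's law. A quick sketch (the parallel fractions are assumed purely for illustration):

```python
# Amdahl's law: speedup from N cores when only part of a program is parallel.

def amdahl_speedup(parallel_fraction, cores):
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

# A game with only 30% parallelizable code barely benefits from a quad core...
print(round(amdahl_speedup(0.30, 4), 2))  # 1.29
# ...while a GPU-style workload that is 99% parallel scales much further.
print(round(amdahl_speedup(0.99, 4), 2))  # 3.88
```

That gap between mostly-serial and mostly-parallel code is why GPUs can keep adding units while most CPU software ignores its extra cores.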
March 20, 2008 4:12:49 PM

JAYDEEJOHN said:
It's all code. Just look at CPUs: how many apps or programs actually use 4 cores? I know that GPUs are parallel, but even so, there are some things that can't be done that way, thus the need for a CPU. Raytracing is an example. No need for a gfx card at all.


But I think that any attempt Intel and Micro$oft make to multithread their apps will encourage the GPU industry to get on its feet and deliver the next generation of cards.
March 20, 2008 4:17:10 PM

That's true, but what if they lean in a completely CPU direction, like raytracing, forgetting some good things such as tessellation? If you're going to lead the way, why not go in the direction where you're already set?
March 20, 2008 4:30:45 PM

Err... too many people in this thread do not seem to understand how a GPU works. For argument's sake, a GPU is already a multicore CPU: each ROP, TMU, and SPU is essentially another core. You do not need to add multiple GPU cores to make it more powerful. Simply increasing the ROP, TMU, and SPU counts would yield a greater result, as you would not have to deal with the issues related to SLI and multi-GPU setups in general. If they were to make a true "9800" series card that simply had the combined 256 SPUs, 32 ROPs, and 128 TMUs, with a 1GB memory pool and a 512-bit bus, we'd see a much faster card than what Nvidia is currently releasing.
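For a back-of-envelope check on that hypothetical combined card, you can multiply the unit counts by clock speeds. The clocks below are assumed, roughly G92-era values, not a spec for any real product:

```python
# Back-of-envelope throughput for the hypothetical "combined" 9800-class card
# described above. Clock speeds are assumed, roughly G92-era numbers.

CORE_CLOCK_MHZ = 600       # assumed ROP/TMU clock
MEM_CLOCK_EFF_MHZ = 2000   # assumed effective GDDR3 data rate

rops, tmus, bus_bits = 32, 128, 512

pixel_fill_gps = rops * CORE_CLOCK_MHZ / 1000              # gigapixels/s
texel_fill_gts = tmus * CORE_CLOCK_MHZ / 1000              # gigatexels/s
bandwidth_gbs = (bus_bits / 8) * MEM_CLOCK_EFF_MHZ / 1000  # GB/s

print(pixel_fill_gps)  # 19.2
print(texel_fill_gts)  # 76.8
print(bandwidth_gbs)   # 128.0
```

On these assumed clocks, all of that throughput would be available to one scheduler on one die, with none of it gated behind SLI driver profiles.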
March 20, 2008 4:36:02 PM

True, but then how large would that die be?
March 20, 2008 4:44:58 PM

My point is, we are now reaching the limits of how many transistors we can put on a die that can carry everything you said and still be cool enough and efficient enough, and linking smaller GPU cores solves the heat/production cost issues. The OP was asking why we don't do this yet. I'm just telling him that we will eventually, and it's already in the wind.
March 20, 2008 7:09:39 PM

JAYDEEJOHN said:
My point is, we are now reaching the limits of how many transistors we can put on a die that can carry everything you said and still be cool enough and efficient enough, and linking smaller GPU cores solves the heat/production cost issues. The OP was asking why we don't do this yet. I'm just telling him that we will eventually, and it's already in the wind.


Yeah, that, or with the die shrinks getting smaller, they might simply be able to fit it all on one die, kind of like what heyyou27 was talking about.

And besides, CPUs are not like GPUs, so making them into duals or quads is not necessary and isn't feasible at this point, or, dee de dee, they would be doing it right now...

The most cost-effective solution for each tier of the consumer market will be addressed in this manner, and in this manner only!

So, NO SOUP FOR YOU!!!

March 20, 2008 8:54:19 PM

JAYDEEJOHN said:
They're heading in that direction; it's just now that GPUs have had to do this, as we're reaching a point where one die can't be shrunk small enough to compensate for heat and production cost. With CPUs, they hit that mark a year and a half ago or thereabouts. Larrabee and Fusion (Intel and AMD) will sport this new trend, and we'll see an entirely different structure in our graphics solutions, but that's still two years out. So for now we are stuck with wringing out the best from single GPUs, or using dual SLI/CF setups for more performance, though with terrible scaling issues.

OK, OK. But I did say two years out, allowing for smaller processes/die shrinks. Now where's my soup?
March 20, 2008 10:44:14 PM

No... the OP doesn't get any soup... :lol:
March 20, 2008 10:45:12 PM

LOL so sad.....