Why doesn't multi-GPU scale in every game?

Tech_TTT

Notable
Apr 4, 2017
532
0
1,060
Hi,

I need to understand why they don't design multi-GPU rendering to scale with every possible game by simply dividing the screen into two parts and drawing each part on a different GPU?

Theoretically this would give exactly perfect scaling ... and it would also allow unlimited scaling: the more GPUs you add, the more performance you get, by splitting the resolution across the number of GPUs?

Given that the CPU is not a bottleneck, why is this approach never used?
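Roughly what I picture, as a pseudocode sketch (all names here are made up, not any real graphics API):

```python
# Hypothetical sketch of the idea: split the frame into equal vertical
# strips and give each GPU its own strip. Gpu and render_region are
# made-up names, not a real API.

WIDTH, HEIGHT = 1920, 1080

class Gpu:
    def __init__(self, name):
        self.name = name

    def render_region(self, scene, x0, x1):
        # Imagine this rasterizes only the pixels with x0 <= x < x1.
        print(f"{self.name}: rendering columns {x0}..{x1 - 1}")

def render_frame(scene, gpus):
    # One equal strip per GPU -- in theory, N GPUs = N times the speed.
    strip = WIDTH // len(gpus)
    for i, gpu in enumerate(gpus):
        gpu.render_region(scene, i * strip, (i + 1) * strip)

render_frame(scene={}, gpus=[Gpu("GPU A"), Gpu("GPU B")])
```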
 
Solution


3dfx implemented SLI vaguely like that. Each GPU did half of the scan lines, but there were problems: image tearing was common, and even then there wasn't a full 100% increase in performance. Right now, due to software and hardware limitations, there is no way to implement a multi-GPU configuration that gets double the performance of a single GPU in all titles.
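A minimal sketch of that scan-line interleaving scheme (illustrative names only, not the actual 3dfx hardware logic):

```python
# Minimal sketch of 3dfx-style scan-line interleaving: with two GPUs,
# one takes the even scan lines and the other the odd ones. The
# round-robin assignment here is illustrative only.

HEIGHT = 480

def assign_scanlines(num_gpus):
    """Map each scan line to a GPU index in round-robin order."""
    return {y: y % num_gpus for y in range(HEIGHT)}

lines = assign_scanlines(num_gpus=2)
print([lines[y] for y in range(6)])  # [0, 1, 0, 1, 0, 1]
```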

Rogue Leader

It's a trap!
Moderator
Without getting too technical, that's just not how it works. The second card works off the first card, and that's somewhat of a bottleneck, which is why some games get a 50% boost, some 80%, but never double. Secondly, the game (or the drivers) needs to support it, so you have some losses in that software abstraction layer.

Needless to say, the main reason is that the fastest way to process the data is to be on the same silicon; ANYTHING introduced in between is going to be a bottleneck.
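A rough way to put numbers on it (the 10% overhead figure below is just an assumption for illustration):

```python
# Back-of-the-envelope model: any fixed fraction of frame time spent on
# cross-card synchronization and transfer caps the speedup (Amdahl's law).
# The 10% overhead figure is assumed for illustration, not measured.

def speedup(num_gpus, overhead_fraction):
    parallel = (1 - overhead_fraction) / num_gpus  # work that splits cleanly
    return 1 / (overhead_fraction + parallel)

print(f"{speedup(2, 0.10):.2f}x")  # ~1.82x, not 2.00x
print(f"{speedup(4, 0.10):.2f}x")  # ~3.08x -- diminishing returns
```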
 

USAFRet

Titan
Moderator
It's a bit more complex than that.

1. They also have to code the game for single GPU systems. So this increases dev time ($$$).
2. You can't just dedicate half the screen to GPU A and half to GPU B. What happens when there is a 0.5 microsecond difference in delivery to the two screen halves?
3. How would GPU B deliver to the screen? Another cable? That won't work.

They have chosen to just let the 2 GPUs work together, with one delivering to the screen.
Presumably, in the early days of multi-GPU development, someone came up with exactly your idea. And it was tossed in favor of how they do it now.
 

Tech_TTT

Notable


Yes, two cables ... redesign the monitors as well.

Monitor mode one: full screen, one cable. Mode two: two half screens, two cables ... why not?

As for the difference in delivery between the screen halves, that's solved: we already run triple-monitor setups and there is no shifting between them.
 

Tech_TTT

Notable


It should be a left and right split then ... the perfect split.
 

TJ Hooker

Titan
Ambassador

I don't really see how this aspect is different from current SLI/Xfire (using alternate frame rendering). Each GPU takes turns rendering a frame, and all frames are sent to the monitor over a single output on the primary GPU.
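In sketch form (hypothetical names, just to show the frame assignment):

```python
# Sketch of alternate frame rendering (AFR): GPUs take turns rendering
# whole frames, and every frame is presented through the primary card's
# single output. Names are hypothetical.

def alternate_frame_rendering(num_frames, gpus):
    primary = gpus[0]  # only this card is wired to the monitor
    for frame in range(num_frames):
        renderer = gpus[frame % len(gpus)]  # round-robin frame assignment
        print(f"frame {frame}: rendered on {renderer}, presented by {primary}")

alternate_frame_rendering(num_frames=4, gpus=["GPU A", "GPU B"])
```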
 

Tech_TTT

Notable


Then odd and even vertical lines ... odd on one card and even on the other.
 


What about where the shadow of an object on the left half falls on the right half, etc.? Managing those edge cases is tricky, whereas with AFR, where each GPU alternately draws the whole screen, there are no edge cases because there is no edge.
 

TJ Hooker

Titan
Ambassador
AFR has been so strongly promoted in multi-GPU systems because it yields the highest potential performance benefit. SFR was an adequate alternative several years ago when games were not using such advanced rendering techniques, but now that we see geometric tessellation and complex shading effects becoming much more common, the pitfalls of screen-portioned rendering (split-frame, scissor frame, supertiling, etc.) become a lot more pronounced. Overdrawing is the biggest problem here; all of the vertices for scene geometry have to be transformed by each GPU even if they are not within the GPU's assigned region, meaning geometry performance cannot scale like it does with AFR, and any polygon between multiple rendering regions has to be fully textured and shaded by each GPU whose region it occupies, which is wasteful. Of course, there are also complications that can arise from inaccurate workload allocations.
https://forums.geforce.com/default/topic/527523/sli/modern-sfr-split-frame-rendering-compatabilty/post/3730397/#3730397
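To put rough numbers on the overdraw point in that quote (every figure below is an assumption for illustration):

```python
# Rough illustration of the overdraw problem quoted above: under a
# two-way screen split, each GPU still transforms ALL scene vertices,
# and triangles straddling the split are shaded by both GPUs.
# Every number here is assumed for illustration only.

vertices = 2_000_000   # scene geometry, transformed by *each* GPU
triangles = 600_000
straddling = 30_000    # assumed 5% of triangles cross the split line

# Geometry work per GPU does not halve -- it stays at 100%:
geometry_per_gpu = vertices              # vs. vertices / 2 under perfect scaling

# Shading work nearly halves, but straddlers are paid for twice:
shading_total = triangles + straddling   # vs. triangles on a single GPU

print(f"geometry per GPU: {geometry_per_gpu / vertices:.0%} of single-GPU load")
print(f"total shading:    {shading_total / triangles:.0%} of single-GPU load")
```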
 

Tech_TTT

Notable


That's the game engine's problem to deal with ... in the case of 2 GPUs ...

Looking at the split screen, I have a better idea: even and odd vertical lines ... ODD on one card and EVEN on another.
 

Rogue Leader

It's a trap!
Moderator


Again, latency. Separate pieces of silicon. There is NO way to get 100% throughput and have both cards in sync.
 



Tech_TTT

Notable


But this still applies to our triple-monitor gaming setups ... they follow the lowest fps of the 3 screens and sync the other two to it, no?
 

Tech_TTT

Notable


Okay
 

Rogue Leader

It's a trap!
Moderator


Except that by following the lowest fps you don't get the "perfect scaling" you are looking to attain. It is impossible.
 

TJ Hooker

Titan
Ambassador

The display for all three monitors is rendered as a single image, I believe. So all monitors will always have the same fps; there's no need to synchronize.
 
I seem to remember that when SLI first appeared it used to split each frame across both GPUs. You used to be able to have a bar up the side showing how much of each frame was drawn by each GPU, so in games with lots of blue sky the top GPU could draw 2/3 of the image and the second GPU would draw 1/3. They clearly moved away from this approach, I seem to think because of lag and stuttering, but it's so long ago I can't really remember.
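Something like this, I think, with the split line moving each frame (the adjustment rule below is a guess at the heuristic, not the actual driver logic):

```python
# Sketch of that adaptive split: shift the horizontal split line toward
# the slower GPU's region each frame, so the faster card takes on more
# rows. The fixed 5% step is an assumed heuristic, not the real algorithm.

HEIGHT = 1080

def rebalance(split_y, time_top_ms, time_bottom_ms, step=0.05):
    """Move the split so the GPU that finished faster gets more work."""
    if time_top_ms < time_bottom_ms:
        split_y += int(HEIGHT * step)   # grow the top GPU's share
    elif time_top_ms > time_bottom_ms:
        split_y -= int(HEIGHT * step)   # shrink the top GPU's share
    return max(0, min(HEIGHT, split_y))

split = HEIGHT // 2
# Lots of easy blue sky up top: top GPU takes 6 ms, bottom GPU 10 ms.
for _ in range(5):
    split = rebalance(split, time_top_ms=6.0, time_bottom_ms=10.0)
print(split / HEIGHT)  # -> 0.75: the top GPU now draws ~3/4 of the screen
```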
 

USAFRet

Titan
Moderator


Right.
It's not like, in the process of designing and developing the current approach, they completely ignored any and all other possible methods.

No...We tried A, we tried B, we tried C, we tried D.....C works the best, let's go with that.
Or....B was the best...now we have found something better with C.
If your personal favorite concept is B...oh well.
 
