SLI/CF a thing of the past?

My concern is with the differences in rendering. Fact is, while minor, there are differences between how ATI and NVIDIA render the same image. I also worry about advanced features like AA, since the two vendors use different methods for the same effect.

I see a lawsuit waiting to happen here; Intel did license SLI/CF for X58, but I doubt either vendor wants to deal with the support burden and lost profits of letting users mix and match. I also find the timing suspect, considering Larrabee is right around the corner...
 

thefox14

I don't see how it's possible to really sue them for what they're doing. Sure, if they were using the same process as ATI or Nvidia to render an image with more than one GPU, but they aren't. In fact, their process is completely different. The Hydra chip basically calculates the workload and splits it between the GPUs. It's also able to determine if one GPU is more powerful than the other, thus giving more work to that GPU.

Now, from what I know, the chip uses ATI's/Nvidia's drivers to drive the cards. What I could see happening is ATI or Nvidia blocking them from using their drivers, but we'll have to wait and see.

I've been following this chip for quite some time now and I can't wait for it to be released. I mean, if it lives up to what they're claiming, GPUs should scale nearly linearly: 50 fps in Crysis with one 4870 would become 100 fps with two. A 4870 paired with a 3870, or a 6600 GT with a GTX 280, would be completely possible.
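To put rough numbers on that claim, here's a minimal sketch of the idea in Python. This is a hypothetical illustration, not Lucid's actual algorithm: the function name, the "strength" scores (each card's standalone fps on a reference scene), and the balancing rule are all invented.

```python
# Hypothetical sketch of proportional load balancing -- NOT Lucid's actual
# algorithm. "Strength" is each card's standalone fps on a reference scene;
# work is handed out so each GPU's share matches its relative strength.

def split_workload(tasks, gpu_strengths):
    """Assign tasks to GPUs in proportion to each GPU's strength."""
    total = sum(gpu_strengths.values())
    shares = {gpu: s / total for gpu, s in gpu_strengths.items()}
    assignments = {gpu: [] for gpu in gpu_strengths}
    for task in tasks:
        # Give the next task to whichever GPU is furthest behind its share.
        gpu = min(assignments, key=lambda g: len(assignments[g]) / shares[g])
        assignments[gpu].append(task)
    return assignments

# A 4870 doing ~50 fps alone and a 3870 doing ~30 fps alone would, with
# perfect scaling, approach 50 + 30 = 80 fps combined.
print(split_workload(list(range(8)), {"HD4870": 50.0, "HD3870": 30.0}))
# -> {'HD4870': [0, 2, 4, 5, 7], 'HD3870': [1, 3, 6]}
```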
 
The timing is suspect, as Intel, I'm sure, has its own plans.
My guess is each card does what it does best, or uses its own approach to render, while Hydra just calculates which cards do which tasks best, assigns them accordingly, and works on down the line until each card is rendering to its ability.
In a way, this is better, if it's really being done this way, as each company has its own strengths. If one card has poor memory management and slower memory, that card does other things.
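If that guess is right, the allocation would look less like an even split and more like routing each kind of work to whichever card handles it best. A toy sketch, with task categories and strength scores made up purely for illustration:

```python
# Invented task categories and strength scores, purely to illustrate the
# "each card does what it's best at" guess above.

GPU_STRENGTHS = {
    "nvidia_card": {"geometry": 0.9, "shading": 0.7, "post_fx": 0.8},
    "ati_card":    {"geometry": 0.6, "shading": 0.9, "post_fx": 0.7},
}

def dispatch(task_type):
    """Route a task to whichever card scores highest for it."""
    return max(GPU_STRENGTHS, key=lambda gpu: GPU_STRENGTHS[gpu][task_type])

for t in ("geometry", "shading", "post_fx"):
    print(t, "->", dispatch(t))
# geometry -> nvidia_card, shading -> ati_card, post_fx -> nvidia_card
```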
What does this really do? It wipes out TWIMTBP, which plays into the hands of both ATI and, of course, Intel, since it hurts nVidia most and ATI to a much lesser extent. A great example is found here:
http://forum.beyond3d.com/showthread.php?p=1320641#post1320641
The discussion is about why the new Batman: Arkham Asylum demo doesn't have MSAA available on ATI cards. nVidia totally ignores ATI, working exclusively with the devs at this point, and ATI cards can't use it unless you apply a workaround. In other words, it's TWIMTBP at its finest: screw ATI, get in early, and tweak the game for nVidia.
And from my link, a typical way ATI does business:
"Originally Posted by Kaotik
And still, giving money and/or help shouldn't mean blocking the other vendor from using features they're capable of

Indeed, it doesn't. One case that I happen to know about was that on securing a co-marketing agreement the first thing the ISV engineers did was take a look at the game code and suggest optimizations that benefitted all DX10 cards, before going on to suggest further enhancements for DX10.1. Would have been very easy to wrap all of those changes purely to the DX10.1 path, but this way was better for the title as a whole."
Now, ATI could have just worked on its own path with the devs, but that wouldn't have been good for the game overall, or for the majority of users out there.
So, getting back to my hypothesis: Intel ruins the TWIMTBP approach by working around it, using nVidia when it wants with Hydra, and ATI when it wants for anything else.
And, like I said, nVidia certainly has more to lose with its somewhat suspect, narrower approach compared to ATI's broader one.
 

darkvine

This would be amazing if it's really true that you can mix and match. You could finally get one of the faster nvidia cards and then a cheaper ATI card for a boost, rather than being forced into two GTX 295s or something like that. It's one of the things I like most about ATI: being able to CF any card of the same series means I can get a 4890 and then throw in a 4850 if I need a bit more power, for cheaper.

However, I have my doubts about whether this will really work and, if it does, how well it will scale the two cards. It might come out weaker than just SLI'ing or CF'ing cards, and whether or not this will end in a courtroom somewhere remains to be seen. It's a bold move though, I'll give them that.
 
Since some people insist ATI should do more in regards to dev relations for games etc., I'm pointing out that this tool, owned/backed by Intel, destroys most of nVidia's influence in those games from here on out. Like I said, there's a new playah in town now, and nVidia will no longer be able to have it their TWIMTBP way.
This gives Intel easy leverage here, hardly affects ATI at all, and will actually propel the usage of DX10.1 and DX11 further and wider down the road.
It's good for gaming, great for Intel, and hurts nVidia.
If you take the "ATI should do more" approach, then maybe Creative should lock out all things not Creative, Intel should lock out AMD and vice versa, etc.
Some things to think about.
 
Old influences? Please, dual-GPU is still an "extra" that most games don't even fully support upon release. It's a niche market.

You also forget that ATI and NVIDIA render differently, AA being the primary (and most notable) example. You're also limited to the capabilities of the lesser of the two cards (if one is DX10 and the other DX11, you're stuck at DX10). And please, don't try to argue that Hydra could send the DX11 functions to the DX11 card; it simply won't be able to work like that.
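That constraint is just a lowest-common-denominator rule. A trivial sketch, with DirectX feature levels reduced to plain numbers (10.0, 10.1, 11.0) purely for illustration:

```python
# The "lesser of the two cards" rule, with DirectX feature levels reduced
# to comparable numbers (10.0, 10.1, 11.0) for the sake of the sketch.

def effective_feature_level(card_a: float, card_b: float) -> float:
    """A mixed pair can only expose what BOTH cards support."""
    return min(card_a, card_b)

print(effective_feature_level(11.0, 10.0))  # 10.0 -- the DX11 card is held back
```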

I see total fail here, from both a legal and a practical standpoint. To argue this would change how devs work with ATI/NVIDIA is outright insanity.
 

darkvine

I think he means how Nvidia has spent a long time slipping money into people's pockets to get games developed around their cards so they work best with their lineup (e.g. Far Cry 2 and Crysis), unless I misunderstood him.
 
Yes darkvine, that's exactly what I meant, and it was very obvious from what I wrote.
Follow my links and you'll understand. With ATI, the engineers make the game DX10 compatible first, then go on to suggest DX10.1.
Whereas with Batman, they shun AA support across the board and have it working only on nVidia.
The SLI angle gamer is trying to pull in? Not sure, but diminishing something that may become much more important if this tech actually works is counterproductive, much like not having DX10.1 or not giving Batman: Arkham full AA usage, etc.
 
There'll be no stopping ATI cards from rendering what Hydra assigns them, and the same goes for nVidia cards; it's just that Hydra chooses which is best at what and allocates accordingly.
There won't be partial usage, so drivers/rendering won't be an issue.
Look at my link from Peddie; he describes the front-end/back-end approach quite well.
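For what it's worth, that front-end/back-end idea could be sketched roughly like this. The class names, the least-loaded balancing rule, and the compositing step are all my own invention for illustration, not Lucid's published design:

```python
# Rough sketch of a front end that intercepts draw calls, parcels them out
# to back-end GPUs (each running its own vendor driver), and composites the
# partial results into one frame. All names here are invented.

class FakeGPU:
    def __init__(self, name):
        self.name = name

    def render(self, calls):
        # Pretend each draw call becomes a rendered fragment.
        return [f"{self.name}:{c}" for c in calls]

class HydraFrontEnd:
    def __init__(self, gpus):
        self.gpus = gpus                       # back ends with vendor drivers
        self.pending = {g: [] for g in gpus}   # work queued per GPU

    def submit(self, draw_call):
        # Hand each call to whichever GPU currently has the least queued.
        gpu = min(self.gpus, key=lambda g: len(self.pending[g]))
        self.pending[gpu].append(draw_call)

    def present(self):
        # Each back end renders its share; the front end merges the results.
        partials = [gpu.render(self.pending[gpu]) for gpu in self.gpus]
        self.pending = {g: [] for g in self.gpus}
        return compose(partials)

def compose(partials):
    # Stand-in for the real compositing step.
    return [fragment for part in partials for fragment in part]

fe = HydraFrontEnd([FakeGPU("ati"), FakeGPU("nvidia")])
for call in range(6):
    fe.submit(call)
print(fe.present())
# ['ati:0', 'ati:2', 'ati:4', 'nvidia:1', 'nvidia:3', 'nvidia:5']
```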