SLI/CF a thing of the past?

August 11, 2009 9:39:36 PM

http://www.iopanel.net/forum/thread31404.html
It'll be interesting to see how well this works. And what'll we call it? SLFIRE? CrossLi?

August 11, 2009 9:50:32 PM

The choice chip, hurrah! (I hope).

August 11, 2009 10:04:06 PM

Uh oh, now I almost want to get a new rig (it's only been 6 months).

Though I think I could stick with just CF'd 4870 1GBs.

Still, this sounds like an interesting idea.

August 11, 2009 11:24:23 PM

That didn't stop Intel before, why now?

August 11, 2009 11:24:30 PM

Quote:
Nvidia will never allow this surely.

Will they be able to prevent it from happening?

August 11, 2009 11:34:15 PM

I have no doubt that they are going to be narked, but I still have to wonder how they could do anything about it, unless it's tied into the decision to exclude ATi from PhysX in some way?

August 11, 2009 11:34:52 PM

There's no SLI chip required, so no monies owed.

August 11, 2009 11:36:49 PM

It's rendered (possibly in a different way, I haven't read up on all of it) through the Lucid drivers, so it doesn't need the SLI chip, which also carries the ~$5 fee for using it.

August 11, 2009 11:55:16 PM

I'm thinking it has to be rendered on their HW, using their chip, and even in their way. All of those approaches may not need to be used, other than the cards. It's still too unknown at this time, but I'm sure they're very aware of nVidia.

August 12, 2009 12:05:22 AM

Mousemonkey said:
Will they be able to prevent it from happening?


Never underestimate the power of the dark side!! :sol:

August 12, 2009 1:20:34 AM

Great find, jaydee; it'll be interesting to see how this plays out.

August 12, 2009 7:03:00 AM

Thanks for the link. Well, I have read about Lucid before, and from what I read it seems to be very effective; in one of the examples, it said that with Lucid they managed to run Crysis at a solid 60FPS with two 9600GTs.

August 12, 2009 9:39:17 AM

^ Unless you optimize the game for that exact setup; then it's possible.

August 12, 2009 12:17:46 PM

My concern is the differences in rendering. The fact is that, while minor, there are differences between how ATI and NVIDIA render the same image. I also worry about advanced features like AA, as the two vendors use different methods for the same effect.

I see a lawsuit waiting to happen here; Intel did license SLI/CF for X58, but I doubt either vendor wants to deal with the support burden and lost profits of letting users mix and match. I also find the timing suspect, considering Larrabee is right around the corner...

August 12, 2009 3:59:08 PM

I don't see how it's possible to really sue them for what they are doing. Sure, if they were using the same process as ATI or Nvidia to render an image with more than one GPU, but they aren't. In fact their process is completely different: the Hydra chip calculates the workload and splits it between the GPUs. It's also able to determine if one GPU is more powerful than the other, and so gives more work to that GPU.

Now, from what I know, the chip relies on ATI's/Nvidia's own drivers to drive the cards. What I could see happening is ATI or Nvidia blocking Lucid from using their drivers, but we'll have to wait and see.

I've been following this chip for quite some time now and I can't wait for it to be released. I mean, if it lives up to what they are claiming, GPUs should scale near linearly: 50fps in Crysis with one 4870 would become 100fps with two. A 4870 with a 3870, or a 6600GT with a GTX280, would be completely possible.
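
Purely as a sketch of the kind of balancing being described (Lucid hasn't published the actual algorithm; the function names and performance scores below are invented for illustration), the idea might look something like this:

```python
# Illustrative sketch only: Lucid has not published Hydra's algorithm,
# and the performance scores here are invented for the example.

def split_workload(tasks, perf_a, perf_b):
    """Divide render tasks between two GPUs in proportion to their
    relative performance, so the stronger card gets the bigger share."""
    share_a = perf_a / (perf_a + perf_b)
    cut = round(len(tasks) * share_a)
    return tasks[:cut], tasks[cut:]

def combined_fps(fps_per_card):
    """The near-linear scaling claim: total throughput is roughly the
    sum of what each card manages on its own."""
    return sum(fps_per_card)

# A 4870 paired with a 3870 (made-up scores 10 vs 6):
a, b = split_workload(list(range(100)), perf_a=10, perf_b=6)
print(len(a), len(b))          # 62 38 -- the 4870 gets the bigger share
print(combined_fps([50, 50]))  # 100 -- the "50fps becomes 100fps" claim
```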

August 12, 2009 4:12:55 PM

The timing is suspect, as I'm sure Intel has its own plans.
My guess is, each card does what it does, or uses its own approach to render, while Hydra just calculates which cards do which tasks best, assigns the work accordingly, and works on down the line until each card is rendering to its ability.
In a way, this is better, if it is really being done this way, as each company has its own strengths. If one card has poor memory management and slower memory, that card does other things.
What does this really do? It wipes out TWIMTBP, which plays into the hands of both ATI and of course Intel, while it weakens nVidia, and ATI to a much lesser extent. A great example is found here:
http://forum.beyond3d.com/showthread.php?p=1320641#post...
The discussion is about why the new Batman: Arkham Asylum demo doesn't have MSAA available for ATI cards. nVidia totally ignores ATI, working exclusively with the devs at this point, and ATI cards can't use it unless you use a workaround. In other words, it's TWIMTBP at its finest: screw ATI, get in early, and tweak the game for nVidia.
And from my link, a typical way ATI does business:
"Originally Posted by Kaotik
And still, giving money and/or help shouldn't mean blocking the other vendor from using features they're capable of

Indeed, it doesn't. One case that I happen to know about was that on securing a co-marketing agreement the first thing the ISV engineers did was take a look at the game code and suggest optimizations that benefitted all DX10 cards, before going on to suggest further enhancements for DX10.1. Would have been very easy to wrap all of those changes purely to the DX10.1 path, but this way was better for the title as a whole."
Now, ATI could have just worked on its own path with the devs, but that also wouldn't have been good for the game overall, for the majority of users out there.
So, getting back to my hypothesis, Intel ruins the TWIMTBP approach by working around it, using nVidia when it wants with Hydra, and ATI when it wants for anything else.
And, like I said, nVidia certainly has more to lose, given its somewhat suspect, narrower approach compared to ATI's broader one.
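
If the allocation really works the way I'm guessing above, you could picture it roughly like this. This is hypothetical: the task kinds and per-card strength numbers are made up, not anything Lucid or the vendors have published:

```python
# Hypothetical sketch of "each card does what it does best": score each
# kind of work against per-card strengths and route it to the stronger
# card. The strength values are invented for illustration.

STRENGTHS = {
    "card_a": {"geometry": 0.9, "texturing": 0.6, "post_fx": 0.7},
    "card_b": {"geometry": 0.5, "texturing": 0.9, "post_fx": 0.8},
}

def route(task_kind):
    """Pick whichever card is stronger at this kind of task."""
    return max(STRENGTHS, key=lambda card: STRENGTHS[card][task_kind])

for kind in ("geometry", "texturing", "post_fx"):
    print(kind, "->", route(kind))
# geometry -> card_a, texturing -> card_b, post_fx -> card_b
```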

August 12, 2009 4:24:04 PM

This would be amazing if it's really true that you can mix and match. You could finally get one of the faster nVidia cards and then a cheaper ATI card for a boost, rather than being forced into two GTX 295s or something like that. It's one of the things I like most about ATI: being able to CF any card of the same series means I can get a 4890 and then throw in a 4850 if I need a bit more power on the cheap.

However, I have my doubts whether this will really work and, if it does, how well it will scale the two cards. It might come out weaker than just SLI'ing or CF'ing cards, and who knows whether or not this will end in a courtroom somewhere. It's a bold move though, I'll give them that.

August 12, 2009 4:56:28 PM

Since some people insist that ATI should do more in regards to dev relations for games etc., I'm pointing out that this tool, owned/backed by Intel, destroys most of nVidia's influence in those games from here on out. Like I said, there's a new playah in town now, and nVidia will no longer be able to have it their TWIMTBP way.
This gives Intel easy leverage here, hardly affects ATI at all, and will actually propel the usage of DX10.1 and DX11 further down the road in a wider way.
It's good for gaming, great for Intel, and hurts nVidia.
If you take the "ATI should do more" approach, then maybe Creative should lock out all things not Creative, Intel should lock out AMD, and vice versa, etc.
Some things to think about.

August 12, 2009 5:08:38 PM

I'm just saying, this gives Intel tremendous leverage in breaking up old influences in gaming. They bought into this early on, so there have to be reasons.

August 12, 2009 7:40:05 PM

Old influences? Please, dual-GPU is still an "extra" that most games don't even fully support at release. It's a niche market.

You also forget that ATI and NVIDIA render differently, AA being the primary (and most notable) example. You are also limited to the capabilities of the lesser of the two cards (if one is DX10 and the other DX11, you are stuck at DX10). And please, don't try to argue that Hydra could send the DX11 functions to the DX11 card; it simply won't be able to work like that.
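
To put that lowest-common-denominator point another way, here's a trivial sketch (the function is illustrative, not any real API):

```python
# The "lesser of the two cards" point in one line: a mixed pair can only
# run at the feature level both cards support.

def effective_dx_level(card_a_dx, card_b_dx):
    return min(card_a_dx, card_b_dx)

print(effective_dx_level(11, 10))  # 10 -- the DX11 card runs as a DX10 card
```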

I see total fail on this, from both a legal and actual use perspective. To argue this would change how devs work with ATI/NVIDIA is outright insanity.

August 12, 2009 7:48:18 PM

I think he means how nVidia has spent a long time slipping money into people's pockets to get games developed around their cards, so they work best with their lineup (e.g. Far Cry 2 and Crysis), unless I misunderstood him.

August 12, 2009 8:08:48 PM

Yes darkvine, that's exactly what I meant, and it was pretty clear from what I wrote.
Follow my links and you'll understand. ATI's engineers make the game DX10 compatible first, then go on to suggest DX10.1.
Whereas with Batman, they shun AA support across the board and have it working only on nVidia cards.
As for gamer trying to play down multi-GPU? Not sure, but diminishing something that may become much more important if this tech actually works is counterproductive, much like not having DX10.1, or not giving Batman: Arkham full AA support, etc.

August 12, 2009 8:12:54 PM

There'll be no stopping ATI cards rendering whatever Hydra assigns them, and the same goes for nVidia cards; it's just that Hydra chooses which is best at what and allocates accordingly.
There won't be partial usage, so drivers/rendering won't be an issue.
Look at my link from Peddie; he describes the front-end/back-end approach quite well.
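
A rough sketch of that "no partial usage" idea: every batch of work would go wholesale to one card through that card's own driver, so the two vendors' rendering paths never mix within a task. Again, this is hypothetical; the names and the picker rule are invented:

```python
# Rough sketch of "no partial usage": every batch is assigned entirely
# to one card and rendered through that card's own driver, so ATI and
# nVidia rendering paths never mix within a single task.

def dispatch(batches, pick_card):
    """Assign each batch wholesale to a single card; nothing is split."""
    queues = {"ati": [], "nvidia": []}
    for batch in batches:
        queues[pick_card(batch)].append(batch)
    return queues

# Trivial picker for the example: heavier batches to the faster card.
queues = dispatch(
    [{"id": 1, "cost": 3}, {"id": 2, "cost": 9}, {"id": 3, "cost": 5}],
    pick_card=lambda b: "nvidia" if b["cost"] > 4 else "ati",
)
print(queues)  # each batch appears in exactly one queue
```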

August 12, 2009 11:54:31 PM

Only if it doesn't actually do as it claims, I agree. Also, I don't like the tie-ins/leverage it may bring even if it does, but bottom line, I do favor progress.

August 13, 2009 11:11:04 AM

Quote:
That is just bollocks, complete and utter bollocks unless you are talking about medium settings at like 12 x 10.

This is not magic, the capabilities of the cards are what they are.

Here is where I read it:
http://www.anandtech.com/video/showdoc.aspx?i=3385&p=3

"The demo we saw behind closed doors with Lucid did show a video playing on one 9800 GT while the combination of it and one other 9800 GT worked together to run Crysis DX9 with the highest possible settings at 40-60 fps (in game) with a resolution of 1920x1200. Since I've not tested Crysis DX9 mode on 9800 GT I have no idea how good this is, but it at least sounds nice."

Sorry, my bad, it was the 9800GT not the 9600GT. :)

August 13, 2009 11:55:06 AM

I want to see a game that renders differently between an ATI and NVIDIA card (10.1 is a good test case, but I also want a game that supports AA, as AA effects are done differently between the two vendors).

SLI'ing a 9800GT and a 9600GT does not impress me, as that's testing the best-case scenario.