
Crossfire or SLI

November 1, 2006 8:26:53 AM

I've been out of the hardware loop for far too long and have a question for those in the know.

What is the difference between a Crossfire setup and SLI? Are they basically the same with the only difference being ATI versus Nvidia?

Like I said, I've been out of the loop and I'm finding myself in a "what should I get for my next system" mud hole.

P.S. Like my last two systems, I'm building this one myself.


November 1, 2006 9:36:45 AM

To actually answer the question (mpilch, I'm looking at you ;) :p ):

Crossfire and SLI are ATi's and nVidia's respective implementations of multi-GPU rendering. They are not the same internally, but they do the same thing - accelerate graphics.

They are not compatible at all.

Don't get either for your next system! Stick with the best single card you can afford.
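
To picture what multi-GPU rendering buys you, here's a toy Python sketch of alternate-frame rendering (AFR), the load-splitting mode both vendors commonly use; every number in it is invented purely for illustration:

# Toy model of alternate-frame rendering (AFR): even frames go to
# GPU 0, odd frames to GPU 1, so in the ideal case frames can be
# delivered twice as often. Real scaling is lower due to driver
# overhead, CPU limits, and inter-GPU synchronization.

FRAME_TIME_MS = 20.0   # invented: ms one GPU needs to render a frame

def fps_single_card():
    # One GPU renders every frame back to back.
    return 1000.0 / FRAME_TIME_MS

def fps_afr_pair():
    # Two GPUs render alternate frames in parallel (ideal case).
    return 2 * 1000.0 / FRAME_TIME_MS

print(f"single card: {fps_single_card():.0f} fps")  # 50 fps
print(f"AFR pair:    {fps_afr_pair():.0f} fps")     # 100 fps, ideal
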
November 1, 2006 9:38:02 AM

Both are dual card setups.

Like mpilchfamily said, below 1600x1200 resolution, they're basically a waste of time and money.

And don't bother getting two crappy mid-range cards; get two top-of-the-line cards at the time of your system build, or you're just wasting money.
Related resources
November 1, 2006 9:57:02 AM

Also a waste of money: with ATI you need to buy a CrossFire master card, while with Nvidia you don't have to.
November 1, 2006 10:25:14 AM

Actually Fursecul, the new generation of ATI cards doesn't require a master card. Crossfire is the superior solution IMO, though SLI has its advantages.
November 1, 2006 3:02:30 PM

Does this theory work? Let's say you hit a higher peak framerate, say 120 fps, with a one-card setup at 1280x720 in a game. With the SLI setup you may hit only 100 because of the CPU's extra load in this instance. But if the scene gets intense and really works the machine graphically, will the single card possibly dip to 30 fps while the SLI setup never goes below 50? I'm thinking this because the higher graphics load could now warrant the extra graphics power.

These numbers are made up. They are only used for an example.

If this doesn't make sense, I will try to elaborate.
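
To make the scenario in this question concrete, here's a toy Python sketch; the CPU caps, GPU throughput, and scene loads are all invented. The idea: in light scenes the game is CPU-limited, so SLI's overhead lowers the peak, while in heavy scenes it's GPU-limited, so the second card props up the minimum.

# All figures are invented purely to illustrate the question above.
SCENE_GPU_LOAD = {"light": 1.0, "heavy": 4.0}  # relative GPU work/frame

GPU_FPS_AT_LOAD_1 = 120.0   # fps one card manages at load 1.0
CPU_CAP_SINGLE    = 120.0   # CPU-imposed fps ceiling, single card
CPU_CAP_SLI       = 100.0   # lower ceiling due to SLI driver overhead

for scene, load in SCENE_GPU_LOAD.items():
    single = min(GPU_FPS_AT_LOAD_1 / load, CPU_CAP_SINGLE)
    sli    = min(2 * GPU_FPS_AT_LOAD_1 / load, CPU_CAP_SLI)
    print(f"{scene:5s}  single: {single:4.0f} fps   SLI: {sli:4.0f} fps")

# light  single:  120 fps   SLI:  100 fps  (single card peaks higher)
# heavy  single:   30 fps   SLI:   60 fps  (SLI holds the minimum up)

Under those made-up numbers, at least, the theory holds: the dual-GPU rig can lose the peaks and still win the minimums.
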
November 1, 2006 7:03:58 PM

Yes, SLI has its faults too.
November 1, 2006 8:10:07 PM

Regarding the ATI CrossFire setup: double-check your cards. I believe only the new X1950s and X1650s do not require a master card, but don't quote me. I have an ATI X1900XT 512 GDDR3 and it requires a master card to run CrossFire. And with Vista on the horizon, I don't think there's any point in buying a dual-card setup, as someone else stated, if you don't play beyond 1600x1200.

Has anyone even gotten SLI or Crossfire to work on Vista?
November 1, 2006 8:28:01 PM

Quote:


Has anyone even gotten SLI or Crossfire to work on Vista?


I haven't tried, but I have my doubts it will work. Both nVidia's and ATI's drivers have been, well, subpar even for beta drivers. Many features have been disabled for now; I haven't been able to get OpenGL to work properly (Prey won't run at all), and with Vista RC2 I don't have the control center. I may be able to get that to work if I ignore the warning on their site and roll back to their "older" drivers.

Not to say that Vista itself is bad, quite the contrary. I have been running it since Beta 1 and have enjoyed watching it progress, and I think it will be rather nice, if 3rd-party support ever improves. I was disappointed with XP x64, as its 3rd-party support sucked during beta and still sucked when it went gold. Hope this doesn't happen again. Creative is one of the worst driver offenders though, which is disappointing considering their size.

Sorry about the hijack, didn't mean to get this long-winded. :D
November 1, 2006 8:33:50 PM

My apologies as well for helping hijack the thread :oops:

Both have their fields of glory.
November 1, 2006 10:05:46 PM

The compositing engine does appear to be in the GPU itself; no 3rd-party on-board chip.

The X1300, X1600, X1650 and the low-end X1800GTO can do Xfire through the PCIe bus.

(It does seem to feed straight into the VPU, not a separate chip.)

Integrating it into the CHIP is IMO a waste of silicon, but if you're going to put it on every card anyway, then maybe moving that way makes economic sense; it has to be a more open implementation than nV's, though, which is locked in at 1-to-1.

(And from the looks of things, ATi's may be 2-channel on-chip, which means they don't have the same issues nV does.)

Edited to correct mistaken a$$umption.
November 1, 2006 10:24:12 PM

FAQin CRAP!

You know you read some early reviews and you think you know for sure...

Let me check again. I was certain I read that it's not on-chip like some people thought, but the two reviews I just pulled up for results have conflicting information.

Depending on the way the ATi compositing chips work, it might not matter, because there may be a limitation therein. However, it's easier to upgrade a compositing chip to add new functionality than to respin a VPU to start adding features.

I'll get back to you on this; I'm just leaving work in 5 mins.
November 2, 2006 11:44:53 AM

Yea, the CrossFire functions are now designed directly into the GPU, from what I've read. Seems like a good way to do it to me; it saves on costs that hopefully can be passed on to consumers.

From Anandtech: RV570 includes an integrated compositing engine for what ATI calls "native" CrossFire support. At the heart of the changes to CrossFire is the movement of ATI's compositing engine from the card onto the GPU itself.

They mention that it will make the GPUs themselves more expensive, but shouldn't this still be a cheaper alternative than fabbing a separate chip and grafting it onto a card?

From Hardocp: ATI has placed the CrossFire compositing engine within the ASIC itself on the Radeon X1950 Pro.

Pretty much everything I can find mentions CrossFire being built into the GPU; where did you see otherwise? Anyway, IMO CrossFire is superior to SLI. I hope the trend of enabling CrossFire on other-than-ATI chipsets continues, though I doubt it will ever be enabled on an Nforce motherboard... Neither solution is anywhere near mainstream yet, and probably won't be for some time. Hopefully the introduction of 65nm multi-core GPUs isn't too terrifically far in the future.
November 2, 2006 3:40:46 PM

Quote:
Yea, the crossfire functions are now designed directly into the GPU from what I've read.


Yeah, I misread the FiringSquad article as I skimmed it and thought the Xilinx chip WAS still there; they mentioned it in the context of the older card. I usually wait for the more detailed reviews like Beyond3D or Digit-Life or the TechReport (although recently they are more fluffy). I'm still hoping that someone will detail the actual integration of that portion of the chip in the X1950P and X1650XT.

Quote:
Seems a good way to do it to me, saves on costs that hopefully can be passed on to consumers.


Well, like you mentioned later, it's actually an added cost to general single-card consumers, but it is a savings to CrossFire consumers. The only drawback is that now whatever limitations exist are hardwired into the chip, not into an easily replaceable board component.

Quote:
They mention that it will make the GPU's themselves more expensive, but shouldn't this still be a cheaper alternative than fabbing a seperate chip and grafting it onto a card?


Except you used to save that cost on cards not destined for people who want Xfire, and now every card carries it even if it's never used. So it still depends on costs. But the impact should be minor for the average person, and it should save some money for the Xfire people. As an extreme example of how it could've still been more expensive: suppose it cost them 51% of the cost of the Xilinx addition (or 50% with a higher failure rate) to integrate it into the chip. It would mean everyone pays that premium, and even the Xfire people pay 102% of the cost. But it's likely much less, and it offers much more. (The problem with the high end was finding the Xfire cards at a later date; just try finding an X1800CF MASTER anywhere but eBay.)
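
For what it's worth, that 51% scenario works out like this as a quick Python sketch (the cost figures are the hypothetical above, not real numbers):

# Take the old separate Xilinx compositing chip's cost as 1.0 and
# suppose, purely hypothetically, that integrating the engine adds
# 51% of that cost to every GPU, used or not.
XILINX_COST     = 1.00   # old add-on chip, master card only
INTEGRATED_COST = 0.51   # assumed extra silicon cost per GPU

old_single, old_pair = 0.0, XILINX_COST             # only masters paid
new_single, new_pair = INTEGRATED_COST, 2 * INTEGRATED_COST

print(f"single-card premium: {new_single:.2f} (was {old_single:.2f})")
print(f"Xfire pair premium:  {new_pair:.2f} (was {old_pair:.2f})")
# single-card premium: 0.51 (was 0.00)
# Xfire pair premium:  1.02 (was 1.00)  -> the "102%" above
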

Quote:
Pretty much everything I can find mentions the crossfire being built into the GPU, where did you see otherwise?


Well, I corrected my statement about 18 hrs before you posted, so if you re-read you'll see it didn't say that anymore; I corrected it before going home last night. But on-chip is still limited like I said. What the limits are right now, I don't know, but at least it doesn't have the 1:1 limit like SLI, and it's 2-channel. The only limit we know of is resolution per channel.

Quote:
Anyway, IMO Crossfire is superior to SLI. I hope the trend of enabling Crossfire on other-than-ATI chipsets continues, though I doubt it will ever be enabled on an Nforce motherboard...


It already has, for quite some time, but it was always 'unofficial', and system integrators like VooDoo etc. had to do it themselves with extensive hacks. But it's still not officially supported, just like the SLI hacks.

Quote:
Neither solution is anywhere near mainstream yet, and probably won't be for some time. Hopefully introduction of 65nm multi-core GPUs isn't too terrifically far in the future.


And that's just it: as long as process shrinks can bring us noticeable leaps in performance, the 'need' for Xfire/SLI is somewhat limited. The time it makes the most sense is when putting two 500mil-transistor chips on one card, or two cards in tandem, far outweighs the issues of trying to fab a 1-billion-transistor VPU. I think we're very near that litmus test now with those 600+ mil transistor next-gen chips. Current 300mil-transistor designs aren't quite competitive, but then again they are also 2-year-older designs. The GF7900GTX is just under 300mil transistors, and the R580 is a fair amount over (just under 400mil), so if two GF7900GTXs can come anywhere near matching one G80, that'd be a consideration; but if the G80 still way outpaces them due to improvements, then it's still too early for SLI to be a 'mainstream' concern.
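
The fab-vs-tandem trade-off in that last paragraph can be sketched with the standard (and much-simplified) Poisson defect model, where yield drops exponentially with die area; the defect density and area-per-transistor figures below are invented for illustration:

import math

DEFECTS_PER_CM2 = 0.5            # invented defect density
CM2_PER_100M_TRANSISTORS = 0.6   # invented; same process for both

def die_stats(transistors_millions):
    area = transistors_millions / 100.0 * CM2_PER_100M_TRANSISTORS
    yld  = math.exp(-DEFECTS_PER_CM2 * area)   # chance of a good die
    return yld, area / yld                     # wafer cm^2 per good die

for label, t, n in [("one 600mil-transistor VPU", 600, 1),
                    ("pair of 300mil-transistor VPUs", 300, 2)]:
    yld, cost = die_stats(t)
    print(f"{label}: yield {yld:.0%}, ~{n * cost:.1f} cm^2 wafer/good part")

# one 600mil-transistor VPU: yield 17%, ~21.8 cm^2 wafer/good part
# pair of 300mil-transistor VPUs: yield 41%, ~8.9 cm^2 wafer/good part

Under those made-up constants, two small dies cost far less wafer area per good part than one big die, which is exactly why the tandem option starts to look attractive as transistor counts climb.
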
November 2, 2006 4:06:30 PM

No, I agree with you completely; both CrossFire and SLI are stop-gap measures to meet a demand for graphics computation that, right now, is outpacing the technology. Games like Oblivion and FSX are simply too intensive for modern GPUs. CPUs, on the other hand, have had some breathing room: while new tech has been introduced, software has been a lot slower to take advantage of the improvements, and now we're in a situation where a 3- or 4-year-old CPU can still "hang" as long as there's a top-of-the-line GPU in the system.

And you're right about the X1650 and X1950P probably costing more for those who don't need or want CrossFire, but I'll bet the cost difference is pretty small in terms of real dollars. Just chalk it up to the price of progress....heh heh heh.

In the end, I think it's the GPU field that's going to be exciting to watch over the next 5 or so years. CPU advancements have been receiving most of the hype lately, but as I said, the CPU is already sitting pretty and isn't really that greatly in need of improvement. DX 10 looks poised to increase pressure on GPU manufacturers, so we'll just have to brace ourselves for a flood of products, some poorly conceived and some revolutionary.