SLI / CrossFire FAQs - Page 4
Tags:
- Product
- Nvidia
- Crossfire
- SLI
- Graphics
- Graphics Cards
And here is the final answer:
http://www.anandtech.com/video/showdoc.aspx?i=3209
AMD is the first out of the gates with the Radeon HD 3870 X2, based on what AMD is calling its R680 GPU. Despite the codename, the product name tells the entire story: the Radeon HD 3870 X2 is made up of two 3870s on a single card.
-
Reply to Maziar
Let me fix the quote for you.
Quote:
AMD is the first out of the gates with the Radeon HD 3870 X2, based on what AMD is calling its R680 GPU. Despite the codename, the product name tells the entire story: the Radeon HD 3870 X2 is made up of two 3870 GPUs on a single card.
The 9800GX2 is two PCBs facing each other, bleeding heat into the same heatsink, with a fan running air over it for cooling. Each PCB has a single GPU on it. The 3870x2 is a single PCB with two GPUs on it. The GPU in the back has an aluminum cooler, while the GPU in the front has a copper one. It does indeed have two 3870s on it, but they are facing the same direction, on the same PCB. If you count "cards" based on the number of PCBs, then the 9800GX2 is a dual card, while the 3870x2 is not.
-
Reply to 4745454b
I do not count PCBs as a card each.
By that logic, QuadFX was a quad core just like Intel's (and not two dual cores with all the downsides people mentioned... even Intel's is 2 x dual cores anyway), since it was on the same board with two sockets and two memory controllers.
The X2 and GX2 do the same thing; with one or two boards, they still have two full cards (minus the video outputs). And that's the limit, since each card has to have its own memory... imagine if they could have one gig that both cards could use (like how the Core 2 has a shared cache; now that would be a dual core GPU, not two cards slapped together). As it stands now, high res and high AA do affect the small 512 MB buffer.
Just my 2 cents
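To put rough numbers on that 512 MB point, here is a back-of-the-envelope sketch in Python (plain framebuffer math only; it ignores textures, driver overhead, and compression, so treat the figures as illustrative, not as specs):

```python
# Rough estimate of framebuffer memory at a given resolution and MSAA level.
# Assumes 4 bytes/pixel color + 4 bytes/pixel depth/stencil, double buffering,
# and that MSAA multiplies the color and depth surfaces by the sample count.
# In CF/SLI alternate-frame rendering each GPU holds its own copy, so two
# 512 MB cards still behave like a single 512 MB card for capacity purposes.
def framebuffer_mb(width, height, msaa_samples=1):
    pixels = width * height
    color = pixels * 4 * msaa_samples * 2   # two color buffers (double buffered)
    depth = pixels * 4 * msaa_samples       # depth/stencil buffer
    return (color + depth) / (1024 * 1024)

for res, aa in [((1280, 1024), 1), ((1920, 1200), 4), ((2560, 1600), 4)]:
    mb = framebuffer_mb(*res, msaa_samples=aa)
    print(f"{res[0]}x{res[1]} {aa}xAA: ~{mb:.0f} MB just for the framebuffer")
```

At 2560x1600 with 4xAA that is already roughly 200 MB before a single texture is loaded, which is why high res plus high AA squeezes a 512 MB buffer.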
-
Reply to nukemaster
@Maz, because they are counting the GPUs like you are. Two 3870x2s in CF would be four 3870 GPUs, hence "quadfire". As I said, it's all in how you count. PCBs: you have two. GPUs: you have four.
@nuke, actually you might have just proved my point.
Quote:
The X2 and GX2 do the same thing; with one or two boards, they still have two full cards (minus the video outputs).
The X2 only has the three standard outputs that all video cards have. The 9800GX2, however, has four: two for each card. Go read some reviews and figure out what's underneath those heatsinks. Obviously you guys aren't understanding/believing me.
3870x2 http://www.newegg.com/Product/Product.aspx?Item=N82E168...
9800GX2 http://www.newegg.com/Product/Product.aspx?Item=N82E168...
-
Reply to 4745454b
4745454b said:
The X2 only has the three standard outputs that all video cards have. The 9800GX2, however, has four: two for each card. Go read some reviews and figure out what's underneath those heatsinks. Obviously you guys aren't understanding/believing me.
3870x2 http://www.newegg.com/Product/Product.aspx?Item=N82E168...
9800GX2 http://www.newegg.com/Product/Product.aspx?Item=N82E168...
I only see three outputs, two DVI and one HDMI, but for all I know the DVI and HDMI are linked (like on many DVI/HDMI onboard cards)...
Either way, it's not four, so...
I know EXACTLY what's under the heatsinks: two cards on one PCB (ATI) and two cards on two PCBs (Nvidia). Either way, the lack of S-Video for SD TV users on the 9800GX2 SUCKS...
Look, an X2 with four video outputs:
http://www.techspot.com/review/86-ati-radeon-hd-3870-x2...
-
Reply to nukemaster
The reason Anand says it is two is because there are two GPUs, and you are already using one of the CF links. When you say card, I am old school and picture the PCB. It's a bit like saying the C2D is two CPUs: it's two cores in one package. I don't see how this is any different.
@nuke, I was referring to the only pic of the 9800GX2 that I've seen. Looking back at it, there are the two DVI ports, one HDMI, and the fourth that I saw is probably an optical out for audio. Tell me what you think it is.
http://www.tomshardware.com/2008/01/05/exclusive_nvidia...
-
Reply to 4745454b
I would almost call a PD and Core 2 Quad two cores (well, two dies) on one substrate.
Anyway, there is no point in arguing about this. It's a video card... good enough...
Oddly, they appear to have dropped the optical port. Any clue if Nvidia even has an onboard sound card for HDMI, or could the optical have been an input for HDMI pass-through? (Warning: wild guess!!!)
A quick Google shows this may be true...
http://ketzone.com/blog/?p=152
Maybe it was dropped because people thought it was an output when in fact it's just an input for the HDMI port.
-
Reply to nukemaster
TuVNeRa
April 1, 2008 6:18:56 PM
If I'm reading everything right, there is no problem. If you have one 3870x2, you can't/don't need to enable CF. The CF link is made internally on the card; it's invisible to the user. This also allows you to use the 3870x2 on non-CF boards, either single-slot or Nvidia chipset motherboards. You would need two 3870x2s to enable CF, which I don't see in your device manager. (You have two 3870x2s listed, but you should get one listing for each DVI port. That's at least how it works with my older x1800xt.)
Anyone feel free to correct me if I'm wrong.
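(Side note for anyone checking this on their own machine: assuming a Windows box, here is one quick way to dump what the OS enumerates as display adapters, which is the same data Device Manager shows. Listing names vary by driver version, so treat this as a sketch.)

```python
import subprocess

# List the video controllers Windows enumerates (what Device Manager shows).
# A 3870x2 typically appears twice here even though it is one physical card.
output = subprocess.check_output(
    ["wmic", "path", "win32_VideoController", "get", "Name"],
    text=True,
)
adapters = [line.strip() for line in output.splitlines() if line.strip()]
for name in adapters[1:]:  # skip the "Name" header row
    print(name)
```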
-
Reply to 4745454b
TuVNeRa
April 1, 2008 7:38:51 PM
blotch
April 1, 2008 8:29:38 PM
You may want to include something about the stability of one card vs. multiple cards. Moving from one 1600xt to two, and when moving from one 3870xt to two, I started having some stability problems.
Also, I love my CrossFire setups, but when I moved from two 1600xts to a 1900xt it was an insane improvement. I know there are lots of reasons for the difference, but even though adding more cards to a system will improve performance, I still like the idea of the single "monolithic" card, like mine was for me or how the 8800GTX was for a long time.
I got excited when the 1GB/512-bit 2900xts came out, until they sucked. I hope ATI tries that again with these newer multi-core parts to make CrossFire more effective. I think even 1GB/256-bit may work with PCI-E 2.0, but I wouldn't know if a 256-bit bus is wide enough for a gig of GDDR3/4.
Anywho nice post.
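On that 256-bit question, a rough calculation helps. Bandwidth depends on bus width and effective memory clock, not on how much memory hangs off the bus, so a gig on 256 bits is no slower per byte than 512 MB on 256 bits. The clocks below are illustrative only, not specs for any particular card:

```python
# Memory bandwidth in GB/s = (bus width in bits / 8 bytes) * effective clock in GHz.
def bandwidth_gbs(bus_bits, effective_clock_ghz):
    return bus_bits / 8 * effective_clock_ghz

# Illustrative: a 256-bit bus vs. a 512-bit bus at the same ~2 GHz effective
# GDDR3 clock. Capacity (512 MB vs. 1 GB) is independent of this number.
print(bandwidth_gbs(256, 2.0))  # 64.0 GB/s
print(bandwidth_gbs(512, 2.0))  # 128.0 GB/s
```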
-
Reply to blotch
TuVNeRa
April 1, 2008 9:17:49 PM
vincegreg
April 28, 2008 9:27:28 PM
vincegreg
April 28, 2008 9:27:53 PM
fast_furious
April 29, 2008 2:09:31 PM
-
Reply to fast_furious
Hello and welcome to the forums, mate.
Can you use SLI on a CrossFire board, or can you use CrossFire on an SLI board?
Well, in general, the answer is NO. But it's said that if you hack the drivers, you can use SLI on a CrossFire board or CrossFire on an SLI board.
Caution: There is no guarantee that hacking the drivers will let you use SLI on a CrossFire motherboard or CrossFire on an SLI board, so do it at your own risk!
-
Reply to Maziar
fast_furious
April 29, 2008 10:29:05 PM
Maziar said:
Hello and welcome to the forums, mate.
Can you use SLI on a CrossFire board, or can you use CrossFire on an SLI board?
Well, in general, the answer is NO. But it's said that if you hack the drivers, you can use SLI on a CrossFire board or CrossFire on an SLI board.
Caution: There is no guarantee that hacking the drivers will let you use SLI on a CrossFire motherboard or CrossFire on an SLI board, so do it at your own risk!
Thx mate, do you know anyone who did it? Any sites, mates?
thx
-
Reply to fast_furious
fatty35
May 14, 2008 4:08:50 PM
Just wanted to add a couple of links supporting the usefulness of SLI at low res in rare cases like Crysis. You have to look at the 9800GX2, which is basically SLI G92 in a dual-PCB, single-PCI-e-slot fashion.
Crysis very high:
1280x1024 4xaa/16xaf
http://www.digit-life.com/articles2/digest3d/0408/itogi...
Notice the spanking the GX2 gives a single G92 like the 8800GTS 512MB: 30 fps vs. 18 fps. No single GPU hits 19 fps in this one.
1280x1024 very high no aa/af
http://www.digit-life.com/articles2/digest3d/0408/itogi...
Even without FSAA/AF, it's still 35 fps vs. 22 fps, showing the playable difference SLI makes even at 12x10 with no FSAA.
Now going down to 1024x768 with AA, there is still a big difference, 37 fps vs. 25 fps:
http://www.digit-life.com/articles2/digest3d/0408/itogi...
And at a puny 1024x768 with no FSAA, it's 38 fps vs. 27 fps average.
http://www.digit-life.com/articles2/digest3d/0408/itogi...
There's proof right there that SLI doesn't need high resolution to scale properly. It just needs GPU-demanding settings at whatever the resolution.
Anyway, you will notice that the 9800GX2 leads a lot of their low-res gaming tests. But in most games the single GPUs still do well enough. If you don't hit a CPU limitation, SLI can scale well at low res; whether or not you need dual GPUs to be playable is a different story. But without question, Crysis very high warrants SLI at any resolution, as a single GPU struggles.
This goes along with why I think 9600GT or 8800GT SLI is a good option for people to consider over a single GPU of equal pricing. More often than not, 9600GT SLI will beat a single 9800GTX, and sometimes it's a crushing win. And to be honest, SLI and driver support have matured a lot. It would be the rare game that loses because of SLI issues, and most of the time, in those games, a single midrange 9600GT or 8800GT will be plenty anyway. It could be that a new game needs a patch or driver support for SLI to work or scale properly, but TWIMTBP usually makes that a top priority for NV to save face. Not saying there aren't good reasons to consider a single GPU vs. SLI (there are), but there are also good reasons to consider certain SLI midrange solutions over a top single GPU. The introduction of the 9600GT and low 8800GT prices has made them excellent SLI options.
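For a quick sanity check on how well that scales, here is the arithmetic on the averages quoted above from the Digit-Life charts ("x" is the GX2's speedup over the single 8800GTS 512MB):

```python
# Speedup of the 9800GX2 (SLI G92) over a single 8800GTS 512MB in Crysis
# very high, using the average fps figures quoted above.
results = {
    "1280x1024 4xAA/16xAF": (30, 18),
    "1280x1024 no AA/AF":   (35, 22),
    "1024x768 with AA":     (37, 25),
    "1024x768 no AA":       (38, 27),
}
for setting, (gx2, gts) in results.items():
    print(f"{setting}: {gx2 / gts:.2f}x")
```

That works out to roughly 1.67x down to 1.41x: still solid scaling at 1024x768, with the gap narrowing as the resolution drops and the CPU limit creeps in, which is exactly the point about GPU-demanding settings mattering more than resolution.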
-
Reply to pauldh
No problem. I saw you made the changes, and then I just saw those Digit-Life very high Crysis tests this morning and thought they were worth getting into this thread as some hard data to back up the changes. These were the best low-res, very high Crysis benchies I have seen.
Anyway, for anyone considering a high-end GPU like a 9800GTX vs. SLI 9600GT: for $250-300 right now (USA anyway), save money and buy an 8800GTS G92, or go SLI 9600GT or SLI 8800GT. Obviously the exact price varies for everyone (around the globe), so recommendations would change.
Check each game here: http://www.firingsquad.com/hardware/nvidia_geforce_9800...
-
Reply to pauldh
Jeanmarie576
May 29, 2008 1:15:12 AM
I know this might sound ridiculous after reading this very informative info about SLI/CrossFire, but I still have something to ask which I didn't quite get. I have an SLI mainboard with an Nvidia chipset, running an ATI X1600XT (CrossFire card). Now the question is: can I use another ATI X1600XT (CrossFire or non-CrossFire card) to make an SLI? And thanks to you, Maziar, for this post. I have indeed learned a lot I didn't know before.
-
Reply to Jeanmarie576
Jeanmarie576 said:
I know this might sound ridiculous after reading this very informative info about SLI/CrossFire, but I still have something to ask which I didn't quite get. I have an SLI mainboard with an Nvidia chipset, running an ATI X1600XT (CrossFire card). Now the question is: can I use another ATI X1600XT (CrossFire or non-CrossFire card) to make an SLI?
The short answer is no. The long answer is that it is, or at least was, possible with hacked drivers on some hardware. I'd say go with the short answer, as the chances of the long one working for you are likely nil.
-
Reply to pauldh
Thanks a lot, Paul, for your help.
To Jeanmarie:
I have talked about it on the first page:
Can you use SLI on a CrossFire board, or can you use CrossFire on an SLI board?
Well, in general, the answer is NO. But it's said that if you hack the drivers, you can use SLI on a CrossFire board or CrossFire on an SLI board.
Caution: There is no guarantee that hacking the drivers will let you use SLI on a CrossFire motherboard or CrossFire on an SLI board, so do it at your own risk!
And by the way, I am glad you liked it and found it useful.
-
Reply to Maziar
Jeanmarie576
May 29, 2008 10:44:33 PM
-
Reply to Maziar
Eleazaros
June 8, 2008 1:26:17 AM
I read the article, and several others, and have a couple of questions/comments. I'm looking at picking up a new high-end system soon, and the SLI/CrossFire issue came up for me, so I've been researching it while shopping for the system.
First, the "proprietary" hardware architecture from both manufacturers raises a frown from me. I'm not much into the old system of "my way": MicroChannel flopped with that versus the more open EISA standard (which was slower) in that generation, and other such attempts at proprietary hardware have flopped over time due to folks getting annoyed about it. But if a "standard" is to be born, it often starts in such a way, so it's something I can live with...
One thing that struck me about all of this is how the two technologies seem to run with respect to "operating" and "deployment/maintenance". If you find errors in this, please let me know.
SLI uses a recognition model similar to the old Voodoo 2 model: the drivers recognize the software "image" in order for multiple cards to function with a given software package. If the drivers don't recognize it, the video config drops back to a 'default' behavior of one card and simply skips all the fancy multi-card stuff, so for most "non-supporting" software, you gain nothing from having two cards.
That old "Voodoo" model was hell. Every time any of your games or apps was updated, the "image" changed, so the card would stop working in dual mode until you updated the drivers. With modern-day update services, this shouldn't be a big issue (the checking can be automated 'in the background'), but is that going to come from Nvidia or from each app manufacturer? (As in, any software supporting the system has to submit app image information to Nvidia, and Nvidia releases the driver updates "when they get to it" or whatnot.) Then you have the "fragmentation" potential: you have 20 different games and apps that get patched differently, so each goes out to update the drivers, or a consolidated update is done daily or whatnot (meaning you lose "multi-card support" until drivers are updated, at times). See where I'm going with "image" and "drivers"? A LOT of potential changes to your "drivers" if this is still the model being proposed. Just picture a "bad update" where NOTHING recognizes two cards, or the like...
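(To make that recognition model concrete, here is a toy sketch of a per-application profile lookup of the kind being described. This is purely illustrative pseudologic of the idea, not NVIDIA's actual profile format, detection mechanism, or API; the executable names are made up.)

```python
# Toy model of driver-side app detection: the driver keeps a table of known
# applications and their multi-GPU rendering modes, and anything it does not
# recognize silently falls back to single-GPU rendering.
SLI_PROFILES = {
    "crysis.exe": "AFR",  # alternate frame rendering
    "ut3.exe":    "SFR",  # split frame rendering
}

def pick_render_mode(exe_name: str) -> str:
    # Unrecognized (or just-patched and therefore changed) applications get
    # no multi-GPU benefit until a driver update ships a profile for them.
    return SLI_PROFILES.get(exe_name.lower(), "single-GPU")

print(pick_render_mode("Crysis.exe"))        # AFR
print(pick_render_mode("newgame_v1.2.exe"))  # single-GPU
```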
CrossFire is more transparent to the user. It tries to work with whatever is being sent to the video card. It doesn't require custom drivers to be updated for the cards to recognize the "supporting software"; instead, the cards try to render whatever is to be displayed on the screen as best they are able. It also only uses the "single card" fallback as a "last resort", after trying to render with multiple cards based on "advanced" user-configurable settings in the drivers.
Now, I gleaned this from a few different articles, and the issues don't seem covered very well, but... as you can see, one puts the load on companies to release driver updates with every patch from "hundreds" of application companies, while the other is transparent. I think I know which one I'll be more comfortable choosing. That transparency is something I kind of like, versus a potentially "busy system" that keeps updating drivers...
Again, if you have any info on this maintenance/"use" side of the two technologies, I'd appreciate it. I really don't have much on it beyond references to how SLI decides to "fall back to default mode" and how CrossFire has configuration options for how multiple cards should handle video output "by default"...
-
Reply to Eleazaros
ErikS22
June 13, 2008 5:35:19 PM
jtabler
June 23, 2008 3:04:27 PM
I think it would be nice to have a bit about multiple monitors as well.
1) You have two or three monitors and two GTX 260s or 3870 X2s. Let's say your middle monitor is 30" and the side monitors are 20-24". Or let's say you have three 20" monitors running full screen.
What are the possibilities for gaming? For gaming, will the side monitors be disabled so both cards power the single monitor? Will all three monitors remain active, with the primary monitor getting a boost from the second card? Can all three monitors be used in a game without a solution such as TripleHead2Go?
2) What about dual-monitor support? How do these multi-card solutions work with dual monitors?
I've been told that ATI supports triple monitors by allowing the side monitors to be disabled automatically, whereas Nvidia makes it a pain.
I don't know any of these answers, but I sure would like to. I think it would be a great addition to your fantastic FAQ.
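(For anyone experimenting with this, here is a quick way to see how many monitors Windows currently exposes and the combined desktop size, assuming a Windows system. Whether a game can actually span them in SLI/CF mode is a separate driver question this does not answer.)

```python
import ctypes

# Standard Win32 GetSystemMetrics indices for display information.
user32 = ctypes.windll.user32
SM_CMONITORS = 80        # number of display monitors
SM_CXVIRTUALSCREEN = 78  # width of the virtual desktop spanning all monitors
SM_CYVIRTUALSCREEN = 79  # height of the virtual desktop

print("monitors:", user32.GetSystemMetrics(SM_CMONITORS))
print("virtual desktop:",
      user32.GetSystemMetrics(SM_CXVIRTUALSCREEN), "x",
      user32.GetSystemMetrics(SM_CYVIRTUALSCREEN))
```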
-
Reply to jtabler