SLI / CrossFire FAQs - Page 3
Tags:
- Product
- Nvidia
- Crossfire
- SLI
- Graphics
- Graphics Cards
TheForumMember
December 16, 2007 2:51:36 AM
T8RR8R
December 16, 2007 3:42:29 AM
wolfheart1979
December 17, 2007 11:17:03 AM
I bought 2 ATI 3850 Sapphire cards and they are configured in CrossFire.
I have a couple of problems, and I would appreciate it if someone who has had the same problem could provide any info:
1. After a couple of seconds in-game, Call of Duty 4 crashes, and when I try to open it again it says DIRECTX ERROR. PS: I am playing with the ATI 7.11 drivers.
My question: is there any other configuration in Catalyst?
ishbog
December 28, 2007 1:49:44 AM
chron1
January 5, 2008 7:30:40 AM
chron1
January 5, 2008 7:59:22 AM
random1283
January 8, 2008 8:16:24 AM
Is it really necessary to use SLI or CrossFire ?
Well, it really depends mostly on the resolutions and games you play. Resolutions like 1920x1200 and higher benefit more from SLI or CrossFire than resolutions like 1600x1200 or lower. (I don't mean SLI or CrossFire won't be good at 1600x1200 or lower; I am just saying that SLI or CrossFire shines at higher resolutions.)
So I don't recommend CrossFire for your resolution. Also:
A note about motherboards:
Some motherboards support multi-GPU technology at dual 16x mode, some support it at dual 8x mode, and some support it at 16x/4x mode. What are the differences?
Well, for getting the best performance out of multi-GPU, you need a motherboard that supports multi-GPU at dual 16x mode, like the Nvidia 780i SLI, Nvidia 680i (as I said, the Nvidia 780i and 680i support 3-way SLI too), Nvidia 590i SLI, Nvidia nForce4 SLI 16x, ATI CrossFire Xpress 3200, or Intel X38 chipsets.
Dual 8x is a very good configuration too; there isn't a lot of difference from dual 16x, and it's cheaper as well. Chipsets like the Nvidia 750i SLI, Nvidia 650i SLI, Nvidia 570i SLI, Nvidia nForce4 SLI, ATI Radeon Xpress 200, Intel 975X, and ONLY 2 Intel P35 boards (ASUS BLITZ FORMULA and ASUS BLITZ EXTREME) support multi-GPU at dual 8x mode.
The last one is 16x/4x. This one is the weakest and doesn't perform very well compared to dual 8x or dual 16x, so I won't recommend it for a multi-GPU configuration. Intel P35 and P965 chipsets support multi-GPU at 16x/4x mode.
Dual 8x vs 16x/4x (ASUS P5B Deluxe vs P5W-DH Deluxe):
http://www.anandtech.com/video/showdoc.aspx?i=2837
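The lane-width comparison above comes down to simple bandwidth arithmetic. Here is a rough sketch assuming PCIe 1.x lanes (about 250 MB/s per lane per direction after 8b/10b encoding; PCIe 2.0 slots double these figures, and real chipset routing varies):

```python
# Rough per-direction bandwidth for the three multi-GPU slot layouts.
# LANE_MBPS assumes PCIe 1.x signaling (250 MB/s per lane per direction).
LANE_MBPS = 250

configs = {
    "dual x16": (16, 16),
    "dual x8":  (8, 8),
    "x16 + x4": (16, 4),
}

for name, (a, b) in configs.items():
    # The narrower slot is what bottlenecks the second card.
    print(f"{name:>8}: {a * LANE_MBPS} + {b * LANE_MBPS} MB/s "
          f"(second card limited to {min(a, b) * LANE_MBPS} MB/s)")
```

This is why 16x/4x scales poorly: the second card gets only 1000 MB/s versus 2000 MB/s on a dual-8x board, while dual 8x sits close enough to dual 16x that most games of this era barely notice the difference.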
Farmer Brown
January 14, 2008 11:11:22 AM
http://multicore.amd.com/us-en/AMD-Multi-Core/Quad-Core...
Quote:
Scalable graphics you crave on your budget—up to four cards all running in ATI CrossFireX™ mode, with support for up to eight monitors
Well, I think when it says up to 4, it can support 3 cards too.
valorean
January 20, 2008 11:29:41 AM
Would using 2 cards that have more stream processors get more benefit from SLI and CrossFire than cards with fewer stream processors? For instance, say:
EVGA GeForce 8800 GT Video Card: 512MB DDR3, PCI Express 2.0, SLI-ready, (dual-link) dual DVI, HDTV, with 112 stream processors, for $259.99
vs.
VisionTek Radeon HD 3870 Video Card: 512MB GDDR4, PCI Express 2.0, CrossFireX-ready, dual DVI, HDTV, HDMI support, with 320 stream processors, for $249.99
I know they are 2 different technologies, but these are the only 2 cards I found in the same price range with a large difference in stream processors.
yay
January 20, 2008 11:49:42 AM
to valorean:
The stream processors in the 88xx series are different from the stream processors in the HD 38xx or HD 29xx series.
As you see in the benchmarks, the HD 3850 / HD 3870 / 2900 XT have 320 stream processors, while for example the 8800 GTX has 128 stream processors, and the 8800 GTX (with 128 stream processors) beats the 2900 XT (with 320 stream processors).
Here is a useful link:
http://www.hothardware.com/articles/ATI_Radeon_HD_2900_...
Quote:
The 320 individual stream processing units in R600 are arranged in 4 groups of 80 SIMD arrays and each functional unit is arranged as a 5-way superscalar shader processor. In contrast, NVIDIA's G80 has up to 8 groups of 16 (128 total) fully generalized, fully decoupled, scalar, stream processors, but keep in mind the SPs in G80 run in a separate domain and can be clocked as high as 1.5GHz. In ATI's R600, each functional SP unit can handle 5 scalar floating point MAD instructions per clock. And one of the five shader processors (the fatter one in the image above) can also handle transcendentals as well. In each shader processor, there is also a branch execution unit that handles flow control and conditional operations and a number of general purpose registers to store input data, temporary values, and output data.
to yay:
yes, thanks, I will edit it
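To put the "different architectures" point in raw numbers, here is a back-of-the-envelope peak-throughput sketch. The shader clocks used are the commonly quoted reference figures for these parts and are my assumption, not from the thread:

```python
# Peak MAD throughput sketch: why raw "stream processor" counts aren't
# comparable across R600 and G80. A MAD (multiply-add) counts as 2
# floating-point ops per clock.

def peak_gflops(shader_units, shader_clock_ghz, flops_per_unit=2):
    # units * clock (GHz) * ops-per-clock gives GFLOPS directly
    return shader_units * shader_clock_ghz * flops_per_unit

r600 = peak_gflops(320, 0.742)  # HD 2900 XT: SPs run at the ~742 MHz core clock
g80  = peak_gflops(128, 1.35)   # 8800 GTX: SPs run in a separate ~1.35 GHz domain

print(f"R600 (320 SPs): ~{r600:.0f} GFLOPS peak MAD")
print(f"G80 (128 SPs):  ~{g80:.0f} GFLOPS peak MAD")
```

The gap is far smaller than 320 vs 128 suggests, and peak FLOPS still ignores scheduling: R600's 5-wide units only hit peak when the compiler can pack all five slots, while G80's scalar SPs stay busy more easily, which is why it often wins in games despite the lower count.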
medjohnson77
January 23, 2008 5:15:22 AM
Thanks for the info posted on here. Emp has also given me advice on my 8600GT that I thought about putting in SLI, but I decided to RMA it to Newegg yesterday and will be getting a new card or cards.
I am debating a pair of MSI 8600GTS Diamond series cards in SLI. Newegg has them for $138 right now with a $20 rebate, and one of the main reasons I am really considering these cards is that they have an HDMI output. With my $100.00 credit from my 8600GT, that put the cost of the two at $280.00 - $40 in rebates - $100 = $140.00 for the SLI setup. Stream processors are only 32 on a single card, and I'm not sure how good that is. RAM is only 256MB DDR3; the clock is 700 MHz, I believe.
I am sure that Emp, if he responds, will tell me to get the 8800 GTX or Ultra, but I am not wanting to spend over $300 at this time. If I can find a good 8800-series card that can beat the MSI SLI setup for around $300 without my $100 credit, I think I can swing an extra $200.
Sorry for all the info, but any advice would be helpful, because I am sending my card back today and will need to make a choice very soon; my new custom build will be a paperweight until I get the new card here.
Your math is a bit off. It's $138 * 2 - $20 + shipping. Check the rebate info: only one per household/person. (Unless you are shipping one somewhere else, you'll only get one.) Not a big problem, but I thought I'd point that out.
8600GTS in SLI isn't enough to handle a 3850, AFAIK. The 3850 is pretty much the best sub-$200 card out there right now. The 256MB 8800GT isn't a bad idea either. If I remember correctly, the 3850 and the 256MB 8800GT are pretty close, so I'd try to get either of those.
Last, not meaning to be rude, but I'm not sure the SLI sticky is the correct place to post this. Posting back to your old thread, or PMing someone, would probably have been a better idea.
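The corrected rebate math can be sketched as follows, using the prices quoted in the thread (shipping and tax excluded), with the key assumption that only one $20 rebate is honored:

```python
# Sketch of the corrected SLI-pair cost; the one-rebate-per-household
# limit is the correction being made above.
card_price = 138.00
rebate = 20.00          # one rebate honored, NOT one per card
store_credit = 100.00   # credit from the RMA'd 8600GT

subtotal = card_price * 2               # two cards: $276.00
total = subtotal - rebate - store_credit
print(f"Out of pocket for the SLI pair: ${total:.2f} plus shipping")
```

That works out to $156 rather than the $140 figure from the earlier post, which assumed two rebates would be paid.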
goonting
January 30, 2008 4:17:17 PM
medjohnson77
February 6, 2008 11:32:31 AM
Oh SoS
February 8, 2008 6:33:03 PM
Oh SoS
February 8, 2008 9:53:28 PM
4745454b said:
You don't need to reboot to turn a second monitor on or off. From what a coworker says, you don't need to reboot to turn SLI on or off either. I don't know about the status of CF, though.
Quote:
nope, you do not need a restart with CF, even my ancient setup does not need that. all you need to do is disable it through the CCC. i have to do it with certain games, well CoH and Dawn of War only (damn Relic), and also if i want to rotate my display to portrait mode, as forums like Tom's display better, especially when they get changed and don't expand to fill the damn screen. (hint, hint, Tom's.)
I said "typically"... heh.
I haven't owned an ATI card in over a decade, but I know from personal experience that disabling/enabling SLI often requires a restart... it seems to be dependent on the amount of virtual memory & RAM in use.
And for someone who tends to leave the computer running with anti-virus going 24/7, rebooting (and clearing the virtual memory & RAM) is not necessarily a bad thing... just kind of annoying when I want to go from playing a game to watching a movie on my big screen.
hassa
February 8, 2008 10:36:08 PM
Oh SoS said:
Another thing to add: neither SLI nor CrossFire will support the use of multiple monitors when enabled!
This means you will have to disable the feature in order to get a multiple-monitor display (and enabling or disabling the feature typically requires a reboot).
CrossFireX will support more than one monitor in CrossFire mode (sweeeeeeeeeeeeeet)
First of all, again, thank you 4745454b and also strangestranger for helping me in this thread.
http://www.anandtech.com/video/showdoc.aspx?i=3151&p=4
Quote:
Aside from potential performance scalability, there is also the capability to support up to 8 monitors from one system with 4 graphics cards installed. While this isn't as universally desired, it could be something fun to play with. We don't currently have a platform solution that we can use to test this yet, but we will certainly test this when we are able.
klobnitrones
February 12, 2008 12:57:14 AM
klobnitrones
February 12, 2008 12:58:03 AM
http://www.hothardware.com/articles/R680_Has_Landed_ATI...
Also note that the 3870 X2 has only a single CrossFire edge connector along the top of its PCB. It has only one because the other connection is already utilized on the PCB. Although the Radeon HD 3870 X2 is equipped with a CrossFire connector, at this time drivers are not available that will allow end users to link two of these cards together for quad-GPU CrossFireX. Those drivers are coming though.
Oh SoS
February 22, 2008 9:12:42 PM
stoner133
March 24, 2008 8:54:47 PM
Well, I found an error right off the bat. You say that CrossFireX requires a motherboard with four PCIe slots. This is not true; my Gigabyte GA-MA790FX-DS5 supports CrossFireX with only two slots. How? Well, you can use two 3870 X2 cards on this board, and then you have CrossFireX.
There is also a DFI board with the 790FX chipset that supports CrossFireX, and it only has three PCIe slots.
circeseye
March 24, 2008 9:16:09 PM
A couple of things to fix in your post, where you described all the cards in each series:
1. The 3870 X2 is NOT 2 cards (like Nvidia's); it's 2 GPUs on one card.
2. You do not need a CrossFire Edition card with the 2xxx and 3xxx series cards; all cards in those series can CrossFire. If memory serves, I believe the 1950 didn't need it either, but the 1900 and down do.
Other than those little fixes, good post.
stoner133
March 24, 2008 9:33:56 PM
circeseye is correct: 1800 and 1900 series cards needed a CrossFire Master card, and it connected to the second card with a dongle cable between the two cards on the outside of the case. They did away with that with the 1950 cards.
The 3870 X2 sort of works like two 1900s, but instead it has the CrossFire chip between the two GPUs on the one card, and this chip is identical to the chip used on the 1900 CrossFire Master card.
kpo6969
March 25, 2008 7:10:52 AM
nukemaster said:
I have to agree with Maziar on this one... it IS 2 cards, since if it was just 2 GPUs on one card they would not need video RAM for each GPU, just one set of RAM for both... they are just CrossFire on a stick, so it's JUST like Nvidia.
1 card:
http://en.expreview.com/2007/12/16/r680-pcb-and-stock-c...
http://www.quantum-force.net/tutorials/T000000006/
I disagree with Maz and nuke. The 3870 X2 has only one PCB, hence it is one card. The 9800 GX2 has two PCBs, so it is two cards. You can argue against this all you want, but there is only one PCB.
In the grand scheme of things, who cares? Be aware that each uses SLI/CF, and each card has its own benefits/drawbacks. Whether it is one card or two doesn't really matter that much.