Will ATI DX10 Cards fit current Crossfire Motherboards?

nevynev

Distinguished
Jun 30, 2006
68
0
18,630
Hi there,

I've got an Asus A8R-MVP mobo with an X1800 XT at the moment. When ATI release their DX10 cards, do you think they'll fit this (and other) Crossfire mobos?

Also, a small side question: will they make Crossfire DX10 cards?

Thanks
NevyNev
 

Grated

Distinguished
Aug 26, 2006
57
0
18,630
Most probably the Crossfire mobos will work with the Crossfire R6** series ;).

And yes, there will be a Crossfire R6** series. However, not with master cards, but like the current X1950 Pro.

Greetz
 

JeanLuc

Distinguished
Oct 21, 2002
979
0
18,990
They should do. I've also read in Custom PC that ATI/AMD are doing away with the need for a 'master card', so the future looks bright for Crossfire.
 

Eurasianman

Distinguished
Jul 20, 2006
883
0
19,010
So I guess if they were to come out with new ATI DX10 cards, they'd still be running Crossfire at x8/x8?

Then again, I once read that graphics cards don't even utilize x16 fully...

Wasn't that the same issue with AGP 8x?

*sighs* Within the next year or two there will be PCI Express Gen 2... x.x

Do we really need this? :roll:
 

prozac26

Distinguished
May 9, 2005
2,808
0
20,780
Yes, they will work: they're PCI-E cards, and so is the mobo.

There will also be Crossfire for their next-gen cards. I think they'll work with current Crossfire mobos as well.
 

nevynev

Distinguished
Jun 30, 2006
68
0
18,630
Good! I thought I might have had to dish out loadsa cash for a new mobo etc. Will the new cards / DX10 have any requirements (e.g. a certain type of CPU? Mine is an Athlon X2 4400)?
 

Grated

Distinguished
Aug 26, 2006
57
0
18,630
No, they will not have any specific hardware requirements that you don't already meet now.

The only possible sticking point is the PSU. For everything else, no problem.
 
So I guess if they were to come out with new ATI DX10 cards, they'd still be running Crossfire at x8/x8?

They use both now, and will likely support both then too.
Crossfire, like SLI, can run at x16, x8, and even x4.
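For reference, a PCIe 1.x lane carries roughly 250 MB/s in each direction, so the link widths mentioned above can be compared with a quick back-of-the-envelope sketch (illustrative only; `pcie1_bandwidth_mb` is just a made-up helper name):

```python
# PCIe 1.x signals at 2.5 GT/s per lane with 8b/10b encoding, which
# works out to roughly 250 MB/s per lane in each direction.
def pcie1_bandwidth_mb(lanes):
    """Approximate one-direction bandwidth in MB/s for a PCIe 1.x link."""
    return lanes * 250

for lanes in (16, 8, 4):
    print(f"x{lanes}: {pcie1_bandwidth_mb(lanes)} MB/s per direction")
# x16: 4000, x8: 2000, x4: 1000
```

Which is why x8/x8 Crossfire still leaves each card with about 2 GB/s each way.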

*sighs* Within the next year or two there will be PCI Express Gen 2... x.x

Do we really need this? :roll:

Yes, because PCIe 2.0 will also add 75 watts across the bus, doubling the power delivered through the slot to 150W, meaning a GF8800 or R600 would only need one six-pin connector, not two.

Everyone thinks it's only about speed, but there's more to this stuff than just throughput.

If they had jumped to 150W through the slot from the start, the GF7 and X1K series cards would've been fine with no power connector at all. Now I doubt they can do 225-250W without some serious power-management issues on the mobo, especially with multiple cards.
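The connector arithmetic in the post above can be sketched out, taking the post's figures at face value (75W from a PCIe 1.x slot, 75W per six-pin connector, and a rumoured 150W slot for PCIe 2.0; `connectors_needed` is a hypothetical helper, not anything from a real spec document):

```python
# Rough PCIe graphics power budget: the slot supplies a fixed wattage,
# and each auxiliary 6-pin connector adds 75 W on top of that.
def connectors_needed(card_watts, slot_watts=75, connector_watts=75):
    """Smallest number of 6-pin connectors a card needs beyond slot power."""
    extra = max(0, card_watts - slot_watts)
    return -(-extra // connector_watts)  # ceiling division

# A ~140 W card on a 75 W slot needs one 6-pin; on a 150 W slot, none.
print(connectors_needed(140, slot_watts=75))   # 1
print(connectors_needed(140, slot_watts=150))  # 0
# A 225 W card would still need a connector even with a 150 W slot.
print(connectors_needed(225, slot_watts=150))  # 1
```

That matches the poster's point: a 150W slot removes a connector from cards in the GF8800/R600 class, but not from a hypothetical 225-250W monster.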
 

human_error

Distinguished
Dec 22, 2006
104
0
18,680
The OP's mobo has the Crossfire 3200 chipset, which can support x16 on each PCI-E slot in Crossfire, and I'd hope they'll support the R600 in Crossfire. I have the deluxe version of that mobo :lol:

The question I want answered is: will we have enough room in our cases for two R600s AND a PCI sound card? I hear the card may be a little "fat"...
 

human_error

Distinguished
Dec 22, 2006
104
0
18,680
I thought PCI Express Gen 2 was coming out in Q4 2007, along with HyperTransport 3.0.

If they're doing away with the need for a 'master card', what do you mean? No need for two cards? So 2 cards in 1?

Crossfire 2 is already out: instead of needing a master and a slave card connected with a dongle, you can just plug in two normal cards with the same GPU and run them in Crossfire. No dongle needed, no master card needed. Some of the X1650s and, I think, the X1950 Pros (?) use this.

You still need two cards for Crossfire - that's what Crossfire is. It's just that newer cards don't need a master/slave setup, just two normal cards (the ATI R600 GPU coming out soon will have this, I believe).
 

SuperFly03

Distinguished
Dec 2, 2004
2,514
0
20,790
Word. I just ignore awkward postings with stupid fonts.

I am waiting to upgrade to C2D until official (or at least credible) R600 vs. G80 benchmarks come out. I foresee getting a large display, so I want to set up the upgrade path (just in case; I may not even use it lol).
 

kugi

Distinguished
Nov 2, 2006
72
0
18,630
If they made the GPUs at 65nm instead of 90nm, that's a lot of power saved right there.
And if they used smaller-process memory too, that's even more power saved.
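A very rough feel for why a die shrink saves power: dynamic power scales roughly as capacitance times voltage squared times clock, and switched capacitance shrinks with the process. The sketch below makes the simplifying (and definitely hedge-worthy) assumption that capacitance scales linearly with feature size and ignores leakage entirely; `shrink_power_ratio` is just an illustrative name:

```python
# Crude dynamic-power model for a die shrink: P ~ C * V^2 * f,
# assuming switched capacitance scales linearly with feature size.
# Leakage current, which matters a lot at these nodes, is ignored.
def shrink_power_ratio(old_nm, new_nm, old_v=1.0, new_v=1.0, freq_ratio=1.0):
    """New dynamic power as a fraction of old, same design shrunk."""
    cap_ratio = new_nm / old_nm
    return cap_ratio * (new_v / old_v) ** 2 * freq_ratio

# 90 nm -> 65 nm at the same voltage and clock: ~28% less dynamic power.
print(round(shrink_power_ratio(90, 65), 2))  # 0.72
```

Dropping the core voltage even slightly at the new node would save more still, since voltage enters squared.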
 

SuperFly03

Distinguished
Dec 2, 2004
2,514
0
20,790
If they made the GPUs at 65nm instead of 90nm, that's a lot of power saved right there.
And if they used smaller-process memory too, that's even more power saved.

I thought ATI's new GPUs were moving to an 80nm production process, not 65nm. Am I misinformed?
 

dsidious

Distinguished
Dec 9, 2006
285
0
18,780
Hmmmm, I finally see a good reason why AMD and ATI bothered to merge... Maybe AMD can help ATI move to 65nm and get ahead of Nvidia for a while. Guys, would it be a good thing if Intel bought Nvidia, or not? What do you think?
 

SuperFly03

Distinguished
Dec 2, 2004
2,514
0
20,790
I honestly don't think Intel will; there'd be too much cannibalization of their own sales. Intel already makes a worthwhile IGP, and purchasing nVidia would introduce significant overlap in a very high-volume segment of the market. While they could integrate the two, I am just not sure it is a great idea.

I think because Intel is in the middle of trimming excess fat after their two-year hiring spree, the cost and headaches of an acquisition would be more than they need right now. A joint venture may crop up in an effort to counter AMD's Fusion, but that is about all I would expect.

It is also possible that Intel could do it entirely solo, since they already make IGPs and CPUs at their own facilities. They don't technically need nVidia to counter Fusion, but they may lean on them for GPU expertise in the discrete arena (who knows why lol).