
Will ATI DX10 Cards fit current Crossfire Motherboards?

January 5, 2007 10:03:08 AM

Hi there,

I've got an Asus A8R-MVP mobo with an X1800 XT currently. Whenever ATI release their DX10 cards, do you think they'll fit in this (and other) crossfire mobos?

Also small side question - will they make crossfire DX10 cards?

Thanks
NevyNev
January 5, 2007 10:25:00 AM

Most probably the crossfire mobos will work with the crossfire R6** series ;).

And yes, there will be Crossfire R6** series cards. However, not with master cards, but like the current X1950 Pro.

Greetz
January 5, 2007 10:37:24 AM

They should do. I've also read in Custom PC that ATI/AMD are doing away with the need for a 'master card', so the future looks bright for crossfire.
January 5, 2007 10:56:38 AM

CF gives me a good excuse to get a 30" widescreen monitor :wink:
January 5, 2007 12:12:50 PM

So, I guess if they were to come out with new ATI DX 10 cards, they still would be using the crossfire in x8 x8???

Then again, I once read that graphics cards don't even utilize x16 all the way...

Wasn't that the same issue for AGP 8x???

*sighs* Within the next year or two... there will be PCI Express Gen 2... x.x

Do we really need this? :roll:
January 5, 2007 12:40:31 PM

Yes, they will work: they're PCI-E, and so is the mobo.

There will also be Crossfire for their next-gen cards. I think they'll work with current Crossfire mobos as well.
January 5, 2007 12:53:36 PM

Good! I thought I might have had to dish out loadsa cash for a new mobo etc. Will the new cards / DX10 have any requirements (e.g. a certain type of CPU? Mine is an Athlon X2 4400)?
January 5, 2007 1:28:01 PM

No, they won't have any specific hardware requirements that you don't already meet now.

The only thing that might need upgrading is the PSU. Everything else should be no problem.
January 5, 2007 2:10:50 PM

Quote:
So, I guess if they were to come out with new ATI DX 10 cards, they still would be using the crossfire in x8 x8???


They use both now, and likely will support both then too.
Xfire like SLi can be 16x, 8x, and even 4x.

Quote:
*sighs* Within the next year or two... there will be PCI Express Gen 2... x.x

Do we really need this? :roll:


Yes, because PCIe 2.0 will also add 75 watts across the bus, doubling the power delivered through the slot to 150W, meaning a GF8800 or R600 would only need one six-pin connector, not two.

Everyone thinks it's only about speed, but there's more to this stuff than just throughput.

If they had jumped to 150W of slot power from the start, the GF7 and X1K series cards would've been fine with no power connector at all; now I doubt they can do 225-250W without some serious power management issues on the mobo, especially with multiple cards.
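Rough back-of-the-envelope numbers, just to illustrate the point (the slot limits are the ones above, 75W per 6-pin connector is the usual spec, and the card wattages are my guesses):

```python
# Power a card can draw from each source, in watts. The slot figures follow
# the PCIe 1.x (75W) and PCIe 2.0 (150W) limits discussed above; 75W per
# 6-pin connector is the usual spec. The card wattages below are guesses.
PCIE_1_SLOT = 75
PCIE_2_SLOT = 150
SIX_PIN = 75

def connectors_needed(card_watts, slot_watts, connector_watts=SIX_PIN):
    """How many 6-pin connectors a card needs on top of slot power."""
    extra = max(0, card_watts - slot_watts)
    return -(-extra // connector_watts)  # round up to whole connectors

for card, watts in [("GF8800-class", 180), ("R600-class", 225)]:
    print(f"{card} ({watts}W): "
          f"{connectors_needed(watts, PCIE_1_SLOT)} x 6-pin on PCIe 1.x, "
          f"{connectors_needed(watts, PCIE_2_SLOT)} x 6-pin on PCIe 2.0")
```

Either way you end up with two 6-pin plugs on today's slots and only one once the slot itself can feed 150W.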
January 5, 2007 2:12:17 PM

Thank you for the clarification.
January 5, 2007 4:05:35 PM

The OP's mobo has the Crossfire 3200 chipset, which can run each PCI-E slot at 16x in crossfire, and I'd hope they'll support the R600 in crossfire; I have the deluxe version of that mobo :lol:

The question I want answered is: will we have enough room in our cases for two R600s AND a PCI sound card?? I hear the card may be a little "fat"...
January 5, 2007 6:01:02 PM

Quote:
I thought PCI Express Gen 2 was coming out in Q4 2007, along with HyperTransport 3.0.

If they're doing away with the need for a 'master card', do you mean there's no need for two cards? So 2 cards in 1?


Crossfire 2 is already out - instead of needing a master and a slave card connected with a dongle, you can just plug in 2 normal cards with the same GPU and run them in crossfire - no dongle needed, no master card needed. Some of the X1650s and I think the X1950 Pros (?) use this.

You still need 2 cards for crossfire - that's what crossfire is - it's just that newer cards don't need a master/slave setup, just 2 normal cards (the ATI R600 GPU coming out soon will have this, I believe).
January 5, 2007 8:53:40 PM

Please use normal font...
January 6, 2007 4:10:59 AM

word. I just ignore awkward postings with stupid fonts.

I am waiting to upgrade to C2D until the official (or at least credible) R600 vs. G80 benchmarks come out. I foresee getting a large display, so I want to set up the upgrade path (just in case, may not even use it lol).
January 7, 2007 8:49:33 AM

If they made the GPUs 65nm instead of 90nm, that's a lot of power saved right there.
And if they used smaller memory too, that's even more power saved.
January 7, 2007 9:49:37 AM

It's not as simple as you put it.

If it were, everything would be 65nm the moment it came out!
January 7, 2007 4:19:58 PM

Quote:
If they made the GPUs 65nm instead of 90nm, that's a lot of power saved right there.
And if they used smaller memory too, that's even more power saved.


I thought ATI's new GPUs were moving to an 80nm production process, not 65nm. Am I misinformed?
January 7, 2007 4:45:50 PM

Ok, cool. Thanks. I was just making sure I had things straight in my head... which doesn't always work lol.
January 8, 2007 11:39:20 PM

Hmmmm, I finally see a good reason why AMD and ATI bothered to merge... Maybe AMD can help ATI move to 65nm and get ahead of Nvidia for a while. Guys, would it be a good thing if Intel bought Nvidia, or not? What do you think?
January 8, 2007 11:45:04 PM

I honestly don't think Intel will - too much cannibalization of their own sales. Intel already makes a worthwhile IGP, and purchasing nVidia would introduce significant overlap in a very high-volume segment of the market. While they could integrate the two, I'm just not sure it's a great idea. Because Intel is in the middle of trimming excess fat after their two-year hiring spree, the cost and headaches of an acquisition would be more than they need right now.

A joint venture may crop up in an effort to tackle and counter AMD's Fusion, but that's about all I would expect. It's also possible that Intel could do it entirely solo, since they already make IGPs and CPUs on their own at their own facilities. They don't technically need nVidia to counter Fusion, but may lean on them for their GPU expertise in the discrete arena (who knows why lol).