R600 Geared towards DX-10

April 30, 2007 5:48:57 PM

http://www.fudzilla.com/index.php?option=com_content&ta...

Pretty much what I thought... people want to call the R600 a bust based on one site's tests, which are dubious and have no DX10 scores in Vista.

If you buy video cards only about once every 2 years, as I do... you can see how important it is to know that the card you buy today will have some power to run the DX10 games later...


April 30, 2007 6:14:18 PM

Anyone calling the R600 a bust over the DT benchmarks and rumors is silly IMO. But really IMO DX9 performance in today's games is probably more important than tomorrow's DX10 games in Vista. I see your point, but the games that will convince us to go DX10/Vista are not here, and there will be GPU refreshes out by the time they are. Still, if the R600XT offers better DX9 performance now than the 8800GTS 640MB it could still be a good buy. If it then offers better DX10 performance than the 8800GTX it would end up being a very good buy. But it has to compete price/performance-wise right now or it's not going to earn much appeal.
April 30, 2007 6:37:31 PM

Quote:
Anyone calling the R600 a bust over the DT benchmarks and rumors is silly IMO. But really IMO DX9 performance in today's games is probably more important than tomorrow's DX10 games in Vista. I see your point, but the games that will convince us to go DX10/Vista are not here, and there will be GPU refreshes out by the time they are. Still, if the R600XT offers better DX9 performance now than the 8800GTS 640MB it could still be a good buy. If it then offers better DX10 performance than the 8800GTX it would end up being a very good buy. But it has to compete price/performance-wise right now or it's not going to earn much appeal.


Myself... I will wait for the refresh... the R650 or G90.

But still... DX10 performance is a factor... imagine the person who gets an 8800GTX thinking he will get top performance in Crysis... based on DX9 performance... and then the R600 core lays the smack down on it.

Because we don't know... Crysis may not be playable at top res with full eye candy on an 8800GTX but smooth on a 2900XT.
April 30, 2007 7:14:14 PM

I think I'm going to get the 2900 XT if it really is as low as the GTS in price; I don't expect its price to drop that much even when the XTX comes out. Hopefully it will really shine in DirectX 10.
April 30, 2007 7:19:12 PM

I'm not completely sure about that smackdown, because in their statement to the press, "The 2900 XT won't be able to compete with the 8800 GTX," the end of that sentence would have gone, "right now, until DX10 comes out and we give it a 'smack down'."
MO
April 30, 2007 7:28:30 PM

Fudzilla is saying the 2600s are delayed till June because the A12 samples have various issues.
April 30, 2007 7:42:15 PM

Quote:
Anyone calling the R600 a bust over the DT benchmarks and rumors is silly IMO. But really IMO DX9 performance in today's games is probably more important than tomorrow's DX10 games in Vista. I see your point, but the games that will convince us to go DX10/Vista are not here, and there will be GPU refreshes out by the time they are. Still, if the R600XT offers better DX9 performance now than the 8800GTS 640MB it could still be a good buy. If it then offers better DX10 performance than the 8800GTX it would end up being a very good buy. But it has to compete price/performance-wise right now or it's not going to earn much appeal.


I'd agree: even if the preliminary DT benches prove to be entirely accurate, R600 still performs between an 8800 GTS and 8800 GTX. AMD/ATI owns the $375-$500 price segment in this case (I expect the 2900XT will hover around $425-450, at least until late July). I don't really care if R600 performs better in DX10 since I'll be upgrading by the time the games become prevalent. Nonetheless, it's still pretty disappointing to see ATI release a card 6 months late that does not outperform G80.
April 30, 2007 8:02:08 PM

Quote:


But still... DX10 performance is a factor... imagine the person who gets an 8800GTX thinking he will get top performance in Crysis... based on DX9 performance... and then the R600 core lays the smack down on it.


Personally, I knew when buying the 8800GTX that DX10 performance was an unknown. I bought it only because it stepped up as a good card for everything I play. I'm guessing this is the common logic. Had the card been released as a so-so performer on current titles with only the promise of DX10 performance, then you could laugh at us for buying it.

My long-term plan is simple: if it sucks at DX10, I throw it in one of my older computers and buy again.
April 30, 2007 9:12:08 PM

Quote:
But really IMO DX9 performance in today's games is probably more important than tomorrow's DX10 games in Vista.


True. As long as there isn't a way to express the benefit, there essentially is no benefit to A>B.

That's one of the reasons I was thinking there was no rush on the R600, with AMD hoping by now something would come along to show even the supposedly relatively weak HD2900XT as having a discernible benefit over the GTX. They may need to wait until 1-2 months into sales for that, which kinda sucks for AMD because it means fewer cards sold close to the launch premium price, and potentially moving what would've been easy to sell as a $500 card at $400 instead.

Quote:
I see your point, but the games that will convince us to go DX10/Vista are not here, and there will be GPU refreshes out by the time they are.


Well that depends, FSX should hit with SP1 before the end of June (unless there's another beta bug found [3rd Beta now]), and the Crysis demo is rumoured June/July, so that would be before a refresh from either, but maybe not enough to be 'conclusive' of anything for anyone.

Quote:
Still, if the R600XT offers better DX9 performance now than the 8800GTS 640MB it could still be a good buy. If it then offers better DX10 performance than the 8800GTX it would end up being a very good buy. But it has to compete price/performance-wise right now or it's not going to earn much appeal.


Exactly, just look at the GF8600 series. And while I say it's overpriced, I think I look at it more knowledgeably than many consumers, and I still give it credit for DX10, just not too much or too little IMO.

I suspect that by the time we know which architecture is better, the refresh will focus on fixing some of the flaws in the lesser one.
April 30, 2007 9:18:17 PM

Quote:
Fudzilla is saying the 2600s are delayed till June because the A12 samples have various issues.


Don't see how that's relevant to the discussion.
Care to explain? :?:
April 30, 2007 9:28:03 PM

I'm also waiting for a refresh (R700, G90).
The road to DX10 and Vista 64 still seems rocky;
I expect those cards to have much higher performance and everything ironed out.

I was really hoping the 2900XT would be the next 9700 Pro. Just one more year :roll:
April 30, 2007 9:44:15 PM

Quote:
True. As long as there isn't a way to express the benefit, there essentially is no benefit to A>B.

Yeah, and that's exactly the way I meant my comment to be taken. If it were September and we had a few big DX10 titles out, then Vista/DX10 performance would IMO be worth more than DX9 Win XP benchies.

As far as FSX, I guess it just doesn't appeal to me. But a Crysis demo that early is news to me. (sweet, it's good news too).
April 30, 2007 9:47:03 PM

Quote:
I'm not completely sure about that smackdown, because in their statement to the press, "The 2900 XT won't be able to compete with the 8800 GTX," the end of that sentence would have gone, "right now, until DX10 comes out and we give it a 'smack down'."
MO


Uh, yeah... hate to break it to you fans, but the 8800GTX is geared for DX10 also... it just happens to do DX9 better than ATI. Its whole architecture was designed for DX10, just like the R600...
April 30, 2007 10:04:22 PM

It's only a matter of time till people realize that ATI is not full of s*** and can produce a video card capable of competing directly with NVIDIA. And although DX9 performance is somewhat of a disappointment, we really need DX10 games and benchies to truly put both cards through their paces. Then and only then will we have a true comparison. That, my friends, is why I am waiting a few more months.

Dahak

AMD X2-4400+@2.6 TOLEDO
EVGA NF4 SLI MB
2X EVGA 7950GT KO IN SLI
4X 512MB CRUCIAL BALLISTIX DDR500
WD300GIG HD/SAMSUNG 250GIG HD
ACER 22IN WIDESCREEN LCD 1600X1200
THERMALTAKE TOUGHPOWER 850WATT PSU
COOLERMASTER MINI R120
3DMARK05 13,471
April 30, 2007 10:21:04 PM

Quote:
bruce555 wrote:
I'm not completely sure about that smackdown, because in their statement to the press, "The 2900 XT won't be able to compete with the 8800 GTX," the end of that sentence would have gone, "right now, until DX10 comes out and we give it a 'smack down'."
MO



Uh, yeah... hate to break it to you fans, but the 8800GTX is geared for DX10 also... it just happens to do DX9 better than ATI. Its whole architecture was designed for DX10, just like the R600...



I wasn't saying that the 8800 GTX sucked; obviously, look at my investment. All I was getting at in my statement was that ATI doesn't seem to have as much confidence as all the ATI fans that the 2900 XT will destroy in DX10.
April 30, 2007 10:59:02 PM

Quote:
I'm also waiting for a refresh (R700, G90).
The road to DX10 and Vista 64 still seems rocky;
I expect those cards to have much higher performance and everything ironed out.

I was really hoping the 2900XT would be the next 9700 Pro. Just one more year :roll:


And I have been waiting for 2 years already. :oops:
Been busy though... so it's all good. :wink:
April 30, 2007 11:34:01 PM

It's looking like the initial benches at launch won't do ATI's latest efforts much justice in future-proofing terms, and even if we could peer into the future of DirectX gaming, it's not too likely the 2900XT will surpass the 8800GTX; however, the margins look to be mitigated by fall.
April 30, 2007 11:43:46 PM

Quote:
Myself... I will wait for the refresh... the R650 or G90.


I'm thinking about waiting for the refresh too, unless they actually make a 256-bit 8600. I might actually take a gamble on that card.

Edit: I first wrote 256MB 8600, don't ask me what I was thinking....
May 1, 2007 12:05:14 AM

Quote:

Uh, yeah... hate to break it to you fans, but the 8800GTX is geared for DX10 also... it just happens to do DX9 better than ATI. Its whole architecture was designed for DX10, just like the R600...


We still don't know that yet, no more than you 'know' whether or not nV is better than AMD at DX9.

If we go by rumour and previews alone then the GTX is much faster at DX9, and the XTX is much faster at DX10. Oh, and the GTX's geometry shaders are so weak as to make one of the FSX devs comment on how they will have to consider that weakness when making the DX10 SP1 version.

Personally I'll wait to see the final shipping product before determining what is what. I think that's usually the best plan, especially considering the number of underwhelming products lately.
May 1, 2007 12:07:48 AM

Quote:

I'm thinking about waiting for the refresh too, unless they actually make a 256-bit 8600. I might actually take a gamble on that card.


IMO, if the HD2600 series or HD2900XL don't impress you for the money, then the best bet would be the GF8800GTS-320; you're unlikely to get such a good deal near-term, and long-term you're just missing out on a good card, which you could easily sell and upgrade from later.
May 1, 2007 12:37:36 AM

Quote:
If we go by rumour and previews alone then the GTX is much faster at DX9, and the XTX is much faster at DX10. Oh, and the GTX's geometry shaders are so weak as to make one of the FSX devs comment on how they will have to consider that weakness when making the DX10 SP1 version.
Can you link to that blog?
May 1, 2007 1:12:17 AM

http://blogs.msdn.com/ptaylor/archive/2007/03/03/optimi...

"We really want to see final ATI hw and production quality NV and ATI drivers before we ship our DX10 support. Early tests on ATI hw show their geometry shader unit is much more performant than the GS unit on the NV hw. That could influence our feature plan."

So if you go by rumours alone, anyone who says the G80 owns in DX9 can equally understand that if M$ needs to 'rethink their feature plan to accommodate the G80's underperforming geometry shaders', it may not be up to the task of one of the most important parts of DX10, and consider what that means for who owns DX10.

Like I said, personally I'll wait until it matters, and until we have something more solid to go on, before I declare winners/losers.
May 1, 2007 3:23:51 AM

Nice find... maybe the R600 was developed with just DX10 in mind... and the G80 (though it is a DX10 architecture) with more of DX9 in mind...
May 1, 2007 3:57:50 AM

And why not? nVidia was smart enough that they knew if they could get something out there that was at least DX10 compatible, they could ride the DX10 wave, knowing full well they would have more than enough time before more than 1-2 DX10 games were even out.
On top of all that, it just seemed to make sense that since it was DX10 compatible it would be an amazing DX9 card. Well, what actually could hold true is that it is a fantastic DX9 card but a slightly worse DX10 card. ATi on the other hand, typically slow with their releases, was not able to react as fast as nV; because of this, they've been developing a very true DX10 card, so they keep delaying their new card, because it underperforms in DX9, until we can get a very real DX10 benchmark, because that is where this card should shine. Fortunately for nV, they've had the time to watch the 8800GTX/s absolutely dominate the high-end segment, and while all this is happening they are developing a more "true" DX10 card.
May 1, 2007 4:00:59 AM

You want to ground that statement?
May 1, 2007 4:54:28 AM

Quote:
You want to ground that statement?


He said clearly "I think", which means he thinks, and thus it's speculation.
Does someone who's offering an opinion, or just an idea of the whys, need "ground"? o_O
May 1, 2007 5:56:22 AM

Quote:
And why not? nVidia was smart enough that they knew if they could get something out there that was at least DX10 compatible, they could ride the DX10 wave, knowing full well they would have more than enough time before more than 1-2 DX10 games were even out.


That's fine if true for the G80, but for the GF8600 it would be a death blow IMO. The main thing is, if the GF8600 had launched early that thinking would be fine too, but any weakness like this would lose one of the few benefits the GF8600 needs to compete with last generation's values.

It's better than what we originally expected in a hybrid G80, but it does make the GF8600 more of the 'FX' of this generation than the R600, as is often mentioned.

Quote:
ATi on the other hand, typically slow with their releases, was not able to react as fast as nV; because of this, they've been developing a very true DX10 card,


The R600 is designed very similarly to the R500/Xenos found in the Xbox360, so it's not like ATi suddenly decided to make a DX10/unified part after the G80 came out; the DX10 compliance level and PS/VS/GS support had pretty much been set in stone for months and months before the G80 came out.

Quote:
Fortunately for nV, they've had the time to watch the 8800GTX/s absolutely dominate the high-end segment, and while all this is happening they are developing a more "true" DX10 card.


When is this mythic card coming out? No time soon.

The GF8800 Ultra is the same design, and it'll be up to the G90 to improve on things when it launches in fall/winter; but the question becomes whether they knew it was going to be weak in geometry shaders far enough back to plan out an alternate route all along. Whatever they have ready to tape out in just a few weeks for the next round is planned; there's little chance they're making a last-minute change unless the G90 gets pushed into 2008.

So either they knew the weakness in advance and accepted it as a good interim part for the end of DX9 and the beginning of DX10, with a strategy to change the focus and refine regardless of what R600 appears (this would be similar to the GF7900, which did more than just refine the GF7800); or they thought the stream strategy would be powerful enough, and with stuff like their early FPS creator work they located the weaknesses and planned the replacement early into the G80's life.

Of course it's still early to make any judgments without any good tests and the two architectures to compare.
May 1, 2007 6:07:26 AM

Quote:
I was really hoping the 2900XT would be the next 9700 Pro. Just one more year :roll:

It's looking more and more like a new FX5800; wouldn't be surprised if it is...
May 1, 2007 7:06:23 AM

For me, the DX10 performance is far more important than DX9 performance. I couldn't care less if your $550.00 video card does 200 FPS in Doom 3 while my $550.00 video card does 190 FPS in Doom 3. Both cards will do fine in just about all DX9 applications, to the point that you won't notice the difference if you turn off the frame rate counter. Also, I'm not really building my current computer to play older games. Except for FSX, and maybe CIV IV, I usually finish a game and all the levels, and then put it away. This is especially true of first person shooters (the usual DX9 benchmark). So far, the 2900XT looks a little better in DX9 games and maybe a whole lot better in DX10 than the GTS. It seems like an attractive option. However, if the 2900 XTX were available and affordable when I was choosing the graphics card for my new build, and those DT benchmarks were accurate, but I had some confidence DX10 would favor the 2900 XTX, I would still choose the 2900 XTX.

Rob
May 2, 2007 8:32:37 PM

The new mythic card doesn't have to come out anytime soon, as there are going to be but 10 or fewer DX10 games out before nV will be looking at introducing a new graphics card.

Simply put, all I am saying is that it seems nV took a very easy approach to boost DX9 performance while satisfying DX10 capability so they could cash in on the DX10 'wave', and "hopefully" they were being a little more sly than we know and were planning a much more DX10-focused card sooner rather than later (at the rate each new generation goes before ATi's next gen).

Also, regarding the 8600s, I've heard they had some core improvements; also, is it not their memory bus that is more the limiting factor rather than the core? Correct me if I'm wrong, but isn't the core of the 8600s running very fast (I see the GTS runs between 600MHz-720MHz), which is faster than the 8800GTX? Not sure how this would affect performance, but it could mean the 8600GTS offers a better performance gain in DX10 over DX9 than the 8800GTX.
Anonymous
May 2, 2007 9:32:49 PM

R600 is a power-sucking monster compared to the 8800GTX; this is its biggest con. And performance might be 10% more than the 8800GTX, but power consumption is way more than that, which really blows.
May 2, 2007 10:06:03 PM

Quote:
R600 is a power-sucking monster compared to the 8800GTX; this is its biggest con. And performance might be 10% more than the 8800GTX, but power consumption is way more than that, which really blows.


Got Links?
May 2, 2007 10:17:36 PM

Quote:
R600 is a power-sucking monster compared to the 8800GTX; this is its biggest con. And performance might be 10% more than the 8800GTX, but power consumption is way more than that, which really blows.


Got Links?

A lot of sites today are quoting this chart. It should not be trusted as it is provided by Nvidia.

While we are asking for links though, can I ask for links for this:
Quote:
Simply put, all I am saying is that it seems nV took a very easy approach to boost DX9 performance while satisfying DX10 capability so they could cash in on the DX10 'wave', and "hopefully" they were being a little more sly than we know and were planning a much more DX10-focused card sooner rather than later.


And this:

Quote:
Not sure how this would affect performance, but it could mean the 8600GTS offers a better performance gain in DX10 over DX9 than the 8800GTX.
Anonymous
May 2, 2007 10:24:41 PM

I totally agree with you, but the difference is just too high; even if you knock out 30 watts, it still sucks way more power, which a lot of websites say ATI is famous for. It's pretty simple: the R600 needs one 8-pin and one 6-pin power connector compared to the 8800 GTX's two 6-pin power connectors, so no matter what, power consumption will be higher.
May 2, 2007 10:40:13 PM

Quote:
The new mythic card doesn't have to come out anytime soon, as there are going to be but 10 or fewer DX10 games out before nV will be looking at introducing a new graphics card.


Yes, but many of those 10 will be the major titles of 2007, like Crysis and UT2K7. To say they don't matter would be like saying it didn't matter that the FX series couldn't play HL2 well enough. Many of the people buying cards now are looking at holding them for the long term, like those who are only now upgrading their R9700/9800 or GF6800. Not everyone is on a 12-month renewal strategy, and for those who are it won't matter, because if the R600 is the better card in Crysis then they'll sell their G80 and buy an R600, and vice versa.

Quote:
Also, regarding the 8600s, I've heard they had some core improvements;


nV bumped up the texture address units to a 1:1 ratio from 1:2, whereas the R600 is 2:1. The effect of this is unknown so far, but the direction it went is interesting.

Quote:
also, is it not their memory bus that is more the limiting factor rather than the core?


It's both; the memory is just the most glaring issue that people had commented on long before the part was finalized. However, the shader core is still much weaker.

Quote:
Correct me if I'm wrong, but isn't the core of the 8600s running very fast (I see the GTS runs between 600MHz-720MHz), which is faster than the 8800GTX?


About 10-20% faster, but with 25%-33% of the Stream procs.

Quote:
Not sure how this would affect performance, but it could mean the 8600GTS offers a better performance gain in DX10 over DX9 than the 8800GTX.


Nah, the impact will be negligible; the statements being bandied about on the DX10 weakness side involve whole factors, not minor percentages. The overall performance difference should amount to percentages, since a game isn't going to be made up of those singular weaknesses.
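
For a rough sense of scale, here's a back-of-envelope sketch of why the clock bump can't offset the unit count. The figures are the approximate ones quoted in this thread (~575MHz core/128 stream processors for the 8800GTX, up to ~720MHz/32 stream processors for the 8600GTS); treat them as assumptions for illustration, and note that the real G8x shader-domain clocks differ from the core clocks used here.

```python
# Back-of-envelope shader throughput: clock x stream-processor count.
# Figures are the rough ones quoted in this thread, used as assumptions;
# real G8x shader-domain clocks differ from the core clocks shown here.

def relative_throughput(clock_mhz, stream_procs):
    return clock_mhz * stream_procs

gtx = relative_throughput(575, 128)  # 8800 GTX (assumed core clock)
gts = relative_throughput(720, 32)   # 8600 GTS at its highest quoted clock

print(f"8600 GTS vs 8800 GTX: {gts / gtx:.0%}")  # ~31%, despite the faster clock
```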
Anonymous
May 2, 2007 10:46:23 PM

Nvidia and ATI will need to learn from AMD and Intel and start controlling power consumption soon, like they both did.
May 2, 2007 10:49:28 PM

Quote:
here's a link
R600 needs one 8-pin and one 6-pin power connector compared to the 8800GTX, which needs two 6-pins; the rest you can figure out.


If that's all I need in order to figure it out, then that means the G80 draws 225W max and the R600 250W max, since the calculation is 75W for the PCIe slot, then 75W per 6-pin, and 100W per 8-pin.

If that's all we need to figure out, then why does Xbit see something different, and why do nV's own numbers say something different?

And considering that the 8-pin is said to be optional for overclocking, or optional for a PCIe 2.0 board, we still don't know what the 8+6 pin setup is for, nor even if it needs that much juice, especially in retail where the fan is 20W leaner than the OEM fan always pictured.

Until it's properly tested there won't be solid numbers on it.

Considering that at one point the G80 was said to be 300W, I don't trust any of the pre-retail BS.
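
To make that arithmetic explicit, here's a minimal sketch of the connector-budget math, using the per-connector wattages assumed above (75W slot, 75W per 6-pin, 100W per 8-pin); these are the thread's working figures, not verified spec numbers.

```python
# Back-of-envelope PCIe power budget, using the per-connector wattages
# assumed in this thread (75W slot, 75W per 6-pin, 100W per 8-pin).
# These are working figures from the discussion, not verified spec numbers.

BUDGET_W = {"slot": 75, "6pin": 75, "8pin": 100}

def max_board_power(aux_connectors):
    """Upper bound on board power: the PCIe slot plus each aux connector."""
    return BUDGET_W["slot"] + sum(BUDGET_W[c] for c in aux_connectors)

print(max_board_power(["6pin", "6pin"]))  # 8800 GTX: 225 W
print(max_board_power(["6pin", "8pin"]))  # R600: 250 W
```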

Quote:
it still sucks way more power, which a lot of websites say ATI is famous for.


Both the FX5xxx series and the GF6800 series drew more power than their ATi counterparts; nV has one efficient high-end line and suddenly it's only ATi that's famous for power consumption? Whatever! :roll:

The other thing is, while the X1900 may have consumed more power, it was also a better performer, and when it comes down to it in the high end, that's all that will matter.
May 2, 2007 10:53:52 PM

Quote:
R600 needs one 8-pin and one 6-pin power connector compared to the 8800GTX, which needs two 6-pins; the rest you can figure out.


From the article at LegitReviews:
Quote:

The ATI HD 2900 XT has an 8-pin and 6-pin PCIe power header, but only a pair of 6-pins need to be used for normal operation. The 8-pin only needs to be used when overclocking, so right out of the box both of the latest and greatest graphics cards will require a pair of 6-pin connectors


By the way, anybody find it amusing that article says the 8800GTX and Ultra each contain 681 transistors? I'm thinking that's missing the (in millions) note.
Anonymous
May 2, 2007 11:44:49 PM

That's the biggest sweet lie I ever heard, lol, that it only needs the 8-pin just for overclocking. If it needs extra power for overclocking, a 6-pin would have been fine; overclocking won't need 100 watts. By the way, tell me where you read that, give me a link?
May 2, 2007 11:53:56 PM

Quote:
R600 is a power-sucking monster compared to the 8800GTX; this is its biggest con. And performance might be 10% more than the 8800GTX, but power consumption is way more than that, which really blows.


Got Links?

A lot of sites today are quoting this chart. It should not be trusted as it is provided by Nvidia.

While we are asking for links though, can I ask for links for this:
Quote:
Simply put, all I am saying is that it seems nV took a very easy approach to boost DX9 performance while satisfying DX10 capability so they could cash in on the DX10 'wave', and "hopefully" they were being a little more sly than we know and were planning a much more DX10-focused card sooner rather than later.


And this:

Quote:
Not sure how this would affect performance, but it could mean the 8600GTS offers a better performance gain in DX10 over DX9 than the 8800GTX.


??? Why do you need links to a theoretical statement? I was simply pondering and saying it "seems" that this is what they've done, based on some speculation. I never said that was the exact case... duh.
Anonymous
May 3, 2007 12:02:55 AM

No offence buddy, but we were talking about facts, not fictions, here. But let's just wait for May 14th and see; just a few more days left, then we'll talk again. Till then, nice talking to you guys. Good discussion.
May 3, 2007 12:10:03 AM

Quote:
That's the biggest sweet lie I ever heard, lol, that it only needs the 8-pin just for overclocking. If it needs extra power for overclocking, a 6-pin would have been fine; overclocking won't need 100 watts. By the way, tell me where you read that, give me a link?


No, but the difference in wattage between a 6-pin and an 8-pin, as pointed out by TheGreatGrapeApe, is only 25W... so yeah, potentially an overclock could easily come much closer to the limited power output of two 6-pin connectors vs. one 6-pin and one 8-pin.

Also, this is typical in low-end cards that are very near the power limit for the PCIe slot. Usually they will have a 6-pin connector but don't even need it for operation; it's only recommended in case of overclocking or if your card might be unstable. Typically though, manufacturers are always on the safe side and take the worst possible scenario, so they make sure that the power requirements are well under the amount of power that has to be delivered.
May 3, 2007 12:14:09 AM

Not sure what happens May 14 that has any relevance to my speculation about nV. And for however many months now, every day on these forums, people speculate just like I am about the performance of future products like the R600 and Barcelona. Not sure what is wrong at all with me speculating that nV might be executing a strategy that would take advantage of the DX10 craze.
May 3, 2007 12:44:25 AM

Quote:

By the way, anybody find it amusing that article says the 8800GTX and Ultra each contain 681 transistors? I'm thinking that's missing the (in millions) note.


No, they are just very big transistors.
It's like the hi-fi audio trend back towards tubes.

Bigger transistors mean warmer pixels. :twisted:
May 3, 2007 12:55:17 AM

What if it's ATI that's trying to take advantage of the DX10 craze? It's easy to say your card will perform better in DX10 when it can't really be put to a real-world test. Perhaps this is just ATI trying to create a market for a card they know won't outperform the GTX? Or maybe the card doesn't perform well in DX10, but ATI thinks they can buy some time to work on drivers this way? They are in a bit of a dire situation right now and I wouldn't put this past them.

Call it tinfoil hattery if you want, but it's worth considering, at least until we see some kind of hard facts one way or the other.

And no, I'm not an nVidia fanboy; I've been using ATI for the past 4 years.
May 3, 2007 1:30:14 AM

The reason ATi does not fit the bill is because ATi is not the one that released their DX10 card WAY ahead of time, before even a DX10 OS (Vista) was out, let alone the games to run it. Also, ATi has past experience with DX10-like shaders in the Xbox360's GPU, so for the most part people knew they would have more of a "true" unified shader. Although I see your point that ATi is bringing up another issue, that their card will perform much better in DX10 than nV's cards will. They of course could be trying to counter nV with their...

What I meant by nV riding the wave is selling a product that claimed DX10 capability when in reality it didn't even matter, and it still doesn't matter yet. Now that the dust has settled a little bit, people (except those who were taken in by nV's advertising) have finally realized that they don't need DX10 just yet, and by the time the DX10 games come out there very well could be a new set of cards released by nVidia and ATi that could perform just as well for a lower price than nV's current cards.

There is no doubt, however, that the 8800GTX's DX9 performance is unrivaled, and people looking for that performance should have gone nowhere else.
May 3, 2007 1:56:47 AM

Quote:
So far, the 2900XT looks a little better in DX9 games and maybe a whole lot better in DX10 than the GTS.


No, it doesn't. They are basically neck and neck in DX9 if you go by the DT benchies.

And as for DX10 gaming benchies, do you care to share the link to those? :roll:

Agreed.
May 3, 2007 1:57:13 AM

Quote:

And it could also just as easily be the other way around. FANBOY :roll:


LOL @ the n00b. :lol: 

That was some funny stuff, but if you read the context of the discussion instead of just the quote, you'd understand his point, or maybe not.

And calling him a fanboi is funny cause of our previous discussions where he was more pro-nV.

Thanks for comin' out, boy.

Ah, the ignorance of a n00b.
May 3, 2007 1:58:38 AM

Quote:

??? Why do you need links to a theoretical statement? I was simply pondering and saying it "seems" that this is what they've done, based on some speculation. I never said that was the exact case... duh.


In that case I apologize, it's just that this wasn't exactly worded as speculation.
Quote:
Also, regarding the 8600s, I've heard they had some core improvements; also, is it not their memory bus that is more the limiting factor rather than the core? Correct me if I'm wrong, but isn't the core of the 8600s running very fast (I see the GTS runs between 600MHz-720MHz), which is faster than the 8800GTX? Not sure how this would affect performance, but it could mean the 8600GTS offers a better performance gain in DX10 over DX9 than the 8800GTX.