The Saga Continues? R600XTX Pushed to Q3

NamelessMC

Distinguished
Nov 27, 2005
This source and information is pending verification

http://www.fudzilla.com/index.php?option=com_content&task=view&id=661&Itemid=1

R600XTX, the 1024 MB GDDR4 card, has been pushed to the next quarter. This is just one in a series of ATI's failures, but of course DAAMIT will call this a strategic decision. The R600XTX won't see the face of retail/etail stores until Q3 2007.

We know that something went wrong with the samples and that there were some severe performance problems with the latest batch of GDDR4 cards. Basically, the 512 MB GDDR3 version with 1600 MHz memory was beating the GDDR4 version with 1024 MB of GDDR4 memory clocked at 2200 MHz.
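If the rumored numbers are right, that result is surprising on paper. A quick back-of-the-envelope bandwidth comparison shows why; note this sketch assumes the widely leaked 512-bit memory bus and treats the quoted MHz figures as effective data rates, neither of which the article itself confirms:

```python
# Back-of-the-envelope peak memory bandwidth for the two rumored R600
# variants. The 512-bit bus width is an assumption from leaked specs of
# the era; the data rates (1600 / 2200 MHz) come from the article above.

def bandwidth_gbs(bus_width_bits: int, effective_clock_mhz: int) -> float:
    """Theoretical peak bandwidth in GB/s: bus width in bytes * data rate."""
    return (bus_width_bits / 8) * effective_clock_mhz * 1e6 / 1e9

gddr3 = bandwidth_gbs(512, 1600)   # 512 MB GDDR3 variant
gddr4 = bandwidth_gbs(512, 2200)   # 1024 MB GDDR4 variant

print(f"GDDR3: {gddr3:.1f} GB/s")                    # 102.4 GB/s
print(f"GDDR4: {gddr4:.1f} GB/s")                    # 140.8 GB/s
print(f"GDDR4 advantage: {gddr4 / gddr3 - 1:.0%}")   # 38%
```

On these assumptions the GDDR4 card has roughly a third more raw bandwidth, so for the GDDR3 card to beat it, something other than memory throughput (drivers, faulty samples, or the GPU itself) would have to be the bottleneck.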

That cannot be good, as the GDDR4 card is much more expensive to build and sell. The worst part is that there is no big performance difference between the 512 MB GDDR3 card and the 1024 MB GDDR4 card. The trouble is that 1 GB of GDDR4 at 2200 MHz costs a lot of money, so ATI at least wants to save some money. It will launch this card after the Geforce 8800 Ultra but remember, don't expect too much.

My speculation: I think it's true. I mean, there's no valid reason the R600XT was the only card released to reviewers and preview sites, other than that the R600XT was the only card that actually did what they wanted it to do.

But is this source really hard to believe?

Before you jump to any conclusions, ask yourself the following:

What has 65nm architecture done for other hardware?

Reduced heat and lower power consumption are the only hard facts. There hasn't been a piece of hardware built on the 65nm process that has shown any considerable performance gains from it.

The Brisbane is 65nm, and it still only overclocks to 3.0-ish GHz, though it does so at lower temperatures and with lower power consumption. A Windsor accomplishes the same overclocks even on the 90nm process, so really it's the architecture of the chip and not the process size that counts.

What has GDDR4 done for other Video-cards?

Look at the 8800 series cards: GDDR3, and they put a considerable beating on the X1950XTX/XT. The only conclusion can be that the GPU specifications and shader/core design make more of a difference than the type of memory. I mean, look at the 6600GT. It's DDR and it beats 7300GTs with GDDR2.

ATI put too much money on the table for the 65nm process and GDDR4 without the actual GPU power or architecture to back it up. You can't just have GDDR4 and a 65nm GPU and expect it to be great. You have to USE all the RAM and all the resources to their full potential.

And remember, lack of evidence isn't a valid argument against sources like this. You can't come in here and say, "How do you know the X2900XTX sucks if you haven't seen benchmarks?"

That's just it: ATI won't release any engineering samples, and the consensus from leaked sources and speculation is that they aren't releasing engineering samples because the card just sucks.

AMD/ATI is like an ostrich with its head in the sand, and all the fanboys run around defending them when they don't even have the tenacity to defend themselves! There's no such thing as a "strategy" of not releasing a sample of your product before launch.

No business person on the PLANET would commit to it. The argument defending why they haven't put out a sample is tired and dead now, and I'm all too confident of what's going on to be willing to hear an ATI fanboy's defense.

There's no valid excuse for this delay except that the chip was a failure.
 

cronic

Distinguished
Aug 6, 2005
What has 65nm architecture done for other hardware?
Reduced heat and less power consumption are the only hard facts

That's enough in itself. The problem with graphics cards nowadays is that they consume way too much power (8800GTX). I think they postponed the 2900XTX because it consumed too much power and ran too hot, or because they had no CPU to benchmark it on properly. Just about all the "leaked" info says the R600 has been finished for some time.

What has GDDR4 done for other Video-cards?

Look at the X1900XTX vs. the X1950XTX. It's not a massive improvement, but it's still a jump just from using a different type of memory.

There's no valid excuse to this late-coming except that the chip was a failure.

That's BS, you're just another NV fanboy. ATI/AMD have skipped a generation; they have been developing the R600 for a long time (so I've read).

Don't form an opinion on something just because you read it on some website.
 
I hope you're not confusing the XTX with the XT. We will soon see what the XT is all about. There may not be a reason to release the XTX at all. I would guess that ATI has a competitor in the XT version, and it'll soon be nVidia's problem to respond. Reading the entire thing, it looks like the 650 will be out, so why an XTX? By the time we can really use that much power, the newer architectures will be out from ATI and nVidia. No games need a gig of memory as of now; maybe by the end of the year, and more than likely there won't be any, or very few indeed. I'll wait for the XT to show before I condemn ATI.
 

bfellow

Distinguished
Dec 22, 2006
So far, from the benchmarks, the 2900XT only beats the 8800GTS, not the GTX. Looking at the VGA charts, the 8800 GTX has something like 20-50% better performance than its GTS cousin at 1280x1024.
 

tamalero

Distinguished
Oct 25, 2006
Hmm, you seem to be far more of an NV fanboi on Tom's forums than on Anand's. Can you swap that for us? http://forums.anandtech.com/messageview.aspx?catid=31&threadid=2038210&enterthread=y

I'd say this kid has a serious case of attention-whore syndrome, always trying to be in the spotlight, so I'll kindly say this to HIM with a simple image:

shutupms8.jpg
 

RamboMadCow

Distinguished
Feb 8, 2007
So far, from the benchmarks, the 2900XT only beats the 8800GTS, not the GTX. Looking at the VGA charts, the 8800 GTX has something like 20-50% better performance than its GTS cousin at 1280x1024.

You have to remember that 1280x1024 is a very CPU-limited setting. We don't know whether the 2900XT is going to stomp the 8800GTX at a much higher resolution or the other way around.
 

sailer

Splendid
If it makes you feel better: R600 sucks.

Now for the sake of everyone else on the forums: Shut up and go jump off a cliff.

Yes, there are no reputable sites, Tom's or otherwise, doing any comprehensive benches, and the OP wishes to believe that the R600 sucks. Well and good.

To the OP, have a good flight off the cliff.
 
What are you talking about?
Expect AMD to pull the wraps off its DirectX 10 product line up in mid-May, with value, midrange and high end models to boot. AMD’s flagship ATI Radeon HD 2900-series will have two models at launch – the ATI Radeon HD 2900 XTX and the HD 2900 XT. The ATI Radeon HD 2900 XTX models feature 1GB of GDDR4 memory while the lower HD 2900 XT features 512 MB.
Read it here http://dailytech.com/Article.aspx?newsid=7043
 

Mandrake_

Distinguished
Oct 7, 2006
I'm benching the 2900XTX2 tomorrow, that will be interesting.

Pffft. Sif. I'm benchmarking the Geforce 9800 GTX right now. :tongue:

Here are the benchmarks, this is an average covering a wide variety of workloads:

Geforce 8800 GTX: ------
Geforce 9800 GTX: ------------

Remember, you saw it here, complete with benchmarks first! :lol:
 

warezme

Distinguished
Dec 18, 2006
So far, from the benchmarks, the 2900XT only beats the 8800GTS, not the GTX. Looking at the VGA charts, the 8800 GTX has something like 20-50% better performance than its GTS cousin at 1280x1024.

You have to remember that 1280x1024 is a very CPU-limited setting. We don't know whether the 2900XT is going to stomp the 8800GTX at a much higher resolution or the other way around.

Naw, I don't believe the 2900XT will pull a miracle out of its a** at higher resolutions. It's hitting about the right performance for its price point, as I'm sure ATI intended.

I have seen pictures and specs from credible sites that clearly indicate a large card (similar to the 8800GTX) but with huge power requirements. I'm not an electrical engineer, although it was my original major, but any electronic equipment that has to draw that much sustained power is going to be prone to massive heat and possibly stability issues unless tolerances and components are very well cooled and high quality. I don't see how ATI will do this and easily keep the price point lower than Nvidia's.
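The power-to-heat reasoning here is sound: essentially all the power a GPU draws is dissipated as heat the cooler must remove, and CMOS switching power scales roughly as P = C·V²·f. A rough sketch of that scaling, using entirely made-up capacitance, voltage, and clock numbers chosen only for plausible magnitudes, not actual R600 figures:

```python
# Rough illustration of why pushing voltage and clock balloons power draw.
# CMOS dynamic power scales approximately as P = C * V^2 * f, and nearly
# all of it becomes heat. All numbers below are hypothetical, for scale only.

def dynamic_power_watts(capacitance_f: float, voltage_v: float, freq_hz: float) -> float:
    """Approximate switching power: effective capacitance * V^2 * frequency."""
    return capacitance_f * voltage_v ** 2 * freq_hz

# Hypothetical baseline vs. the same part pushed 10% higher on voltage AND clock:
base = dynamic_power_watts(1.5e-7, 1.2, 750e6)
pushed = dynamic_power_watts(1.5e-7, 1.32, 825e6)
print(f"baseline: {base:.0f} W, pushed: {pushed:.0f} W (+{pushed / base - 1:.0%})")
# A 10% bump on both axes compounds to roughly a 33% power increase (1.1^3).
```

The cubic compounding (V² times f) is why a chip tuned slightly past its sweet spot can need dramatically more cooling and a beefier board, which matches the pictures being described.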
 
Interesting speculation but you state the obvious, contradict yourself, and make a number of presumptions based on your own faulty fanboy logic.
...lack of evidence isn't a valid argument...You can't come in here and say, "How do you know the X2900XTX sucks if you haven't seen benchmarks?"...That's just it, ATI won't release any engineering samples...because the card just sucks.
Let me try to understand this: the 2900XTX sucks because ATI won't release engineering samples, or is it that the 2900XTX sucks because there are no benchmarks?! WTF?!
ATI put too much money on the table for 65nm architecture and GDDR4 without the actual GPU power or architecture to back it up.
Puh-leez. Don't be so presumptuous.
...so really it's the architecture of the chip and not the size that counts.
Really?! It's about the uArch? :roll:

In the big picture, pushing back the R600/2900XTX is nothing more than that: pushing back the release. To presume it is a reflection on the quality of the technology or the success of the product is short-term, naive thinking. Ultimately, pushing back the release will not determine the success or failure of the R600/2900XTX; performance will. It's obvious that you choose not to accept ATI's reasons for pushing it back, which leads to threads like this and gives food for speculation and fodder for the fanboys. If anything, threads and speculation like this just fuel the market and create demand.

Given that only 2-3 DX10 games are slated for release in 2007 (Crysis, where are you?), with the majority of titles coming in 2008, who really cares if the 2900XTX is pushed back until Q3 '07?