
AMD just can't get a break

June 14, 2007 10:37:37 AM



I've had my Sapphire Radeon HD 2900XT a little more than a month. Given the widely acknowledged crappiness of the initial retail drivers, it's astonishing to realize that Catalyst has released exactly one XP driver update since the product launch 5/9/07: 6.14.10 on 5/31/07. This is a far cry from nVidia, who seemed to be releasing driver updates every week after the 8800 launched in November of last year. Yeah, I know it took 'em forever to make it Vista friendly, but at least the individual game fixes showed that the nVidia driver development team was working on stuff.

Anyways, the "new" Catalyst drivers don't work any miracles. I'm in total agreement with the article as far as this goes. The graphical glitches are still there across a wide range of apps, as is the crappy overall performance with AA on. Everything about the Catalyst software package sucks. The lack of hardware monitoring (temps, volts) in CCC is atrocious. The dll glitch when running 3DMark06 is laughable considering Sapphire (and others) package the card with 3DMark06. The UVD bs was the icing on the cake.

I'm really saddened by this. Would hate to see nVidia achieve de facto monopoly status in GPUs.
June 14, 2007 10:54:41 AM

What saddens me is that AMD drained its entire war chest to buy ATI. That $5 billion should have been sunk into R&D to accelerate and improve the K10 line.

AMD should have left ATI to scratch a living on its own. At the very least they should have delayed the ATI purchase until Q3 2007 and bought ATI at 1/3 the price they paid for it in 2006.
Related resources
June 14, 2007 11:09:21 AM

i agree 100% with the person above, i never thought about it that way...
June 14, 2007 11:54:31 AM

well it's easy to be smart after you've done something. :roll:
June 14, 2007 12:14:32 PM

Quote:
well it's easy to be smart after you've done something. :roll:


Oh I get it ... in late 2006 ATI had no inkling at all that the R600 wasn't going to be competitive with Nvidia :roll:

AMD didn't do any sort of due diligence and hence let down their shareholders.
June 14, 2007 12:19:45 PM

is it just me or is the HD2900XT's image quality superior to the G8800GTS 320MB's in Oblivion and BF2142 at the same resolution? Although in the Oblivion test's second image the ATI card had no grass at all, yet it's listed as 25% grass? Why couldn't they test them at the same settings? Or does it mean ATI can only show 25% of the grass? Bit confused :?
June 14, 2007 12:24:37 PM

Maybe they knew that ATI's R600 would be inferior to the 8800 series, but they hoped it could be improved further and that they would get a better deal out of it. We agree it would have been a better deal if they had spent that money on the K10 line's R&D, but AMD says it was a strategic move, so that's that.
June 14, 2007 12:33:15 PM

Right now almost every analyst out there agrees that buying ATi was a wise decision for AMD... even if they can't benefit from this merger yet, in the future AMD will be better positioned to compete in many markets and create great and innovative products.

That being said, I agree with you guys on one thing ... $5.4 billion was way too much money for ATI.
June 14, 2007 12:44:32 PM

Buying ATI was the best possible thing AMD could have done. Read a few investment reviews on them.

When AMD first released their aptly named "K(ryptonite)-series" processors, they caught Intel with its pants down.

The next couple generations of AMD processors also managed to deal serious blows to Intel (Athlon, Athlon 64, etc)

However it was becoming more and more apparent that they were waking a sleeping giant.

Today, Intel is dominating the market once again. Things are looking pretty bad for AMD right now.

This said, AMD needed to diversify into a market not yet explored by Intel...this ended up being discrete graphics.

Without the purchase of ATI, AMD wouldn't have any choice but to pack their bags...Intel has them outfunded, outmanned, and outgunned.

It may have cost them an arm and a leg...and more than a little equipment 8) but in the long run, it was worth it.

(BTW, this is out of my head, not quoted from anywhere)
June 14, 2007 12:47:07 PM

It IS a strategic move: Why sink only one company when you can sink two?

However, I thought that the 2900XT could at least be on par with the 320MB version, but I never expected the R600 to be such a POS.
June 14, 2007 12:58:06 PM

Quote:
is it just me or is the HD2900XT's image quality superior to the G8800GTS 320MB's in Oblivion and BF2142 at the same resolution? Although in the Oblivion test's second image the ATI card had no grass at all, yet it's listed as 25% grass? Why couldn't they test them at the same settings? Or does it mean ATI can only show 25% of the grass? Bit confused :?


That is exactly the problem with the 2900xt. To get FPS comparable to the 8800 cards you have to play at lower image-quality settings. That is basically what [H] has been saying all along: look at not just the FPS but the IQ as well.
June 14, 2007 1:02:41 PM

Let's not forget that AMD paid for ATi not only in cash, but also in AMD shares, which were expensive at that time.
And the HD2900: well, some reviews are very happy about the new drivers and are starting to recommend the card. Also, the ATi mainstream GPUs are cheap to produce, so there is hope.
June 14, 2007 1:10:06 PM

Quote:

I've had my Sapphire Radeon HD 2900XT a little more than a month. Given the widely acknowledged crappiness of the initial retail drivers, it's astonishing to realize that Catalyst has released exactly one driver update since the product launch 5/9/07: 6.14.10 on 5/31/07. This is a far cry from nVidia, who seemed to be releasing driver updates every week after the 8800 launched in November of last year. Yeah, I know it took 'em forever to make it Vista friendly, but at least the individual game fixes showed that the nVidia driver development team was working on stuff.

There were at least 2 Catalyst release candidates (beta RC 2 and RC 7) for XP and Vista.

8.38 RC2's (released on the 14th)
8.38 RC7's (released on the 22nd)

And I hasten to add that since the release of the 2900 XT the price of the card has gone up. . .

And the prices for the Geforce 8800gts & gtx has come down. . .

And the price for the 2900 XT is right in the middle between the two . . . . :D 
June 14, 2007 1:30:46 PM

I am a little skeptical of HardOCP's reviews. Especially their propensity for running comparative hardware at different settings. When you are trying to compare (benchmark) two different architectures, you keep all settings the same and look at FPS. You don't adjust settings to keep the FPS the same. Their approach is what a user would do to run the game smoothly, but it's not what a benchmarker is going to use to illustrate performance differences. I have seen HardOCP setting up comparatives before that were skewed to fit their agenda (namely the C2D gaming benchmarks). I read their reviews, but I always look objectively at how they have set up the benchmarks.

That all said I found this review:

LINK

They use different games to benchmark, and it shows a completely different picture. To summarize, the X2900XT kicks the 8800GTS 320MB in the nads, and falls between the 8800GTX and the 8800GTS 640MB performance wise. Funny how two different websites can produce two totally different conclusions.
June 14, 2007 1:44:36 PM

sh*t happens i guess
June 14, 2007 1:56:05 PM

I hope they have a good mop cause I am getting one of these soon.
June 14, 2007 2:22:00 PM

Quote:
I am a little skeptical of HardOCP's reviews. Especially their propensity for running comparative hardware at different settings. When you are trying to compare (benchmark) two different architectures, you keep all settings the same and look at FPS. You don't adjust settings to keep the FPS the same. Their approach is what a user would do to run the game smoothly, but it's not what a benchmarker is going to use to illustrate performance differences. I have seen HardOCP setting up comparatives before that were skewed to fit their agenda (namely the C2D gaming benchmarks). I read their reviews, but I always look objectively at how they have set up the benchmarks.


Frankly, I find HardOCP's "benchmarking" to represent the most realistic use of hardware. I couldn't care less if my 8800GTS can run a game at 370FPS [maxed out] and my 6800nu can run the same thing at 70FPS [medium settings]; both are above my 60FPS threshold. However, the 6800nu can't run the game with the same visual quality as the 8800GTS, and that is a tangible drawback of using the 6800nu to play games.

Quote:

That all said I found this review:

LINK

They use different games to benchmark, and it shows a completely different picture. To summarize, the X2900XT kicks the 8800GTS 320MB in the nads, and falls between the 8800GTX and the 8800GTS 640MB performance wise. Funny how two different websites can produce two totally different conclusions.

I didn't see the 8800GTS 320MB in there anywhere.
June 14, 2007 2:27:28 PM

Quote:
I am a little skeptical of HardOCP's reviews. Especially their propensity for running comparative hardware at different settings. When you are trying to compare (benchmark) two different architectures, you keep all settings the same and look at FPS. You don't adjust settings to keep the FPS the same. Their approach is what a user would do to run the game smoothly, but it's not what a benchmarker is going to use to illustrate performance differences. I have seen HardOCP setting up comparatives before that were skewed to fit their agenda (namely the C2D gaming benchmarks). I read their reviews, but I always look objectively at how they have set up the benchmarks.

That all said I found this review:

LINK

They use different games to benchmark, and it shows a completely different picture. To summarize, the X2900XT kicks the 8800GTS 320MB in the nads, and falls between the 8800GTX and the 8800GTS 640MB performance wise. Funny how two different websites can produce two totally different conclusions.



Yeah, that is a night and day difference, even with some of the same tests. That's why I put less into perf numbers for benchmarks. I use reviews to see how bad my pocket has to hurt to play my games and run my productivity apps well.

Right now all of the DX10 GPUs suck for either power, drivers or features. I can't upgrade to Vista because nVidia still doesn't have real multimon support for it.

I can see that I'll be sitting on my 4400+/7800GT until maybe XMas.
June 14, 2007 2:49:22 PM

Tweaktown, HardOCP, and every review on the web have stated that AA/AF significantly kills the 2900XT's performance.

HardOCP does explain why it lowers the settings for some cards and raises them for others (like in Stalker for the GTS320 and BF2142 for the 2900XT), if you ever read the reviews.
June 14, 2007 2:55:40 PM

Quote:
Tweaktown, HardOCP, and every review on the web have stated that AA/AF significantly kills the 2900XT's performance.

HardOCP does explain why it lowers the settings for some cards and raises them for others (like in Stalker for the GTS320 and BF2142 for the 2900XT), if you ever read the reviews.



Did I say anything about AA/AF? I said the reviews show a totally different picture.

I guess if I didn't read them I wouldn't know that, huh?
June 14, 2007 3:58:42 PM

Quote:
Buying ATI was the best possible thing AMD could have done. Read a few investment reviews on them.


I disagree. It was a completely wrong move given the circumstances. They needed to secure their position at the top instead of spreading themselves so thin.

Quote:
Without the purchase of ATI, AMD wouldn't have any choice but to pack their bags...Intel has them outfunded, outmanned, and outgunned.


That is not true; they would have had cash to speed up R&D and production. Not to mention that engineers would have stayed focused on the work instead of worrying whether they would get to keep their jobs or get axed in a merger.

Quote:
It may have cost them an arm and a leg...and more than a little equipment 8) but in the long run, it was worth it.


That move still has to prove its worth in the long run, and I find that highly unlikely. Both AMD and ATI have good hardware engineers. The problem is that both companies have terrible software engineers. So they can make great hardware and all, but without proper software to drive it, their advantage melts away.

I am also under the impression that those hardware engineers don't give a damn about what software engineers have to say about their designs. That may even be justified given that those software engineers are clearly not the best, but that kind of ignorance obviously leads to leaving out certain features that the software community clearly needs and putting into the final product those which at best can be described as redundant. Clearly not a good way to do business.

Last but not least, if their other divisions were as productive and creative as their PR, they would clearly have accomplished much, much more.
June 14, 2007 4:27:41 PM

Quote:
Buying ATI was the best possible thing AMD could have done. Read a few investment reviews on them.


I disagree. It was a completely wrong move given the circumstances. They needed to secure their position at the top instead of spreading themselves so thin.

Quote:
Without the purchase of ATI, AMD wouldn't have any choice but to pack their bags...Intel has them outfunded, outmanned, and outgunned.


That is not true; they would have had cash to speed up R&D and production. Not to mention that engineers would have stayed focused on the work instead of worrying whether they would get to keep their jobs or get axed in a merger.

Quote:
It may have cost them an arm and a leg...and more than a little equipment 8) but in the long run, it was worth it.


That move still has to prove its worth in the long run, and I find that highly unlikely. Both AMD and ATI have good hardware engineers. The problem is that both companies have terrible software engineers. So they can make great hardware and all, but without proper software to drive it, their advantage melts away.

I am also under the impression that those hardware engineers don't give a damn about what software engineers have to say about their designs. That may even be justified given that those software engineers are clearly not the best, but that kind of ignorance obviously leads to leaving out certain features that the software community clearly needs and putting into the final product those which at best can be described as redundant. Clearly not a good way to do business.

Last but not least, if their other divisions were as productive and creative as their PR, they would clearly have accomplished much, much more.


AMD HAD TO buy ATi for the chipsets and Fusion tech. They could license it or try to hire engineers away from them, but that might result in patent infringement. If anything, the "mistake" was not pushing the chipset division to get a reference HT3 chipset out for servers.


That is one of the reasons they got Toshiba for laptops, they can now provide a whole platform to OEMs, including CPU, GPU, and PCIe.

They may lose some high end sales but low end is where the volume is. For the longest time Sempron was the most popular chip in retail desktops. Now it's the 4200+ (BE2400?).

With the Turion now at 65nm, they can make a whole helluva lot more than they can Opteron 285 or X2 6000+.

I guess we'll see what the desktop numbers look like. Also, the HD2400/2600 series is getting 60% of OEM orders for low end/HTPC boxes.

They just need to have enough of them to go around, which they should, as UMC and TSMC are both at 65nm and ready for 55nm.

If they can get a high end part on 55nm they can clock several hundred MHz higher at the same TDP.

Hopefully they are doing the right things with the money from those convertible bonds. The right thing right now is getting as much of Fab 30 to 65nm as possible so Fab 36 can do more Barcelona derivatives while Fab 30 does Opteron, some 90nm Turion and the rest Brisbane.

From what's being said after Computex, ATi will do a good volume business and maybe actually turn a profit. If I were AMD I would still reserve a little space for Intel chipsets. Why not kick 'em in the nuts? All's fair in love and war.
June 14, 2007 4:46:36 PM

Buying ATI was not about the R600, it is about the future of AMD and being able to compete with Intel across the board.

Buying ATI was probably the best move AMD could have made. Whether or not it appears financially viable now (although at the time the outlook was not as poor as it is today), it was and still is an excellent match.

AMD needs to be able to compete in markets where they had no presence, and they are thinking beyond the K10/R600, something that most people appear blind to (at least on this forum). The high end enthusiast market is a dead end; the mainstream low/mid range market with integrated chipsets (including video) and the server markets have to be exploited if AMD is to have a future, and that is what they are aiming at. Opteron was a good first shot at the server market (low volume, high value) and they obviously hope K10 will continue that (it is not and never has been a desktop chip). Fusion will hopefully be a good entry into the low/mid integrated arena (low profit but extremely high volume).

The ATI purchase fits into the roadmap perfectly, providing them with an excellent graphics product line (again, it only has to be better than the current market leader in the arena they wish to contest and that is not NVidia but Intel), ATI are in a strong position to compete against Intel and I wish them the best of luck, not least because I have invested in their stock (which was a steal).
June 14, 2007 5:07:14 PM

That's what you get for being a fanboy. Sure, encourage ATI to put out crap products by buying them and then screw yourself in the process all for the sake of supporting a company. Fanboys are stupid. :roll:
June 14, 2007 5:58:58 PM

Quote:
That's what you get for being a fanboy. Sure, encourage ATI to put out crap products by buying them and then screw yourself in the process all for the sake of supporting a company. Fanboys are stupid. :roll:

Et tu, Brute ???

:lol: 
June 14, 2007 6:11:09 PM

Quote:
That's what you get for being a fanboy. Sure, encourage ATI to put out crap products by buying them and then screw yourself in the process all for the sake of supporting a company. Fanboys are stupid. :roll:


you sure are.
June 14, 2007 6:46:12 PM

Not so. I've used ATI, AMD, Intel, nvidia - all during various times. When I build, I build with the best out there at the time. This time, it just so happened to be Intel and Nvidia. My last machine had Intel, ATI, machine before that, AMD, Nvidia and ATI.

Point is, I would never buy an inferior product from any company just for loyalty/support sake. They have none for me, I have none for them. Company with the best product wins my temporary loyalty.
June 14, 2007 6:49:48 PM

Quote:
That's what you get for being a fanboy. Sure, encourage ATI to put out crap products by buying them and then screw yourself in the process all for the sake of supporting a company. Fanboys are stupid. :roll:

Et tu, Brute ???

:lol: 

"Et tu, Brute" is used in the context of someone betraying another. Literally, however, it means "You too, Brutus?" No, not me - see my comments above. If ATI were the superior product, I'd buy it. It's not, and reports from actual users like the OP make it sound like it's not even a good product. For the time being, I'm glad I have what I have.
June 14, 2007 6:53:42 PM

Quote:
Not so. I've used ATI, AMD, Intel, nvidia - all during various times. When I build, I build with the best out there at the time. This time, it just so happened to be Intel and Nvidia. My last machine had Intel, ATI, machine before that, AMD, Nvidia and ATI.

Point is, I would never buy an inferior product from any company just for loyalty/support sake. They have none for me, I have none for them. Company with the best product wins my temporary loyalty.


Inferior? So if some choose to buy AMD/ATI they are buying inferior products? Why, because they don't benchmark as well?
June 14, 2007 7:07:38 PM

In my opinion no one should be blamed for buying an inferior product. Not everyone has the money to buy a high-end product which holds the performance crown and costs thousands of dollars. One must buy the product which offers the biggest bang for the buck and be satisfied with it.
Peet
June 14, 2007 7:18:55 PM

Quote:
Inferior? So if some choose to buy AMD/ATI they are buying inferior products? Why, because they don't benchmark as well?


It's not an inferior product if you are surfing the web and checking e-mail. If you are one of the thousands of people playing Counter-Strike 1.6, then AMD/ATI is a great product.

If you are buying a CPU or video card for the purpose of getting the best performance out there, then AMD/ATI isn't where you should be sticking your money at this exact moment in time. On the high-end side, Intel and Nvidia are the best products for both cost and performance.

In 3-6 months, this might all change. I waited with bated breath to see how the new HD2900XT was going to do, and now I'm kicking myself for not getting the 8800GTX sooner, as it has ruled the roost for a while and it appears it will rule for a while longer.

Like PongRules said, my loyalty lasts as long as the current generation of hardware I buy. When I go to buy again, I'm out there like a first-time shopper, not blindly pouring money into a brand name. I like ATI, I like AMD, I like Intel and I like Nvidia. I own at least 1 product from them all and I've got no complaints with any of them. But when I want the fastest $500 card out there, I buy the fastest $500 card out there. If I want the fastest $200 card out there, I buy the fastest $200 card out there.
June 14, 2007 7:35:23 PM

Quote:
That's why I put less into perf numbers for benchmarks.


Actually, you put less consideration into anything that is negative for AMD and put full consideration into anything that is positive for AMD, to be clear. Let's just say you haven't been considering much lately.
June 14, 2007 7:56:16 PM

Quote:
AMD HAD TO buy ATi for the chipsets and Fusion tech. They could license it or try to hire engrs from them but that might result in patent infringement. If anything the "mistake" was to not push the chipset division to get a reference HT3 chipset out for servers.


We all know that NVIDIA had (and still has) superior chipsets for AMD, yet they bought ATI for the chipsets. Yes, they now have their own chipsets, but they can't use what is better at the moment. They are now obliged to bundle their own stuff however bad it may be. I am not saying that the new ATI chipsets aren't good, but what about SLI, will it work with an ATI chipset?

Fusion is something which has yet to take any reasonable shape for end users so it is a long term investment.

In other words you spent a fortune to buy an oxygen mask so you can breathe under water. You got it, but you will get the oxygen bottle in a few days and you are out of oxygen right now.

Quote:
That is one of the reasons they got Toshiba for laptops, they can now provide a whole platform to OEMs, including CPU, GPU, and PCIe.


Yes, but nobody will want to buy ATI graphics in a laptop unless it is better than the current NVIDIA offering which is simply not the case at the moment.

Quote:
They may lose some high end sales but low end is where the volume is. For the longest time Sempron was the most popular chip in retail desktops. Now it's the 4200+ (BE2400?).


As I see it, for AMD owners it was always about the value for money. At the moment Intel offers more value when it comes to CPUs.

Quote:
With the Turion now at 65nm, they can make a whole helluva lot more than they can Opteron 285 or X2 6000+.


Only, without SSE4.1, who will buy those chips in the era of HD media content being pushed onto consumers? AMD has only SSE4a, which is not even close, and you can have it only with Barcelona.

Quote:
If they can get a high end part on 55nm they can clock several hundred MHz higher at the same TDP.


In theory, but if 65nm and 55nm have higher leakage current, then they will get even worse TDP. TSMC and UMC are not exactly IBM and Intel when it comes to manufacturing process.

Quote:
If I were AMD I would still reserve a little space for Intel chipsets. why not kick em in the nuts?


With the recent Intel and NVIDIA platform cross-licensing that wouldn't be a very wise thing to do.

At the moment there is no place in the market for ATI chipsets when it comes to Intel. IMO, X38 paired with DDR2 is great value for money especially with Penryn (Wolfdale) just around the corner and fast, low latency DDR2-800 and DDR2-1066 RAM.

Finally, AMD+ATI should be worried because Intel is much more aggressive in promoting those new instructions this time. Their latest compiler already supports SSE4, and I heard that even Microsoft will support them in their new Visual Studio, which is currently in beta testing. Not to mention the lack of the MOVNTDQA instruction in Barcelona, which will make a big difference when it comes to GPGPU. In layman's terms, GPGPU will be faster with the forthcoming Intel CPUs regardless of the GPU used. ATI will thus have to go with Intel CPUs if they want to stay competitive in the GPGPU business.
June 14, 2007 8:39:42 PM

Quote:
Frankly, I find HardOCP's "benchmarking" to be represent the most realistic use of hardware. I couldn't care less if my 8800GTS can run a game at 370FPS [maxed out] and my 6800nu can run the same thing at 70FPS [medium settings]; both are above my 60FPS threshold. However, the 6800nu can't run the game with the same visual quality as the 8800GTS, and that is a tangible drawback of using the 6800nu to play games.


If both cards are using the same settings (HQ or not), you get an apples to apples comparison. If card A puts out as many or more FPS as card B, we can assume in that game with those settings that card A is at least as good as or better than B. It makes it much more difficult to determine how both cards are performing when you aren't running the same settings for both. It's even possible that if you ran the nVidia card at the same settings as the ATI card, they would have been nearly the same, maybe not, but we'll never know since they didn't run them that way (ahhh my point). I am not disputing their findings, I am disputing their methodology. It's not the first time that HardOCP's benchmark methodology has been called into question on this forum, sometimes it seems they select their benchmarks to suit their agenda. These guys are the only ones to benchmark like this, no one else does.

When you look at the resolutions used by TweakTown, you see they didn't use low end resolutions. They also had an HQ section at the end. Sure the card takes a hit (nothing too drastic), again though you may see this improved with newer drivers.

I am not saying that the X2900XT is a must have either, but I am advocating a wait and see approach. Dooming a card on one review is pretty harsh and (I hate to use the word cause it's overused but here goes) fanboyish. These cards have been available to the public for what, maybe a month? How long did it take nVidia to release a Vista driver that wasn't a steaming pile of camel dung? Seems to me there was talk of a class action suit over that one. Give the driver time. I can understand those who bought the card being disappointed, but two driver releases from now they may be happier than hell. Early adopters should expect no more. Also anyone who has one that complains about power usage and heat should go fornicate themselves. No one tried to hide that fact and it was well publicized, and that's not something a driver is going to fix.

As for the bit about the 8800GTS 320MB, sorry :oops:  about that, got my wires crossed. A moot point though since the X2900XT performs as well or better than the 8800GTS 640MB in TweakTown's review.

Listen folks, it's only hardware; the drama is the same every time a new product is released (especially GPUs). The lifecycle of the very top end is a short one; often the drivers aren't fully optimized before the new king of the hill comes to reign. So ease up, no one's life is really impacted by all this. If yours is, it's time to pick a new pastime; you're getting too involved.
June 14, 2007 8:45:57 PM

Quote:
That's why I put less into perf numbers for benchmarks.


Actually, you put less consideration into anything that is negative for AMD and put full consideration into anything that is positive for AMD, to be clear. Let's just say you haven't been considering much lately.

Opinions are like a s s holes! Don't try to twist my words. So you mean I did buy QFX? No I didn't.
June 14, 2007 8:51:59 PM

Quote:
AMD HAD TO buy ATi for the chipsets and Fusion tech. They could license it or try to hire engrs from them but that might result in patent infringement. If anything the "mistake" was to not push the chipset division to get a reference HT3 chipset out for servers.


We all know that NVIDIA had (and still has) superior chipsets for AMD, yet they bought ATI for the chipsets. Yes, they now have their own chipsets, but they can't use what is better at the moment. They are now obliged to bundle their own stuff however bad it may be. I am not saying that the new ATI chipsets aren't good, but what about SLI, will it work with an ATI chipset?

Fusion is something which has yet to take any reasonable shape for end users so it is a long term investment.

In other words you spent a fortune to buy an oxygen mask so you can breathe under water. You got it, but you will get the oxygen bottle in a few days and you are out of oxygen right now.

Quote:
That is one of the reasons they got Toshiba for laptops, they can now provide a whole platform to OEMs, including CPU, GPU, and PCIe.


Yes, but nobody will want to buy ATI graphics in a laptop unless it is better than the current NVIDIA offering which is simply not the case at the moment.

Quote:
They may lose some high end sales but low end is where the volume is. For the longest time Sempron was the most popular chip in retail desktops. Now it's the 4200+ (BE2400?).


As I see it, for AMD owners it was always about the value for money. At the moment Intel offers more value when it comes to CPUs.

Quote:
With the Turion now at 65nm, they can make a whole helluva lot more than they can Opteron 285 or X2 6000+.


Only, without SSE4.1, who will buy those chips in the era of HD media content being pushed onto consumers? AMD has only SSE4a, which is not even close, and you can have it only with Barcelona.

Quote:
If they can get a high end part on 55nm they can clock several hundred MHz higher at the same TDP.


In theory, but if 65nm and 55nm have higher leakage current, then they will get even worse TDP. TSMC and UMC are not exactly IBM and Intel when it comes to manufacturing process.

Quote:
If I were AMD I would still reserve a little space for Intel chipsets. why not kick em in the nuts?


With the recent Intel and NVIDIA platform cross-licensing that wouldn't be a very wise thing to do.

At the moment there is no place in the market for ATI chipsets when it comes to Intel. IMO, X38 paired with DDR2 is great value for money especially with Penryn (Wolfdale) just around the corner and fast, low latency DDR2-800 and DDR2-1066 RAM.

Finally, AMD+ATI should be worried because Intel is much more aggressive in promoting those new instructions this time. Their latest compiler already supports SSE4, and I heard that even Microsoft will support them in their new Visual Studio, which is currently in beta testing. Not to mention the lack of the MOVNTDQA instruction in Barcelona, which will make a big difference when it comes to GPGPU. In layman's terms, GPGPU will be faster with the forthcoming Intel CPUs regardless of the GPU used. ATI will thus have to go with Intel CPUs if they want to stay competitive in the GPGPU business.


WTH are you talking about? Video games don't use SSE, and AMD already has a TFLOP machine with R600. People DON'T BUY LAPTOPS for video encoding; they buy them to keep up with their work out of the office or away from a power outlet.

Even Tom's said after the last price drops that AMD is the better value unless you want the ultimate fastest, which only a small percentage of people do.

If Turion is so bad why is Toshiba putting it in 20% of their laptops?
June 14, 2007 9:39:07 PM

Well, I for one still think that the HD2900XT is a good card, and I think the real problem is the drivers. If you look at the architecture of the card, you can see it really has something to offer. But it's all new, so AMD needs some time to figure things out.
And they're comparing a new card with newly released drivers against a card which has been out for half a year with well-developed drivers. In other words, the 8800's drivers will hit their ceiling sooner than the 2900's, and the 2900 is a bit more complex.

I think that over time the 2900 will turn out to be a really nice card... but that's just what I think :p  it could be a flop too. But we'll see :) 
June 14, 2007 9:39:18 PM

Quote:
well it's easy to be smart after you've done something. :roll:


Oh I get it ... in late 2006 ATI had no inkling at all that the R600 wasn't going to be competitive with Nvidia :roll:

AMD didn't do any sort of due diligence and hence let down their shareholders.

This is what is known in psychology as the hindsight bias.

"Oh ATI bombed, well of course, it was obvious all along!!"

AMD took a gamble, short of putting money into their own intelligence / infiltration department to spy on ATI's status of R600, there was nothing they could do to see any huge impending failures by ATI.

I remember a couple of months ago everyone was super hyped about R600, how it would retake the throne of high end graphics and that Nvidia was stupid to release the 8800 series so early because it would be made totally obsolete by the time ATI released R600.

Now here we are. I still think AMD made the right move, since they can now build their own platforms without relying completely on third-party support. Yes, it will be a rough time for AMD / ATI for a couple of years, but I think eventually it will pay off. For example: the Fusion concept, better chipsets for AMD platforms, shared technologies that could enhance both companies' abilities, and the chance for both to spread out into different markets.

Or AMD could just die, taking ATI with it, and we're all screwed. I don't have a crystal ball.
June 14, 2007 9:49:59 PM

Quote:
Quote:
As for the bit about the 8800GTS 320MB, sorry :oops:  about that, got my wires crossed. A moot point though since the X2900XT performs as well or better than the 8800GTS 640MB in TweakTown's review.

Listen folks, it's only hardware; the drama is the same every time a new product is released (especially GPUs). The lifecycle of the very top end is a short one, and often the drivers aren't fully optimized before the new king of the hill comes to reign. So ease up, no one's life is really impacted by all this. If yours is, it's time to pick a new pastime; you're getting too involved.


Agree with the last paragraph, but while the x2900xt performs as well as or better than the 8800GTS, it costs up to $70 more. Not such a great deal, considering. And in some cases, the 2900 fared worse than ATI's own x1950xtx. Ouch. Problems with AA at high resolution, not-so-great drivers. I'd call that pretty inferior.

The OP is a gamer, and probably did his homework, yet went with the 2900xt regardless of how it compared to Nvidia's equal, the 8800 GTS 640, and regardless of the price being higher. That's fanboyism at its most stupid.

This was ATI's best shot, and as an ATI fan (not fanboy), I think it fell short. ATI has had great cards, and I've owned them. But now ATI is getting owned and they better have something up their sleeve.
June 14, 2007 11:31:37 PM

This is a reply to all my buds here...
It's me again. I love reading all these posts and reviews year after year, 8 years now... After all, though, all I have gotten out of this is that AMD/ATI have helped keep the two better companies headed in the right direction.

I have built my own from '97 to date and they performed without fault, meaning they were worth every penny (except that one ATI card I bought in '99. I did get a bundled copy of SCGT with it, and it's been on every one of my game machines so far, thanks for that ATI), and every machine since has had Intel and Nvidia (post-3DFX) "Inside".
I can't wait to build the next Intel/Nvidia machine; it will be impressive and last for years, thank you.
I would buy AMD/ATI but I can't find any reason to...
If this is "FANBOY" then I would have to thank the better companies.
Looks like I'll be a FANBOY for the next few years!
Keep it UP!
June 15, 2007 12:59:01 AM

Quote:
WTH are you talking about? Video games don't use SSE and AMD already has a TFLOP machine with R600. People DON'T BUY LAPTOPS for video encoding, they buy them for keeping up with their work out of the office or away from a power outlet.


1. Modern video games use SSE, and so do video drivers.
2. TFLOP or not, GPGPU is going to suffer without MOVNTDQA.
3. You need a decent CPU for smooth decoding of HD video as well.

Quote:
Even Tom's said after the last price drops that AMD is the better value unless you want the ultimate fastest, which only a small percentage of people do.


It is perhaps a bit cheaper if you are buying a whole new system, BUT what about the upgrade path for all those S939 systems that many enthusiasts already have? Simple -- it doesn't exist, because S939 is EOL. You have to buy a new mainboard and RAM.

Why would someone (sane) get AM2 to replace S939 when they can get the latest Intel chipset, which supports PCIe 2.0, DDR3, and a future batch of CPUs that will have considerably better performance and higher clock speeds? It doesn't make any sense unless you are severely restricted in budget or loyal to a brand (some people call those fanboys).

Quote:
If Turion is so bad why is Toshiba putting it in 20% of their laptops?


Give me one solid reason why anyone would want to buy technology from 2003 in late 2007 / early 2008, especially after Penryn-based laptops become available?

Griffin is still the same old K8 core, the only improvements being a new HyperTransport link and memory controller. Refusing to admit that AMD is not making real architectural progress is plain stupid.
June 15, 2007 1:07:21 AM

Quote:
AMD took a gamble, short of putting money into their own intelligence / infiltration department to spy on ATI's status of R600, there was nothing they could do to see any huge impending failures by ATI.


Well, I admit that my grandmother perhaps didn't know about ATI's troubles, but it was publicly well known that they had missed several important product launch windows, and it was dead obvious they had problems both in manufacturing and with their drivers. So please explain why AMD (or anyone, for that matter) would need spies to gather that sort of "intelligence"?!?
June 15, 2007 2:16:05 AM

Quote:
AMD took a gamble, short of putting money into their own intelligence / infiltration department to spy on ATI's status of R600, there was nothing they could do to see any huge impending failures by ATI.


Well, I admit that my grandmother perhaps didn't know about ATI's troubles, but it was publicly well known that they had missed several important product launch windows, and it was dead obvious they had problems both in manufacturing and with their drivers. So please explain why AMD (or anyone, for that matter) would need spies to gather that sort of "intelligence"?!?

Delays happen; they don't necessarily mean there were problems with manufacturing and drivers. And even so, if the R600 and their other products had turned out to be an enormous success after all of the troubleshooting, delays, etc., it wouldn't matter.

I guess I missed your thread from several months before the AMD / ATI merger with all of the R600 benchmarks showing how it couldn't compete with Nvidia. After all, it was public knowledge according to you :roll:
June 15, 2007 2:55:50 AM

Basically, the two cards are the same performance-wise: ATI wins a few, Nvidia wins a few. Sit a professional gamer in front of two beige boxes and they won't be able to tell you which is the better card without resorting to benchmarking. There's no way they could figure it out blind, going into a setup like that, although some of the blowhards on the circuit will argue otherwise; it's just pigs flying out of their asses.

The ATI card cost $120 more.

No reason for the consumer to buy the ati card.
June 15, 2007 3:06:09 AM

Quote:

From what's being said after Computex, ATi will do a good volume business and maybe actually turn a profit. If I were AMD I would still reserve a little space for Intel chipsets. why not kick em in the nuts? All's fair in love and war.


Turn a profit..? Yes, that would be nice. When exactly will this be?
June 15, 2007 3:10:20 AM

How quickly you all forgot about the good old Nvidia FX series of cards.

Ever hear the saying "s$%t happens"? :wink:

Give me a break... please... from these meaningless posts... :evil: 