DirectX 10 Shootout: Nvidia vs. ATI

October 9, 2007 10:46:51 AM

It's truth time. Now that the first games using DirectX 10 have appeared, Nvidia and ATI have to face the real world and answer the question - which card offers the most performance under Vista? We look at Geforce 8x00 vs. Radeon 2x00 cards.

http://www.tomshardware.com/2007/10/09/directx_10_shootout/index.html
October 9, 2007 11:14:42 AM

**pours 40-ounce over ATI Tombstone**
October 9, 2007 11:42:01 AM

Wow, I'm both happy and sad after seeing that.
Happy that my "puny" 320 shows really good performance compared to "much better" cards. And here I heard all week that my 320 is so much worse than the new Pro.

Sad because I don't own a GTX or better...
October 9, 2007 12:25:20 PM

In case anyone is wondering, the 320MB 8800gts is still haunted by that memory bug in dx10 (163.69 fixed it for dx9). That is why the CoH and WiC scores are so low. The 2400PRO HM beats it in CoH for God's sake. Under dx9 it performs MUCH better. Hopefully Nvidia is still planning to fix this.
October 9, 2007 12:54:18 PM

Glad I did not wait to buy my card... I am enjoying it today instead of waiting for tomorrow...
October 9, 2007 1:04:37 PM

Well, I am ordering my 8800GTS tomorrow. The 2900 cards are okay... but they just don't seem worth it. Shoot, you can't even find a 2900Pro at the current time.
October 9, 2007 1:11:05 PM

Brilliant review, much better than the budget DX10 one. AMD are up against it big time; their new GPU had better cream the Nvidia one or it's bad news all round.

Going on the review, I can't make a case for buying anything other than a GTX and upwards (or sideways, depending on how the new cards fare).
I hope they run the new cards through this set of benchmarks when they come out, and quickly too (I'm wondering if Tom's is still getting sent reference cards to test). It seems to me that just lately you have to go to a different site to find a review of any newer cards, or is it just me?
It would be nice to see them under DX9/XP as well, since a lot of people still run it.
Mactronix
October 9, 2007 1:13:41 PM

I wish the 2900pro would have been included in this review.
October 9, 2007 1:19:07 PM

Am I the only one here that is shocked to see how much all of these cards blow with DX10? Do you remember all of the claims that DX10 was not only prettier but faster as well? So why don't we see these DX10 cards run faster than DX9 mode? Give me a break...
October 9, 2007 1:21:00 PM

$700+ for a card that can't run all games at the highest resolution and max settings... wow.
October 9, 2007 1:28:12 PM

Just wait for the D8E
October 9, 2007 1:31:12 PM

prodystopian said:
I wish the 2900pro would have been included in this review.
Using the 2900XT as a benchmark for an OC'd 2900Pro seems reasonable. A stock 2900Pro would under-perform a 2900XT in every way.

Overall the results are pretty shocking.

October 9, 2007 2:01:44 PM

This is getting serious, guys. If Nvidia don't get some competition soon we will be paying way too much for a decent GPU. It's borderline at the minute, but if things don't improve soon as far as performance for the money is concerned, I will be seriously considering getting a dedicated gaming console and just using the PC for web surfing and office/HTPC applications. Yes, I know it's heresy, but what are you gonna do?
Mactronix
October 9, 2007 2:03:50 PM

In Company of Heroes... the 8600GTS performs better than the 8800GTS... strange.
October 9, 2007 2:14:09 PM

Even the non-GTS performed better.
October 9, 2007 2:16:18 PM

All of those games tested are Nvidia optimised, so it's unfair to write ATI off.
October 9, 2007 2:34:39 PM

prodystopian said:
I wish the 2900pro would have been included in this review.


I wish the HD 2900 XT with 1 GB of GDDR4 memory had been included. The reviewer could find, buy, and test eleven Nvidia cards, but only five ATI cards? And then there's the fact that some of the Nvidia cards were overclocked models, while none of the ATI cards were overclocked. I mean, come on. I own an 8800 GTS 640 myself, but I'd like to see more balance in the testing. If the deck is stacked against ATI, then the outcome is assured, and while the information itself may be good, it is incomplete and leaves me not knowing whether the card I have is the best for the money or not.
October 9, 2007 2:49:29 PM

I think I will stay with DX9 because of the 50-100% higher performance. In Company of Heroes there's very little graphical difference between DX9 and DX10, except for a crapload of performance drop under DX10. Buying a 24" monitor now seems like a bad idea after seeing the fps under Vista/DX10; DX9 is another story. I think I'll hold on to my 19" LCD for now.
October 9, 2007 3:03:16 PM

I still reckon the 2900 Pro is better value than the 8800 320 if you don't mind OCing.
October 9, 2007 3:19:34 PM

I have a GTX and I'm glad I didn't wait to purchase the much heralded (by ATI fans anyway) 2900XT.

I don't see why the GTX won't run Crysis in all its DX10 glory. There appears to be ample overhead for next gen games.

I do agree that the 2900XT with 1 gig of GDDR4 memory should have been included, but it wouldn't have changed my opinion about the current line of ATI cards.

Nvidia rules the roost right now. And FPS/$ is a nice way of showing it.
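For anyone curious, the FPS/$ figure is just average frame rate divided by street price. A quick Python sketch with made-up numbers (not the article's data):

# FPS per dollar: higher means better value. Cards and numbers below are hypothetical.
cards = {
    "hypothetical card A": {"avg_fps": 60.0, "price_usd": 400.0},
    "hypothetical card B": {"avg_fps": 45.0, "price_usd": 250.0},
}
for name, c in cards.items():
    print(f"{name}: {c['avg_fps'] / c['price_usd']:.3f} fps per dollar")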
October 9, 2007 3:21:35 PM

I'm happy I didn't buy ANY of those cards.
Hopefully the refresh (November-Feb 08) will uncork the power.
If not and there is no power to uncork, it's at least another year until the next major release.
Wake me up when something happens. :sleep: 
October 9, 2007 3:24:53 PM

ap90033 said:
Am I the only one here that is shocked to see how much all of these cards blow with DX10? Do you remember all of the claims that DX10 was not only prettier but faster as well? So why don't we see these DX10 cards run faster than DX9 mode? Give me a break...

Games with the same settings should run 10-15% faster under DX10 in Vista than under DX9 in XP.
Something is very wrong here. I guess the games tested are very much NOT native DX10.
October 9, 2007 3:28:26 PM

Basically, wait till Crysis, which is native DX10 as far as I can tell (or at least, it wasn't an add-on after they had finished the game).
October 9, 2007 3:28:46 PM

Taking a further look at the GPU frequencies of the tested cards, I see that the 2900 XT that was tested ran at 743 MHz, whereas I note on Newegg that the top cards run at 800 MHz for the 512 MB card and 825 MHz for the 1 GB card. That makes me wonder whether an ATI card with a faster clock would have done better. I don't know and can't say for sure, but it leaves me wondering.

This does make a difference for me, as I'm ordering parts for a new Vista machine build and I would like good information on which card is best, or best for the money. I run Nvidia cards on three of my four computers, but I'm not totally stuck on Nvidia; I like a balance of performance for the money spent.
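As a rough sanity check on the clock question (my own back-of-the-envelope Python, not anything from the article), even perfect scaling with core clock would only buy a modest gain:

# Best-case uplift if performance scaled linearly with core clock (in practice it scales less).
tested_mhz = 743                 # clock of the 2900 XT used in the review
retail_mhz = [800, 825]          # 512 MB and 1 GB retail clocks mentioned above
for clk in retail_mhz:
    gain = (clk / tested_mhz - 1) * 100
    print(f"{clk} MHz vs {tested_mhz} MHz: at most ~{gain:.0f}% faster")
# Prints roughly 8% and 11%, so clock speed alone wouldn't close a large gap.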
October 9, 2007 4:04:53 PM

I can't believe how much ATI gets slaughtered here... I don't think using the 800-825 MHz cards would have made enough of a difference over the 743 MHz one... it would have helped, yes, but still be well below the Ultra.
That's really embarrassing IMO for DAAMIT. There's no way I will buy a 2900 now!

I DO wonder what the hell they (DAAMIT) are doing. If AMD tanks and takes ATI with it, what a joke it will be. AMD was the worst thing to happen to ATI. If the RV670/680 fares no better, I will be buying NVIDIA for sure...until I can afford a crossfire setup for my new ASUS X38.
October 9, 2007 4:23:17 PM

speedbird said:
All of those games tested are Nvidia optimised, so it's unfair to write ATI off.
You can't find any new games that aren't part of Nvidia's "TWIMTBP" program. Also, it doesn't guarantee the game will run better on Nvidia hardware. Look at Oblivion; it came out as a "TWIMTBP" title and ran considerably better on ATI hardware.
October 9, 2007 4:23:38 PM

True, an 825 MHz 2900 may not have helped that much, but it would have been interesting to see how an 825 MHz card with 1 GB of GDDR4 memory would have fared. It's also not just the MHz alone, but the GDDR4 as opposed to the GDDR3. Still, it should have helped, and maybe the ATI cards wouldn't have been quite as far behind. But we don't know. Having good information makes for good choices. Having incomplete information makes for guesses. I'd rather have good information than make guesses with my money.
October 9, 2007 4:31:49 PM

Atm DX10 is more of a super-high quality tier than a baseline that games are built around. You aren't going to run every DX10 game in the future on high, are you? You don't run every DX9 game on high.

Just wait until some proper dx10 games come out.
October 9, 2007 4:32:43 PM

I was wondering why they included so many OC'd Nvidia cards.
If you look at the graphs as a whole, you'll see all this green and all the red on the bottom. :ouch: 
ATI wouldn't have looked so bad if they got rid of the extra Nvidias.
October 9, 2007 4:41:17 PM

sailer said:
True, an 825 MHz 2900 may not have helped that much, but it would have been interesting to see how an 825 MHz card with 1 GB of GDDR4 memory would have fared. It's also not just the MHz alone, but the GDDR4 as opposed to the GDDR3. Still, it should have helped, and maybe the ATI cards wouldn't have been quite as far behind. But we don't know. Having good information makes for good choices. Having incomplete information makes for guesses. I'd rather have good information than make guesses with my money.


I agree with you. In principle, it is always better to have a more complete representation of both players. I am actually more concerned with the low-clocked card they used... most other benches I have seen do not show a big difference (if any at all) between the 1 GB card and the vanilla, but that is an 800 MHz vs. an 825 MHz card... only a 25 MHz difference, NOT a 75 MHz+ difference. And are there NO OC'd ATI cards on the market? (That is a legit question; I thought there would be a few out there, enough to grab at least one...)

Other sites show similar issues with other ATI cards, though... the AA in WiC tanks it, but there is a patch for the game coming out (it may already be out) that is supposed to fix that, so it may not be an ATI driver thing but a game thing (other games don't tank the card with AA like that one does). If I remember correctly, Nvidia has been pushing their non-DX10, old-school method of AA and most TWIMTBP games optimize for it, whereas ATI went with the "proper" DX10 implementation... but that could be bogus memory on my part. Proper or not, the fact that a game patch is "fixing" it points at them not having implemented it correctly for the hardware.

Also, if you look at resolutions above 1280 (and who runs under that after buying an 8800 or 2900?) you see many matches where the 2900 takes the win over the 8800. Why do we even still have a 1024 resolution in these tests? Is there anyone still gaming at that size?! (Unless you have 320 MB or less of memory and you HAVE to.) IMO that just removes frame buffer size as the large factor it is with AA and high resolutions and "equalizes" the smaller-VRAM cards with their larger cousins. If you removed the 1024 results from all the calculations, I would wager that the numbers would bring the 8800 640 and the 2900 much closer together... maybe... probably... meh, I don't have time to check that, maybe I will tonight.

Regardless, it was a good article to combine with info from other sites.
October 9, 2007 5:28:50 PM

spuddyt said:
Basically, wait till Crysis, which is native DX10 as far as I can tell (or at least, it wasn't an add-on after they had finished the game).

I hope you're right.
October 9, 2007 5:30:05 PM

This article also proves that gaming in Vista sucks arse right now and isn't worth a squirt of urine over XP. Give it about 2 years for Microsoft to fix their mistakes, for hardware (mainly GPUs) to catch up to the bloated OS and for mainstream DX10 games to be available.

Then "all of the sudden" gaming in Vista will be the best way to go.
October 9, 2007 5:45:20 PM

I notice a lot of people shoot the HD 2900 Pro down very fast, and it's strange why. It's the same core as an HD 2900 XT, only underclocked, which means you can OC it to XT speeds and beyond very easily.

I have the 1GB version of the HD 2900 Pro; its GPU runs at 601 MHz, its memory at 925 MHz, and it cost me $319.99. A 1GB HD 2900 XT would cost $499.99 at some places, so it's almost a $200 price difference. All you have to do is OC it to the XT's speeds and I am sure you would see a huge difference.

I did a performance test using the Lost Planet demo and it got me roughly 44 fps on the first map and 48 on the second at stock speeds. When I played the demo it was very smooth and it never slowed down.

I think we need to get some of the HD 2900 Pro cards into the labs and test them at stock and OC'ed to see what they can offer, especially since they are cheap compared to a lot of cards. Plus, I don't see any reason to spend more than $350 on a video card, let alone $700.
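Putting rough numbers on that value argument (just a sketch in Python, using the clocks and prices quoted in this thread, not benchmark data):

# HD 2900 Pro vs. XT: overclock needed to match the XT core clock, and the money saved.
# Figures are the ones quoted in this thread, used purely for illustration.
pro_clock_mhz, xt_clock_mhz = 601, 743
pro_price_usd, xt_price_usd = 319.99, 499.99
oc_needed_pct = (xt_clock_mhz / pro_clock_mhz - 1) * 100
print(f"Core overclock needed to reach XT speed: ~{oc_needed_pct:.0f}%")
print(f"Price saved: ${xt_price_usd - pro_price_usd:.2f}")
# Roughly a 24% core overclock for about $180 saved, assuming the card actually reaches XT clocks.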
October 9, 2007 6:00:37 PM

ap90033 said:
Am I the only one here that is shocked to see how much all of these cards blow with DX10? Do you remember all of the claims that DX10 was not only prettier but faster as well? So why don't we see these DX10 cards run faster than DX9 mode? Give me a break...

Ahhh... you were expecting to have your cake AND eat it too? Surely you know better than that by now! Just think about Windows... every time a new version comes out, Microsoft tells us how much faster it is... and in reality, that's NEVER true. I'm sure I could dig up an article/press release about how much faster Vista is than XP... but what's the point?
October 9, 2007 6:08:12 PM

The reason they have more Nvidia cards than ATI cards is most likely because the companies making the Nvidia cards have sent them more cards than the ATI companies.
Most review sites can't afford to buy a lot of video cards (or most of what they review) and are dependent on companies donating the cards. Considering that for the most part the Nvidia cards have been around a lot longer, they (THG) have had a lot longer to accumulate more and different cards.


As for showing benchmarks at 1024, yes, people still run games at that resolution. Probably a fair number of people. 1280 would also be very common; you don't get most users above that most of the time. Sure, people who are spending the money for an 8800 Ultra or big SLI configurations do, but that is a very small minority of users. Considering that the 8800s don't even fall into the mid-range price range, and mid-range basically defines what most people are using, the use of lower resolutions is highly relevant. Also, when you consider that they are looking at cards like the 8600/2600s and lower, then 1024 is very relevant.
October 9, 2007 6:19:20 PM

rodney_ws said:
Ahhh... you were expecting to have your cake AND eat it too? Surely you know better than that by now! Just think about Windows... every time a new version comes out, Microsoft tells us how much faster it is... and in reality, that's NEVER true. I'm sure I could dig up an article/press release about how much faster Vista is than XP... but what's the point?



Expecting? No, not for a second... Surprised no one seems to be calling Microshaft on it? Um, yeah.
October 9, 2007 7:11:37 PM

The ATI HD series is a joke; it reminds me a lot of the old GeForce FX series. I hope that the next batch of ATI GPUs kicks a bit more ass.
October 9, 2007 7:24:59 PM

I honestly think there is some serious Nvidia bias in this article. It's common knowledge that statistics can be manipulated to argue any point, and I think that is the case here. They did not use ATI's top card; they used a lower-end version of it. They compared six different versions of Nvidia's 8800 and one of the 2900. I was unaware that the games were optimised for Nvidia, but even before that I could tell by the writing that this article was going to lean in favor of Nvidia. Basically the first page said "haha, we told you so and here's the proof". Further into the article the author jumps to the defense of Nvidia, blaming the game mechanics for relying heavily on memory (this seemed like a very accurate analysis), but it still sounded biased.

I'm a proud owner of an 8800 GTS 320 MB version and it pleases me to see my money was well spent. However, I often depend on Tom's Hardware to make my comparisons, as I couldn't care less about ATI vs. Nvidia or AMD vs. Intel (I want the most bang for my buck). With articles like this, I lose faith in getting a fair comparison from this site.

October 9, 2007 7:25:30 PM

arafay1 said:
In Company of Heroes... the 8600GTS performs better than the 8800GTS... strange.

homerdog said:
In case anyone is wondering, the 320MB 8800gts is still haunted by that memory bug in dx10 (163.69 fixed it for dx9). That is why the CoH and WiC scores are so low. The 2400PRO HM beats it in CoH for God's sake. Under dx9 it performs MUCH better. Hopefully Nvidia is still planning to fix this.

;) 
October 9, 2007 7:48:31 PM

deathblooms2k1 said:
I honestly think there is some serious Nvidia bias in this article...

I thought I picked up on that too; otherwise I would have thought (prove me wrong here, by all means) they would have had a 2900 Pro in it to compare to the 320.
October 9, 2007 7:51:35 PM

I'm an Nvidia supporter, but this test is rigged big time.
October 9, 2007 8:08:22 PM

homerdog said:
;)


Memory bug? I know there is some kind of 'bug' in some games like QW where the video card refuses to use virtual memory, which is why framerates drop more than they should at 16x AA.
October 9, 2007 8:31:44 PM

AMDZone also had trouble getting new ATI cards, so there seems to be an "unwillingness" to donate cards for review. Hopefully the HD 2950 will go in a better direction than the current 2x00 series.
October 9, 2007 8:38:09 PM

I would also agree that this article comes across as biased: overclocked Nvidia cards and hand-picked games that favour Nvidia hardware. My view is to wait for more DX10 games to appear before writing ATI's offering off.
October 9, 2007 8:44:31 PM

When developers started creating software for DX10, they used the only DX10-compliant hardware available, namely Nvidia hardware.

It makes sense that the first DX10 software available would run better on Nvidia hardware since that's what the software was being developed on.

Am I out of whack here?
October 9, 2007 8:52:08 PM

From the article lead-in before the test:

"ATI simply needed more time - after all, Nvidia had six months to tweak its graphics drivers. Given enough time, ATI's drivers would be bound to improve, giving the Radeon 2900 XT the much-anticipated performance boost."

That sounds like a ready-made excuse for ATI's lack of a better showing. Nvidia has had six months or so to tweak drivers, and considering the memory bug in the 640 MB 8800 GTS mentioned above, they are not done yet. But remember, the 2900 was supposed to ship concurrently with the 8800s last winter. So ATI has had six months to tweak the hardware.
October 9, 2007 8:58:07 PM

I understand the reasoning for the Nvidia-based games; not much you could do differently there. As previously stated, Nvidia was first with the technology, so games were made to their standards to start.

With all that aside, this article would still come off as biased to me, just by how it was written and the choice of cards to compare. Just a week or two ago an article was published saying that a 2900 Pro would give you more bang for your buck than an 8800 GTS. At that point I wanted to see THG benchmarks to prove it, to see whether or not my 8800 was a worse investment. And this article doesn't even compare the two. It just doesn't make sense to me that they would compare six high-end Nvidia cards to one ATI card. They put in overclocked Nvidia cards and fail to include even the stock high-end version of the ATI card.
October 9, 2007 9:02:30 PM

It seems to me that the reason the software is Nvidia-tweaked is that there are really not a whole lot of ATI-tweaked games coming out right now. Maybe someone else can name a few mainstream titles if I'm wrong (wouldn't be the first time, nor the last). Considering that ATI took an extra six months to bring out the 2900, why would any software developer looking for bleeding-edge performance write code for outdated hardware?
Oh, and about the OC'd hardware: it's a good and relevant point that Tom's is comparing apples to oranges there. HOWEVER, if you ignore the OC'd versions, the price/framerate comparison is still valid.
How can anyone argue with the fact that the 2x00 series still can't handle any sort of AA/AF without dropping 50% of its framerate? SAD!
I'm still holding out hope that ATI has some driver tweaks left up their sleeves for DX10 that will help improve performance. Nvidia needs some real competition at the top end to help drag those ridiculously high prices down. Hell, I got my 8800 GTS 640 back in January for less than it sells for today, for God's sake!!!
October 9, 2007 9:09:57 PM

"ATI simply needed more time - after all, Nvidia had six months to tweak its graphics drivers. Given enough time, ATI's drivers would be bound to improve, giving the Radeon 2900 XT the much-anticipated performance boost."

They had six months to tweak the hardware and now about five months to tweak the drivers. If this test isn't rigged, then ATI had better release a BEAST with their new releases, or else ATI goes down and Nvidia starts killing consumers with price hikes!