
Nvidia vs. AMD/Ati

March 1, 2009 9:36:33 PM

With AMD's latest motherboards being released to primarily support CrossFire, I've been taking a look at ATI cards for a possible future purchase. I've seen that they do have an answer to Nvidia's CUDA, as well as physics rendering (at least in CrossFire). When it comes down to the finer details, I've been pretty skeptical about ATI being more of a power hog than Nvidia, as well as ATI cards running hotter than Nvidia cards. I'm looking at the 4800 series from ATI when making these assumptions, but I haven't been an ATI user for several years. I'd like to know if this is still the case, because I don't want to buy a 900W power supply for CrossFiring 4870s together if I could just go back to Nvidia and run a 750W unit to SLI GTX 260s (Core 216s).

So the core question is basically: which cards are more power efficient and have the better feature set (CUDA vs. ATI Stream, PhysX vs. CrossFire physics, etc.)?

I'm kinda stumped and I know this is a decently broad question, but all responses are welcome :) 


March 1, 2009 10:11:29 PM

Nvidia and ATI have been fighting this out for a while, and the result has been 10-15 models each with computing power similar to the opposing card in their 'rank'. While the exact benefits of each card may be debatable, the cheaper one is ATI.
Also look here:
http://www.tomshardware.com/reviews/geforce-gtx-radeon,...
March 1, 2009 10:17:21 PM

fractalfx said:
When it comes down to the finer details, I've been pretty skeptical about ATI being more of a power hog than Nvidia, as well as ATI cards running hotter than Nvidia cards. I'm looking at the 4800 series from ATI when making these assumptions, but I haven't been an ATI user for several years. I'd like to know if this is still the case, because I don't want to buy a 900W power supply for CrossFiring 4870s together if I could just go back to Nvidia and run a 750W unit to SLI GTX 260s (Core 216s). :) 


ATI cards tend to draw more power, but you shouldn't need a 900W PSU; a pair of 4870s in CrossFire should be fine on a reasonable 650W unit.
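
The arithmetic roughly supports that. Here's a minimal PSU-sizing sketch in Python, assuming ballpark board-power figures of the era (~160W per HD 4870, ~180W per GTX 260 Core 216) and an assumed ~200W for the CPU, motherboard, and drives; exact numbers vary by system, so treat the output as an estimate only.

# Rough PSU sizing sketch; all wattages are assumed ballpark figures, not measurements.
def recommended_psu(gpu_watts, num_gpus, rest_of_system_watts=200, target_load=0.8):
    # Size the PSU so the estimated peak draw sits at about 80% of its rating.
    peak_draw = gpu_watts * num_gpus + rest_of_system_watts
    return peak_draw / target_load

# Two HD 4870s in CrossFire (assumed ~160W board power each)
print(recommended_psu(160, 2))  # 650.0 -> a quality 650W unit is about right

# Two GTX 260 Core 216s in SLI (assumed ~180W board power each)
print(recommended_psu(180, 2))  # 705.0 -> a 750W unit leaves comfortable headroom

On those assumptions, neither setup comes anywhere near needing 900W.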


March 2, 2009 3:11:41 AM

ATI and Nvidia are pretty close on power consumption right now. Nvidia was actually much worse before the die shrink.

Also, some ATI chipsets are extremely power efficient.
March 2, 2009 3:37:56 AM

The GTX 260 (Core 216) is good, but it's bulkier than the HD 4870, and it also costs more. If you are going for SLI/CF, I would say it's better to get a single HD 4870 X2 now and add another later if you need it. A 2 x 9800GTX+ SLI configuration is also a good option. The most powerful single GPU available now is the GTX 285; if it's OK to get one, then go for it. I've heard that the latest GPUs draw less power than older ones (maybe because of the smaller die).
March 2, 2009 5:40:09 AM

The power difference is minor, a few watts at most; it's like asking whether a Ferrari or a Lambo is more fuel efficient, as if that were the main issue.

Whatever power supply meets the requirements of EITHER one will be sufficient for the other.

As already mentioned, the 65nm version of the GTX 260 is more of a power hog; the switch to 55nm turned that around. So depending on which one you're looking at buying, it might take more power to run the SLI setup than the CrossFire setup.

As for heat, it's the same situation: the HD 4850 ran hot due to a poor stock cooler, while the HD 4870 was in line with the competition (some boards better [like HIS], some worse). The change to 55nm also brought poor results for some people due to a cutback in HSF assembly quality, so even that isn't a guaranteed benefit in that area. The smart users with some skillz replace the thermal grease and make sure they start with the best-quality HSF, and they get great results from both.

Overall they're pretty close, especially if you take the time to optimize your setup.
March 2, 2009 6:11:46 AM

GrapeApe must've broken a leg... Or did climate change already melt all the snow in Canada???
March 2, 2009 6:17:21 AM

Nah just got back a few hours ago, and now off to bed before earning the money so I can ski again next weekend. :sol: 
March 2, 2009 6:30:10 AM

Maybe you should visit AUS / NZ next July...

G'nite...
March 2, 2009 6:37:45 AM

Yes, 2010 summer, bask in post-Olympic good will towards Canucks !! [:thegreatgrapeape:3]
Also Argentina and Chile are on the Agenda.

G'day, eh! ;) 
March 2, 2009 8:45:33 AM

TheGreatGrapeApe said:
As for heat, it's the same situation: the HD 4850 ran hot due to a poor stock cooler, while the HD 4870 was in line with the competition (some boards better [like HIS], some worse)...


HD 4850 cards with good cooling are available; the HIS IceQ4 and MSI R4850 are known for their coolers.

March 2, 2009 8:52:51 PM

TheGreatGrapeApe said:
The power difference is minor, a few watts at most; it's like asking whether a Ferrari or a Lambo is more fuel efficient, as if that were the main issue.


Yup, if you're looking at cards like this, don't worry about efficiency; that's not the thing to be worried about. Be more concerned with raw power, driver quality, price, etc.

Personally, I prefer ATI because they compete with Nvidia using cards that offer similar performance at a lower price.
Example:
HD 4870 (~$170) vs. GTX 280 (~$330)
Similar performance, but the 4870 kills it on price.
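
To put the value argument in numbers, here's a trivial Python sketch using the prices quoted above; the equal relative-performance figure is the post's own "similar performance" claim taken at face value, not a benchmark result.

# Perf-per-dollar sketch; prices are the ones quoted above, and the equal
# relative_perf values reflect the post's claim of similar performance, not benchmarks.
cards = {
    "HD 4870": {"price_usd": 170, "relative_perf": 1.0},
    "GTX 280": {"price_usd": 330, "relative_perf": 1.0},
}
for name, card in cards.items():
    value = card["relative_perf"] / card["price_usd"] * 1000
    print(f"{name}: {value:.1f} performance points per $1000")

At those prices the HD 4870 delivers nearly twice the performance per dollar (about 5.9 vs. 3.0 points per $1000).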

March 2, 2009 10:26:08 PM

How important are stream processors when deciding? Nvidia always has fewer, so I don't know if that's a good or bad thing.
March 2, 2009 10:43:14 PM

It's not majorly important unless you specifically know how to use them.

Remember that while there are fewer of them, they run at a higher clock speed, so the two are close in raw math. Unless you are using a Brook+ or CAL app, general computing results will usually line up very similarly, often with the Nvidia cards pulling ahead, because of how the operations usually run when code isn't optimized for the architecture. This is really only a concern for GPGPU apps coded for it; even Premiere is primarily an OpenGL application layer, not a true GPGPU app using raw stream processing power.
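
For a sense of the raw math, here's a back-of-the-envelope sketch in Python using the commonly cited peak specs (216 SPs at ~1242 MHz with MAD+MUL for the GTX 260 Core 216, 800 SPs at 750 MHz with MAD for the HD 4870); these are theoretical ceilings, not measured throughput.

# Theoretical peak shader throughput: fewer, faster Nvidia SPs vs. more, slower ATI SPs.
# Clock and FLOPs-per-clock figures are published peak specs, not real-world numbers.
def peak_gflops(num_sps, shader_clock_mhz, flops_per_sp_per_clock):
    return num_sps * shader_clock_mhz * flops_per_sp_per_clock / 1000.0

print(peak_gflops(216, 1242, 3))  # GTX 260 Core 216: ~805 GFLOPS
print(peak_gflops(800, 750, 2))   # HD 4870: ~1200 GFLOPS

On paper the HD 4870 is well ahead, but reaching that peak means keeping its 5-wide shader clusters fully fed, which is exactly the kind of architecture-specific (Brook+/CAL) optimization mentioned above; generic code often lands much closer together or favours Nvidia.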

For gaming, though, it doesn't really matter; most of the time games are texture- and ROP-limited. One of the few games that does show the difference is GRiD, but it's the rarest of exceptions where raw shader/SPU power and count matter.

As much as GRiD may be indicative of future titles, it's not representative of the majority of 2009 titles, so yippee, it'll matter most once you can buy a mid-range DX11 card that outperforms either option. Big deal.
March 2, 2009 11:29:55 PM

What's amusing about that is that the i7 is in a similar position in the CPU market when it comes to games.