Any ATI card vs NVIDIA 400 Series

Guest
Is there anything I could buy that could compare to the NVIDIA 480? I'd be willing to 3-way crossfire it, as long as it's in the same price range as the 480 (~$550)... Or would I be better off waiting on the 480?
 

+1^
 

mteeple

Distinguished
Mar 27, 2010
45
0
18,530
You wouldn't even need three cards of any of the newer-gen stuff to beat the 480... The 5870 is right in the ballpark of the 480, just a little less performance, and sells for about $400... The 5850 sells in the low $300s; crossfire two and they would easily beat the 480... Two 5770s will beat the 480 as well, and would only be about $320, although in some games they may be weaker due to stuttering.
 
The new NVidia cards have very poor power consumption. For example, the 480 in most games gives about 10 to 15% higher average FPS than an HD5870, but compare power:

Idle:
480 (59W)
5870 (27W)
Under load, the 480 uses 140W MORE POWER than the HD5870!
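To put that 140W load delta in perspective, here's a rough sketch of what it costs in electricity. The rate ($0.12/kWh) and hours of gaming per day (4) are my own assumptions, not figures from this thread:

```python
# Rough yearly cost of the GTX 480's extra load power vs. the HD5870.
# Assumptions (mine, not from the thread): $0.12/kWh, 4 hours under load per day.
EXTRA_WATTS = 140        # load-power delta quoted above
HOURS_PER_DAY = 4        # assumed daily gaming time
RATE_PER_KWH = 0.12      # assumed electricity rate in $/kWh

extra_kwh_per_year = EXTRA_WATTS / 1000 * HOURS_PER_DAY * 365
extra_cost_per_year = extra_kwh_per_year * RATE_PER_KWH
print(f"{extra_kwh_per_year:.0f} kWh/yr, about ${extra_cost_per_year:.2f}/yr")
# -> 204 kWh/yr, about $24.53/yr
```

So on those assumptions the delta is only a couple of dollars a month; the bigger practical costs are the heat and noise.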

The 480 is a very interesting architecture, though, and we can expect some specific boosts in titles that utilize the areas where the 480 is much better, like tessellation. Considering it will be a while before games make full use of this, we have to ask what the pros and cons are.

I WANT to go NVidia because I'm a little concerned about PhysX support, game optimizations like we've seen, and maybe video transcoding apps, but I can't get around the power/noise problems. I'm going to hold off for several months.

One obvious solution is for NVidia to get Optimus working on the desktop to allow the graphics card to turn off COMPLETELY when not needed. We'd need new motherboards for this.
 

rofl_my_waffle

Distinguished
Feb 20, 2010
972
0
19,160
I'd advise against the 480. Power, noise, and heat can be deal breakers, but you won't know until you get the card. I had some really, really loud cards in the past; they only drive you crazy. You never get used to insanely loud cards, you only wish every moment was spent with something quieter.

As for power consumption, I wouldn't care, since electricity is so cheap, except that all that power is converted into heat. My room is like 5°C above ambient when my computer is on now; it's not something you think about when buying a computer. I got a liquid cooling system so my processor can OC without breaking a sweat, but apparently I can't do the same for the room. I had to scale my i7 back to 3.8GHz to get lower voltages. At least the liquid cooling wasn't a complete waste, since I cool my 5970 with it too.

Speaking of which, if you can get a 5970 with good cooling, you can overclock it beyond 5870 levels. I could get mine a little over 1000MHz core and 1250MHz memory, but again I scaled back to 950 core and 1200 memory for lower voltage to reduce heat. I don't want to have to open my window all the time, and I definitely don't want a 480 heating up my room.

PhysX has a negligible impact on your games. It's not like every game uses PhysX, or like PhysX is the only physics engine out there. Havok is a more popular physics engine than NVIDIA's PhysX. Online games will probably never use PhysX because it excludes everyone without an NVIDIA card, and more importantly it excludes all servers without an NVIDIA card, while there are other perfectly good CPU-based physics solutions out there. Bad Company 2, for example, uses the CPU-based Havok physics engine, which by now is more powerful than NVIDIA PhysX.

Nvidia won't get Optimus working for desktops on the 480; maybe on the next generation of cards. It isn't a driver-level change, it's a physical hardware limitation. First off, the only integrated graphics solution for desktops is the ATI boards. I don't know how NVIDIA is going to collaborate with ATI, but I'd imagine it will never happen. Second, unlike laptops, the integrated graphics and the dedicated graphics card have their own output ports; they would need to share one in order to swap between each other. So unless there is a hardware-level change, it isn't going to happen.

It's not like the 480 cards are even that bad at idle compared to when they're under load. They should fix the architecture and make the chip cooler. If the rated power consumption were 200W and it didn't hit 94°C without even overclocking, then I might even think of getting a pair, or a dual-GPU version.